Deep Learning

Is deep learning a buzzword? While we have mentioned deep learning on Cloudy before, we have never taken the time to understand what it is and why it’s more than just a buzzword. This week we belabor that point.

Inspiration

A couple of weeks ago I was at an event where someone referred to deep learning as a “buzzword.” I was surprised by this remark. Deep learning has driven tons of innovation in the AI space in the last 10 years, and while people may overuse the term, it is anything but a buzzword.

But first, what is it?

Deep learning is a class of machine learning algorithms. You can think of it like this:

[Figure: deep learning as a subset of machine learning. Source: Deep Learning with Python]

What happens in deep learning is that input data is passed through layers, where each layer transforms the output of the previous layer into a more “purified” form. Think of data going through an assembly line. At the final layer, the computer has manipulated the original input in such a way that it can make a prediction about what that input is. The process looks like this:

[Figure: input data transformed layer by layer into a prediction. Source: Deep Learning with Python]
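
To make the assembly-line picture concrete, here is a minimal sketch in Keras (the library behind Deep Learning with Python); the 784-value input and layer sizes are illustrative assumptions, not anything specific to this post:

```python
# A tiny feedforward network: each Dense layer takes the previous
# layer's output and transforms it into a more "purified" representation.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    layers.Input(shape=(784,)),              # raw input, e.g. a flattened image
    layers.Dense(64, activation="relu"),     # first transformation
    layers.Dense(64, activation="relu"),     # a further-refined representation
    layers.Dense(10, activation="softmax"),  # final layer: a prediction over 10 classes
])

model.compile(optimizer="adam", loss="categorical_crossentropy")
model.summary()  # prints the "assembly line" of layers
```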

These layers are almost always implemented via neural networks. Neural networks are a type of machine learning algorithm that draws inspiration from our understanding of how neurons work in the brain. The key word there is “understanding,” because we don’t really know how the brain works.

The “deep” in deep learning comes from the number of layers the algorithm is programmed to use. If you use only a few layers, it is generally called “shallow learning.” There isn’t a specific number where shallow learning becomes deep learning, but let’s just say it’s 10 layers.
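
To see how arbitrary that line is, here is a hedged sketch (same illustrative Keras setup as above) where the only difference between “shallow” and “deep” is the layer count:

```python
from tensorflow import keras
from tensorflow.keras import layers

def make_network(num_hidden_layers: int) -> keras.Sequential:
    """Stack hidden layers; more layers is all "deeper" means."""
    model = keras.Sequential([layers.Input(shape=(784,))])
    for _ in range(num_hidden_layers):
        model.add(layers.Dense(64, activation="relu"))
    model.add(layers.Dense(10, activation="softmax"))
    return model

shallow = make_network(2)   # "shallow learning"
deep = make_network(10)     # "deep learning," by this post's rough cutoff
```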

When people talk about deep learning, they are probably talking about creating a neural network with many layers, which is why “deep learning” and “deep neural networks” are often used interchangeably. I am OK with that. When people start to use “AI” and “deep learning” interchangeably, I am not OK with that.

Anyways

In the early 2010s, people started to realize that if you make neural networks “deeper,” they become more and more accurate. That kicked off, well, an interesting trend in which everyone started using deep neural networks.

And when I say everyone…

Siri is powered by a deep neural network. The voice recognition used to access Siri is powered by neural networks. Animoji are powered by neural networks. Face ID is powered by a neural network. Apple built a custom chip to, yes, you guessed it, power neural networks.

Alexa is a deep neural network. Google uses deep neural networks to predict heart disease, power Google Translate, and pretty much everything else. IBM Watson is probably deep learning. Self-driving cars are deep neural networks.

Probably every voice, text, or image API built by anyone is based on deep learning.

UBS estimated that 40% of all AI servers shipped are used for deep learning, and that share is expected to grow to 70% by 2021.

In a recent study of machine learning use cases, McKinsey found that “In 69 percent of the use cases we studied, deep neural networks can be used to improve performance beyond that provided by other analytic techniques.” The same study said that “feed forward neural networks and convolutional neural networks—together have the potential to create between $3.5 trillion and $5.8 trillion in value annually across nine business functions in 19 industries.”

Deep learning fueled the first wave of acquihires in the early 2010s, with companies like DeepMind, DNNresearch, and Maluuba selling for many, many millions of dollars.

The second most popular Coursera course was “Neural Networks and Deep Learning.”

Deep neural networks dominate the conference circuit. Last year, deep neural networks were the second most common topic across papers at NIPS (~20%). For context, the most talked-about topic at NIPS was “algorithms,” at 32%. At the upcoming ICLR 2018, deep learning makes up about 30% of the invited talks.

Deep learning is going to the edge. Qualcomm developed its mobile Snapdragon chip for running neural networks, Apple developed its Bionic chip, and Google released TensorFlow Lite, which supports hardware acceleration with the Android Neural Networks API.
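
As a rough illustration of what “going to the edge” looks like in practice, here is a minimal TensorFlow Lite conversion sketch; the placeholder model and file name are assumptions for the example, not anything from this post:

```python
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# A stand-in model for whatever you actually trained.
model = keras.Sequential([
    layers.Input(shape=(784,)),
    layers.Dense(10, activation="softmax"),
])

# Convert to the TensorFlow Lite flatbuffer format for mobile devices.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:  # illustrative file name
    f.write(tflite_model)
# On Android, a .tflite model can then be executed with hardware
# acceleration through the Neural Networks API mentioned above.
```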

Sooo, you are saying companies are using deep learning?

Yes. Yes, I am.

A fork in the road

The biggest question is whether deep learning will continue to be the driving force for AI in the next 10, 20, or 30 years. Deep learning became the algorithm of choice in only ten years’ time; if researchers come up with entirely new methods that are much more efficient and productive, it wouldn’t surprise me if everyone made a pivot.

The pivot would not surprise me because deep learning comes with a whole host of problems. Specialized deep learning servers can cost $20k and up. Deep learning takes an enormous amount of data, compute power, and skill. While off-the-shelf tools exist, you need very specialized knowledge to implement deep learning well. Deep learning is also responsible for much of the “black box” talk: we really don’t understand how a deep network arrives at the answer it does, it just kind of tells us the answer.

Despite this, some companies are betting heavily that the deep learning revolution will keep going. Apple, Qualcomm, IBM and Intel are all making specialized chips to power deep learning. Other companies such as Nvidia and Xilinx are focusing on more general solutions that are not specialized for deep learning, but are good at doing general “AI” calculations. It’s really anyone’s guess what happens next.

At the end of the day

While deep learning is everywhere, it still just takes an input and maps it to an output, just like every other algorithm out there. In some areas it works well, in others it doesn’t. While I believe it has surpassed “buzzword” status, François Chollet, author of Deep Learning with Python, sums it up well:

“the only real success of deep learning so far has been the ability to map space X to space Y using a continuous geometric transform, given large amounts of human-annotated data. Doing this well is a game-changer for essentially every industry, but it is still a very long way from human-level AI.”

No magic, just math
