Deep Learning with Data Science Experience

Deep learning is a branch of machine learning that uses large amounts of data to teach computers tasks that, until recently, only humans could do.

A good example of deep learning is perception: recognizing what's in an image, understanding what people are saying when they talk, and helping robots explore and interact with the world. In recent years, deep learning has emerged as the central tool for solving perception problems, and it defines the state of the art in computer vision and speech recognition. Increasingly, people are finding that deep learning is a much better tool for solving these problems.

Many companies today have made deep learning a central part of their machine learning toolkit. Facebook, Google, and Uber, for example, all use deep learning in their products. At IBM, we are collaborating with the leaders in the market to push the research forward and lead in that space.

Deep learning shines wherever there is a lot of data and a complex problem to solve, and many companies today face exactly that combination. It can be applied to many different fields.

As deep neural networks become increasingly important to everything from self-driving cars to voice recognition, new libraries are making it much easier to use deep learning to solve real problems. Building and training a multi-layer convolutional neural network would have taken hundreds of lines of code just a few years ago. In this post we give an overview of the most popular open-source projects available in the IBM Data Science Experience.
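To make the contrast concrete, here is a minimal sketch of what a small convolutional network looks like today in Keras (the shapes, layer sizes, and random training data are hypothetical, chosen only to show the API; assumes Keras with a TensorFlow backend installed):

```python
# A small convolutional network in a handful of lines of Keras.
# All dimensions here are illustrative toy values, not a real model.
import numpy as np
from keras.models import Sequential
from keras.layers import Input, Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    Input(shape=(28, 28, 1)),            # e.g. 28x28 grayscale images
    Conv2D(8, (3, 3), activation='relu'),
    MaxPooling2D((2, 2)),
    Flatten(),
    Dense(10, activation='softmax'),     # 10 output classes
])
model.compile(optimizer='adam', loss='categorical_crossentropy')

# One training step on random data, just to show the workflow.
x = np.random.rand(4, 28, 28, 1).astype('float32')
y = np.eye(10)[np.random.randint(0, 10, 4)].astype('float32')
model.fit(x, y, epochs=1, verbose=0)
probs = model.predict(x, verbose=0)      # shape (4, 10)
```

A handful of declarative lines replace what used to be hundreds of lines of hand-written tensor code.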

Why Deep Learning now?

One of the fascinating things about neural networks is how long they took to become an overnight success. Their history goes back all the way to the 1950s, yet deep learning has really only taken off in the last five years. The reason is the increased availability of labeled data, along with the greatly increased computational throughput of modern processors.

For a long time, we didn't have the huge labeled datasets needed to make deep learning work. Those datasets only became widely available with the rise of the Internet, which made collecting and labeling them feasible. But even once we had big datasets, we often didn't have enough computational power to make use of them; it is only in the last five years that processors have become fast enough to train large-scale neural networks.

How to get started with Deep Learning in Python

There is a fast-growing community of researchers, engineers, and data scientists who share a common, very powerful set of tools, most of them open source.

One of the nice things about deep learning is that it’s really a family of techniques that adapts to all sorts of data and all sorts of problems, all using a common infrastructure and a common language to describe things.

It is best to start with very simple models and move to larger ones later. You can get started on your own personal computer and still tackle very elaborate tasks. In the IBM Data Science Experience you have everything you need, for free, to start experimenting with deep learning technologies. Here is a summary of the most popular deep learning Python libraries and tutorials:

  • Theano: A low-level library that specializes in efficient computation. You'll only use it directly if you need fine-grained customization and flexibility. → Tutorial
  • TensorFlow: Another low-level library, less mature than Theano. However, it's supported by Google and offers out-of-the-box distributed computing. → Tutorial
  • Keras: A high-level wrapper for both Theano and TensorFlow. It's minimalistic, modular, and great for rapid experimentation. This is our favorite Python library for deep learning and the best place to start for beginners. → Tutorial
  • Lasagne: A lightweight wrapper for Theano. Use it if you need the flexibility of Theano but don't want to always write neural network layers from scratch. → Tutorial
  • MXNet: Another high-level library similar to Keras. It offers bindings for multiple languages and support for distributed computing. → Tutorial
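Whichever library you pick, they all automate the same core arithmetic: layers are matrix multiplications followed by nonlinearities. A minimal two-layer forward pass in plain NumPy (with hypothetical toy sizes, and no training loop) shows the computation these libraries optimize and differentiate for you:

```python
# The arithmetic the libraries above automate: a tiny two-layer
# network's forward pass in plain NumPy. Sizes are illustrative only.
import numpy as np

rng = np.random.RandomState(0)
x = rng.rand(5, 4)                       # batch of 5 inputs, 4 features each

W1, b1 = rng.randn(4, 8), np.zeros(8)    # hidden layer: 4 -> 8 units
W2, b2 = rng.randn(8, 3), np.zeros(3)    # output layer: 8 -> 3 classes

hidden = np.maximum(0, x @ W1 + b1)      # ReLU activation
logits = hidden @ W2 + b2
# softmax turns logits into class probabilities
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)

print(probs.shape)                       # (5, 3)
```

Training means computing gradients of a loss with respect to `W1, b1, W2, b2` and updating them; doing that by hand is exactly the tedious, error-prone work that Theano, TensorFlow, and friends take off your plate.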

