Deep Learning Summer School, Montreal 2016 is aimed at graduate students, industrial engineers, and researchers who already have some basic knowledge of machine learning (and possibly, but not necessarily, of deep learning) and who wish to learn more about this rapidly growing field of research. If that is you, there are plenty of videos to help you learn more.
Recently, MIT Technology Review ran an article about the new uses of deep learning at Facebook. Facebook would like to use deep learning to understand more about its users. They have assembled quite a team.
If you are looking to learn more about deep learning, Andrew Ng, cofounder of Coursera, has some course materials on deep learning available on the Stanford Openclassroom site. The materials appear incomplete, but they do provide lectures covering neural networks, which are the foundation of deep learning.
Deep learning is a set of algorithms in machine learning that attempt to learn layered models of inputs, commonly neural networks. The layers in such models correspond to distinct levels of concepts, where higher-level concepts are defined from lower-level ones, and the same lower-level concepts can help to define many higher-level concepts.
Deep learning is sometimes referred to as "deep neural networks," since much of deep learning focuses on artificial neural networks. Artificial neural networks are a technique in computer science modelled after the connections (synapses) between neurons in the brain. Artificial neural networks, sometimes just called neural nets, have been around for about 50 years, but advances in computer processing power and storage are finally allowing neural nets to improve solutions for complex problems such as speech recognition, computer vision, and natural language processing (NLP).
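To make the "layers of concepts" idea concrete, here is a minimal sketch of a feedforward network in plain Python. The weights, sizes, and inputs are made up purely for illustration; each unit takes a weighted sum of its inputs (the analogue of synaptic strengths) and passes it through a sigmoid activation, and the higher layer reuses the lower layer's outputs as its inputs.

```python
import math

def layer(inputs, weights, biases):
    """One network layer: each unit computes a weighted sum of all
    inputs plus a bias, then squashes it with a sigmoid to (0, 1)."""
    return [
        1.0 / (1.0 + math.exp(-(sum(w * x for w, x in zip(ws, inputs)) + b)))
        for ws, b in zip(weights, biases)
    ]

# Hypothetical toy network: 3 raw inputs -> 2 lower-level features
# -> 2 higher-level concepts. All numbers are illustrative only.
x = [0.5, -1.0, 2.0]
low = layer(x,
            weights=[[0.1, 0.4, -0.2], [0.3, -0.5, 0.8]],
            biases=[0.0, 0.1])
# Both higher-level units are built from the SAME lower-level features,
# mirroring how shared low-level concepts define many high-level ones.
high = layer(low,
             weights=[[1.0, -1.0], [0.5, 0.5]],
             biases=[-0.2, 0.0])
```

A real deep network would have many more layers and units, and its weights would be learned from data (e.g. by backpropagation) rather than written by hand, but the layered composition is the same idea.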
Hopefully, this blog post provides some inspiration and useful links to help you learn more about deep learning.
How is Deep Learning being applied?
The following talk, Tera-scale Deep Learning, by Quoc V. Le of Stanford gives some indication of the size of problems being tackled. The talk discusses work done on a cluster of 2,000 machines training models with more than 1,000,000,000 parameters.