Here are some great resources to kick-start your deep learning journey.
Geoffrey Hinton recently did a Reddit AMA (Ask Me Anything). It contains some good information related to neural networks and deep learning.
Also, below is a lengthy video of a distinguished lecture Hinton gave at the Toyota Technological Institute at Chicago.
Deep learning is the hottest topic in data science right now. Adam Gibson, cofounder of Blix.io, has created an open-source deep learning library for Java named DeepLearning4j. For those curious, the DeepLearning4j source code is available on GitHub.
Below is a video of Adam introducing deep learning and DeepLearning4j. Also, if you are interested in learning more about deep learning, here are a couple more very helpful links.
Chapter 1 of Michael Nielsen's book, Neural Networks and Deep Learning, is available online. The chapter provides an introduction to neural networks.
When completed, the book will be completely free and open-source. You are welcome to contribute to the fundraising efforts for the book.
Recently, MIT Technology Review ran an article about the new uses of deep learning at Facebook. Facebook would like to use deep learning to understand more about its users. They have assembled quite a team.
If you are looking to learn more about deep learning, Andrew Ng, cofounder of Coursera, has some course materials on deep learning available on the Stanford Openclassroom site. The materials appear incomplete, but they do provide lectures covering neural networks which are the foundations of deep learning.
Deep Learning is a new term that is starting to appear in the data science/machine learning news.
What is Deep Learning?
According to DeepLearning.net, the definition goes like this:
Deep Learning is a new area of Machine Learning research, which has been introduced with the objective of moving Machine Learning closer to one of its original goals: Artificial Intelligence.
Wikipedia provides the following definition:
Deep learning is a set of algorithms in machine learning that attempt to learn layered models of inputs, commonly neural networks. The layers in such models correspond to distinct levels of concepts, where higher-level concepts are defined from lower-level ones, and the same lower-level concepts can help to define many higher-level concepts.
Deep learning is sometimes referred to as deep neural networks, since much of deep learning focuses on artificial neural networks. Artificial neural networks are a computing technique modelled after the connections (synapses) between neurons in the brain. Artificial neural networks, sometimes just called neural nets, have been around for about 50 years, but advances in processing power and storage are finally allowing neural nets to improve solutions for complex problems such as speech recognition, computer vision, and natural language processing (NLP).
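To make the "layered models" idea concrete, here is a minimal sketch of a feedforward neural network in Python with NumPy. All of the names, layer sizes, and random weights below are illustrative assumptions, not code from any of the libraries mentioned above; a real network would learn its weights from data rather than use random ones.

```python
import numpy as np

def sigmoid(x):
    """Squash values into (0, 1), a classic neural-net activation."""
    return 1.0 / (1.0 + np.exp(-x))

# Illustrative layer sizes: 4 inputs -> 3 hidden units -> 1 output.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))  # input-to-hidden weights (random, untrained)
b1 = np.zeros(3)
W2 = rng.normal(size=(1, 3))  # hidden-to-output weights
b2 = np.zeros(1)

def forward(x):
    # Each layer transforms the previous layer's output, so later
    # layers represent higher-level features built from lower-level ones.
    hidden = sigmoid(W1 @ x + b1)
    output = sigmoid(W2 @ hidden + b2)
    return output

x = np.array([1.0, 0.0, -1.0, 0.5])
print(forward(x).shape)  # a single output value, shape (1,)
```

A "deep" network simply stacks more hidden layers between input and output; training adjusts the weight matrices so those layers learn useful intermediate representations.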
Hopefully, this blog post provides some inspiration and useful links to help you learn more about deep learning.
How is Deep Learning being applied?
The following talk, Tera-scale Deep Learning, by Quoc V. Le of Stanford gives some indication of the scale of problems being tackled. The talk discusses work done on a cluster of 2,000 machines training models with more than 1 billion parameters.