Yoshua Bengio, Ian Goodfellow, and Aaron Courville are writing a deep learning book for MIT Press. The book is not yet complete, but drafts of all the chapters are available online. The authors are also collecting comments on the chapters before the book goes to press.
The book is broken into three sections:
- Math and Machine Learning Fundamentals
- Modern Deep Neural Networks
- Current Research in Deep Learning
The book is very technical and probably best suited to a graduate-level course. If you have the time and interest, though, resources like this are highly valuable.
The National Institute of Standards and Technology (NIST) is attempting to create standards for Big Data. They just released the NIST Big Data interoperability framework, which is a huge set of documents aimed at creating standards around everything in big data from definitions to architectures.
Big Data Definitions
In case you are wondering (and I know you are), here are the two key definitions. The framework includes many more.
Big Data consists of extensive datasets – primarily in the characteristics of volume, variety, velocity, and/or variability – that require a scalable architecture for efficient storage, manipulation, and analysis.
Data science is the empirical synthesis of actionable knowledge from raw data through the complete data lifecycle process.
Don’t like the definitions? Great, NIST would love to hear your opinions/comments. Comments are being collected until May 21, 2015.
The NIST Big Data interoperability framework is a massive work consisting of 7 volumes. All are open for comments.
- Definitions
- Taxonomies
- Use Cases & Requirements
- Security and Privacy
- Architectures White Paper Survey
- Reference Architecture
- Standards Roadmap
The process for submitting a comment appears rather old-school (hint: NIST, GitHub might be a good place to collect comments and edits), but it is not difficult.