LICENSE
Deep Learning Tutorials
Getting Started
    Download
    Datasets
    Notation
    A Primer on Supervised Optimization for Deep Learning
    Theano/Python Tips
Classifying MNIST digits using Logistic Regression
    The Model
    Defining a Loss Function
    Creating a LogisticRegression class
    Learning the Model
    Testing the Model
    Putting it All Together
Multilayer Perceptron
    The Model
    Going from logistic regression to MLP
    Putting it All Together
    Tips and Tricks for training MLPs
Convolutional Neural Networks (LeNet)
    Motivation
    Sparse Connectivity
    Shared Weights
    Details and Notation
    The Convolution Operator
    MaxPooling
    The Full Model: LeNet
    Putting it All Together
    Running the Code
    Tips and Tricks
Denoising Autoencoders (dA)
    Autoencoders
    Denoising Autoencoders
    Putting it All Together
    Running the Code
Stacked Denoising Autoencoders (SdA)
    Stacked Autoencoders
    Putting it All Together
    Running the Code
    Tips and Tricks
Restricted Boltzmann Machines (RBM)
    Energy-Based Models (EBM)
    Restricted Boltzmann Machines (RBM)
    Sampling in an RBM
    Implementation
    Results
Deep Belief Networks
    Deep Belief Networks
    Justifying Greedy Layer-Wise Pre-Training
    Implementation
    Putting it All Together
    Running the Code
    Tips and Tricks
Hybrid Monte-Carlo Sampling
    Theory
    Implementing HMC Using Theano
    Testing our Sampler
    References
Recurrent Neural Networks with Word Embeddings
    Summary
    Code - Citations - Contact
    Task
    Dataset
    Recurrent Neural Network Model
    Evaluation
    Training
    Running the Code
LSTM Networks for Sentiment Analysis
    Summary
    Data
    Model
    Code - Citations - Contact
    References
Modeling and generating sequences of polyphonic music with the RNN-RBM
    The RNN-RBM
    Implementation
    Results
    How to improve this code
Miscellaneous
    Plotting Samples and Filters
References
Bibliography
Index