Machine Learning with TensorFlow MEAP v10
Copyright
Welcome
Brief contents
Chapter 1: A machine-learning odyssey
1.1 Machine learning fundamentals
1.2 Data representation and features
1.3 Distance metrics
1.4 Types of learning
1.4.1 Supervised learning
1.4.2 Unsupervised learning
1.4.3 Reinforcement learning
1.5 Existing tools
1.5.1 Theano
1.5.2 Caffe
1.5.3 Torch
1.5.4 Computational Graph Toolkit
1.6 TensorFlow
1.7 Overview of future chapters
1.8 Summary
Chapter 2: TensorFlow essentials
2.1 Ensuring TensorFlow works
2.2 Representing tensors
2.3 Creating operators
2.4 Executing operators with sessions
2.4.1 Understanding code as a graph
2.4.2 Session configurations
2.5 Writing code in Jupyter
2.6 Using variables
2.7 Saving and loading variables
2.8 Visualizing data using TensorBoard
2.8.1 Implementing a moving average
2.8.2 Visualizing the moving average
2.9 Summary
Chapter 3: Linear regression and beyond
3.1 Formal notation
3.1.1 How do you know the regression algorithm is working?
3.2 Linear regression
3.3 Polynomial model
3.4 Regularization
3.5 Application of linear regression
3.6 Summary
Chapter 4: A gentle introduction to classification
4.1 Formal notation
4.2 Measuring performance
4.2.1 Accuracy
4.2.2 Precision and recall
4.2.3 Receiver operating characteristic curve
4.3 Using linear regression for classification
4.4 Using logistic regression
4.4.1 Solving one-dimensional logistic regression
4.4.2 Solving two-dimensional logistic regression
4.5 Multiclass classifier
4.5.1 One versus all
4.5.2 One versus one
4.5.3 Softmax regression
4.6 Application of classification
4.7 Summary
Chapter 5: Automatically clustering data
5.1 Traversing files in TensorFlow
5.2 Extracting features from audio
5.3 K-means clustering
5.4 Audio segmentation
5.5 Clustering using a self-organizing map
5.6 Application of clustering
5.7 Summary
Chapter 6: Hidden Markov models
6.1 Example of a not-so-interpretable model
6.2 Markov model
6.3 Hidden Markov model
6.4 Forward algorithm
6.5 Viterbi decode
6.6 Uses of hidden Markov models
6.6.1 Modeling a video
6.6.2 Modeling DNA
6.6.3 Modeling an image
6.7 Application of hidden Markov models
6.8 Summary
Chapter 7: A peek into autoencoders
7.1 Neural networks
7.2 Autoencoder
7.3 Batch training
7.4 Working with images
7.5 Application of autoencoders
7.6 Summary
Chapter 8: Reinforcement learning
8.1 Formal notions
8.1.1 Policy
8.1.2 Utility
8.2 Applying reinforcement learning
8.3 Implementation
8.4 Applications of reinforcement learning
8.5 Summary
Chapter 9: Convolutional neural networks
9.1 Drawback of neural networks
9.2 Convolutional neural networks
9.3 Preparing the image
9.3.1 Generate filters
9.3.2 Convolve using filters
9.3.3 Max-pooling
9.4 Implementing a convolutional neural network in TensorFlow
9.4.1 Measuring performance
9.4.2 Training the classifier
9.5 Tips and tricks to improve performance
9.6 Application of convolutional neural networks
9.7 Summary
Chapter 10: Recurrent neural networks
10.1 Contextual information
10.2 Introduction to recurrent neural networks
10.3 Implementing a recurrent neural network
10.4 A predictive model for time-series data
10.5 Application of recurrent neural networks
10.6 Summary
Chapter 11: Sequence-to-sequence models for chatbots
11.1.1 Classification
11.1.2 Recurrent neural networks
11.1.3 Classification and RNNs
11.2 Seq-to-seq architecture
11.3 Vector representation of symbols
11.4 Putting it all together
11.5 Gathering dialogue data
11.6 Summary
Chapter 12: Utility landscape
12.1 Preference model
12.2 Image embedding
12.3 Ranking images
12.4 Summary
12.5 What's next?
Appendix A: Installation
A.1 Installing TensorFlow using Docker
A.1.1 Install Docker on Windows
A.1.2 Install Docker on Linux
A.1.3 Install Docker on OSX
A.1.4 How to use Docker
A.2 Installing Matplotlib