Preface
Conventions Used in This Book
Using Code Examples
O’Reilly Safari
How to Contact Us
Acknowledgments
1. Introduction
The Supervised Learning Paradigm
Observation and Target Encoding
One-Hot Representation
TF Representation
TF-IDF Representation
Target Encoding
Computational Graphs
PyTorch Basics
Installing PyTorch
Creating Tensors
Tensor Types and Size
Tensor Operations
Indexing, Slicing, and Joining
Tensors and Computational Graphs
CUDA Tensors
Exercises
Solutions
Summary
References
2. A Quick Tour of Traditional NLP
Corpora, Tokens, and Types
Unigrams, Bigrams, Trigrams, …, N-grams
Lemmas and Stems
Categorizing Sentences and Documents
Categorizing Words: POS Tagging
Categorizing Spans: Chunking and Named Entity Recognition
Structure of Sentences
Word Senses and Semantics
Summary
References
3. Foundational Components of Neural Networks
The Perceptron: The Simplest Neural Network
Activation Functions
Sigmoid
Tanh
ReLU
Softmax
Loss Functions
Mean Squared Error Loss
Categorical Cross-Entropy Loss
Binary Cross-Entropy Loss
Diving Deep into Supervised Training
Constructing Toy Data
Putting It Together: Gradient-Based Supervised Learning
Auxiliary Training Concepts
Correctly Measuring Model Performance: Evaluation Metrics
Correctly Measuring Model Performance: Splitting the Dataset
Knowing When to Stop Training
Finding the Right Hyperparameters
Regularization
Example: Classifying Sentiment of Restaurant Reviews
The Yelp Review Dataset
Understanding PyTorch’s Dataset Representation
The Vocabulary, the Vectorizer, and the DataLoader
A Perceptron Classifier
The Training Routine
Evaluation, Inference, and Inspection
Summary
References
4. Feed-Forward Networks for Natural Language Processing
The Multilayer Perceptron
A Simple Example: XOR
Implementing MLPs in PyTorch
Example: Surname Classification with an MLP
The Surnames Dataset
Vocabulary, Vectorizer, and DataLoader
The SurnameClassifier Model
The Training Routine
Model Evaluation and Prediction
Regularizing MLPs: Weight Regularization and Structural Regularization (or Dropout)
Convolutional Neural Networks
CNN Hyperparameters
Implementing CNNs in PyTorch
Example: Classifying Surnames by Using a CNN
The SurnameDataset Class
Vocabulary, Vectorizer, and DataLoader
Reimplementing the SurnameClassifier with Convolutional Networks
The Training Routine
Model Evaluation and Prediction
Miscellaneous Topics in CNNs
Pooling
Batch Normalization (BatchNorm)
Network-in-Network Connections (1x1 Convolutions)
Residual Connections/Residual Block
Summary
References
5. Embedding Words and Types
Why Learn Embeddings?
Efficiency of Embeddings
Approaches to Learning Word Embeddings
The Practical Use of Pretrained Word Embeddings
Example: Learning the Continuous Bag of Words Embeddings
The Frankenstein Dataset
Vocabulary, Vectorizer, and DataLoader
The CBOWClassifier Model
The Training Routine
Model Evaluation and Prediction
Example: Transfer Learning Using Pretrained Embeddings for Document Classification
The AG News Dataset
Vocabulary, Vectorizer, and DataLoader
The NewsClassifier Model
The Training Routine
Model Evaluation and Prediction
Evaluating on the test dataset
Summary
References
6. Sequence Modeling for Natural Language Processing
Introduction to Recurrent Neural Networks
Implementing an Elman RNN
Example: Classifying Surname Nationality Using a Character RNN
The SurnameDataset Class
The Vectorization Data Structures
The SurnameClassifier Model
The Training Routine and Results
Summary
References
7. Intermediate Sequence Modeling for Natural Language Processing
The Problem with Vanilla RNNs (or Elman RNNs)
Gating as a Solution to a Vanilla RNN’s Challenges
Example: A Character RNN for Generating Surnames
The SurnameDataset Class
The Vectorization Data Structures
From the ElmanRNN to the GRU
Model 1: The Unconditioned SurnameGenerationModel
Model 2: The Conditioned SurnameGenerationModel
The Training Routine and Results
Tips and Tricks for Training Sequence Models
References
8. Advanced Sequence Modeling for Natural Language Processing
Sequence-to-Sequence Models, Encoder–Decoder Models, and Conditioned Generation
Capturing More from a Sequence: Bidirectional Recurrent Models
Capturing More from a Sequence: Attention
Attention in Deep Neural Networks
Evaluating Sequence Generation Models
Example: Neural Machine Translation
The Machine Translation Dataset
A Vectorization Pipeline for NMT
Encoding and Decoding in the NMT Model
The Training Routine and Results
Summary
References
9. Classics, Frontiers, and Next Steps
What Have We Learned so Far?
Timeless Topics in NLP
Dialogue and Interactive Systems
Discourse
Information Extraction and Text Mining
Document Analysis and Retrieval
Frontiers in NLP
Design Patterns for Production NLP Systems
Where Next?
References
Index
Natural Language Processing with PyTorch
Build Intelligent Language Applications Using Deep Learning
Delip Rao and Brian McMahan
Natural Language Processing with PyTorch
by Delip Rao and Brian McMahan

Copyright © 2019 Delip Rao and Brian McMahan. All rights reserved.
Printed in the United States of America.
Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com/safari). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisition Editor: Rachel Roumeliotis
Development Editor: Jeff Bleiel
Production Editor: Nan Barber
Copyeditor: Octal Publishing, LLC
Proofreader: Rachel Head
Indexer: Judy McConville
Interior Designer: David Futato
Cover Designer: Karen Montgomery
Illustrator: Rebecca Demarest

February 2019: First Edition

Revision History for the First Edition
2019-01-16: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781491978238 for release details.

The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Natural Language Processing with PyTorch, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.

The views expressed in this work are those of the authors, and do not represent the publisher's views. While the publisher and the authors have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the authors disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

978-1-491-97823-8
[LSI]
Preface

This book aims to bring newcomers to natural language processing (NLP) and deep learning to a tasting table covering important topics in both areas. Both of these subject areas are growing exponentially. As it introduces both deep learning and NLP with an emphasis on implementation, this book occupies an important middle ground. While writing the book, we had to make difficult, and sometimes uncomfortable, choices about what material to leave out. For the beginner reader, we hope the book will provide a strong foundation in the basics and a glimpse of what is possible. Machine learning, and deep learning in particular, is an experiential discipline, as opposed to an intellectual science. The generous end-to-end code examples in each chapter invite you to partake in that experience.

When we began working on the book, we started with PyTorch 0.2. The examples were revised with each PyTorch update from 0.2 to 0.4. PyTorch 1.0 is due to be released around when this book comes out. The code examples in the book are PyTorch 0.4–compliant and should work as they are with the upcoming PyTorch 1.0 release.

A note regarding the style of the book: we have intentionally avoided mathematics in most places, not because deep learning math is particularly difficult (it is not), but because in many situations it is a distraction from the main goal of this book—to empower the beginner learner. Likewise, in many cases, both in code and text, we have favored exposition over succinctness. Advanced readers and experienced programmers will likely see ways to tighten up the code and so on, but our choice was to be as explicit as possible so as to reach the broadest audience we can.
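On the note above about versions: as a quick sanity check before running the chapter examples, you can print the PyTorch version installed in your environment. The following snippet is our illustration, not part of the book's example code:

    # Sanity check (illustrative, not from the book's code repository):
    # confirm the installed PyTorch version before running the examples.
    import torch

    print(torch.__version__)          # the examples target 0.4; 1.0 should also work
    print(torch.cuda.is_available())  # True if a CUDA-capable GPU is usable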
Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
Indicates new terms, URLs, email addresses, filenames, and file extensions.

Constant width
Used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.

Constant width bold
Shows commands or other text that should be typed literally by the user.

Constant width italic
Shows text that should be replaced with user-supplied values or by values determined by context.

TIP
This element signifies a tip or suggestion.

NOTE
This element signifies a general note.

WARNING
This element indicates a warning or caution.

Using Code Examples

Supplemental material (code examples, exercises, etc.) is available for download at https://nlproc.info/PyTorchNLPBook/repo/.

This book is here to help you get your job done. In general, if example code is offered with this book, you may use it in your programs and documentation. You do not need to contact us for permission unless you're reproducing a significant portion of the code. For example, writing a program that uses several chunks of code from this book does not require permission. Selling or distributing a CD-ROM of examples from O'Reilly books does require permission. Answering a question by citing this book and quoting example code does not require permission. Incorporating a significant amount of example code from this book into your product's documentation does require permission.

We appreciate, but do not require, attribution. An attribution usually includes the title, author, publisher, and ISBN. For example: "Natural Language Processing with PyTorch by Delip Rao and Brian McMahan (O'Reilly). Copyright 2019, Delip Rao and Brian McMahan, 978-1-491-97823-8."

If you feel your use of code examples falls outside fair use or the permission given above, feel free to contact us at permissions@oreilly.com.

O'Reilly Safari

Safari (formerly Safari Books Online) is a membership-based training and reference platform for enterprise, government, educators, and individuals.

Members have access to thousands of books, training videos, Learning Paths, interactive tutorials, and curated playlists from over 250 publishers, including O'Reilly Media, Harvard Business Review, Prentice Hall Professional, Addison-Wesley Professional, Microsoft Press, Sams, Que, Peachpit Press, Adobe, Focal Press, Cisco Press, John Wiley & Sons, Syngress, Morgan Kaufmann, IBM Redbooks, Packt, Adobe Press, FT Press, Apress, Manning, New Riders, McGraw-Hill, Jones & Bartlett, and Course Technology, among others.

For more information, please visit http://oreilly.com/safari.

How to Contact Us

Please address comments and questions concerning this book to the publisher:

O'Reilly Media, Inc.
1005 Gravenstein Highway North
Sebastopol, CA 95472
800-998-9938 (in the United States or Canada)
707-829-0515 (international or local)
707-829-0104 (fax)

We have a web page for this book, where we list errata, examples, and any additional information. You can access this page at http://bit.ly/nlprocbk.

To comment or ask technical questions about this book, send email to bookquestions@oreilly.com.

For more information about our books, courses, conferences, and news, see our website at http://www.oreilly.com.

Find us on Facebook: http://facebook.com/oreilly
Follow us on Twitter: http://twitter.com/oreillymedia
Watch us on YouTube: http://www.youtube.com/oreillymedia

Acknowledgments

This book has gone through an evolution of sorts, with each version of the book looking unlike the version before. Different folks (and even different DL frameworks) were involved in each version.

The authors want to thank Goku Mohandas for his initial involvement in the book. Goku brought a lot of energy to the project before he had to leave for work reasons. Goku's enthusiasm for PyTorch and his positivity are unmatched, and the authors missed his presence. We expect great things from him!

The book would not be in top technical form if not for the kind yet high-quality feedback from our technical reviewers, Liling Tan and Debasish Ghosh. Liling contributed his expertise in developing products with state-of-the-art NLP, while Debasish gave highly valuable feedback from the perspective of the developer audience. We are also grateful for the encouragement from Alfredo Canziani, Soumith Chintala, and the many other amazing folks on the PyTorch