The Inner Workings of word2vec
By Chris McCormick
It is my earnest desire that the information in this book be as correct as possible; however, I cannot make any guarantees. This is an evolving book about an evolving technology in an evolving field; there are going to be mistakes! So here's my disclaimer:

The author does not assume and hereby disclaims any liability to any party for any loss, damage, or disruption caused by errors or omissions, whether such errors or omissions result from negligence, accident, or any other cause.

Copyright © 2019 by Chris McCormick
All rights reserved.
Edition: v1.3.1
Contents

Introduction
1. Word Vectors & Their Applications
1.1. What’s a Word Vector?
1.2. Feature Vectors & Similarity Scores
1.3. Example Code Summary
2. Skip-gram Model Architecture
2.1. The Fake Task
2.2. Model Details
2.3. The Hidden Layer
2.4. The Output Layer
2.5. Intuition
2.6. Next Up
2.7. Example Code Summary
3. Sampling Techniques
3.1. Performance Problems
3.2. Subsampling Frequent Words
3.3. Context Position Weighting
3.4. Negative Sampling
3.5. Example Code Summary
4. Model Variations
4.1. Continuous Bag-of-Words (CBOW)
4.2. Hierarchical Softmax
4.3. Practical Differences
5. Bonus #1 - FAQ
5.1. What are the explanations for the names “Continuous Bag-of-Words” and “Skip-gram”?
5.2. How do you build a vocabulary that includes multi-word names and phrases?
5.3. Does the position of a word within the context window matter in training?
5.4. How can the probabilities sum to one if certain words always appear together?
6. Bonus #2 - Resources
6.1. Original Papers & Code
6.2. Understanding the Math
6.3. Survey of Implementations
Introduction

Welcome to my word2vec eBook! Whether you are a student learning important machine learning concepts, a researcher exploring new techniques and ideas, or an engineer with a vision to build a new product or feature, my hope is that the content in this guide will help you gain a deeper understanding of the algorithm, and equip you to realize your own goals faster and with better results.

Here is an overview of the content you’ll find in this book.

Chapter 1 - Word Vectors & Their Applications
● This chapter will answer the questions “what is a word vector?” and “how are word vectors useful?” I’ll explain how word vectors can be used to measure how similar two words are in meaning (a short code sketch follows this overview), and the value this has across a number of applications. You may skip this chapter if you are already familiar with the motivations and uses for word vectors.

Chapter 2 - Skip-gram Model Architecture
● After learning why word vectors are valuable, Chapter 2 will address how (both conceptually and in implementation) the word2vec approach is able to learn and encode the meaning of a word.

Chapter 3 - Sampling Techniques
● The architecture described in Chapter 2 is good in concept but prohibitively expensive in practice. Negative Sampling is a slight modification to the training process which is both dramatically faster and produces higher-quality results.
Chapter 4 - Model Variations
● For completeness, Chapter 4 describes the Continuous Bag-of-Words (CBOW) architecture (an alternative to the skip-gram architecture which was also presented in the original word2vec paper), and Hierarchical Softmax (an alternative to Negative Sampling).

Chapter 5 - FAQ
● The FAQ section addresses some common questions (and some common sources of confusion!) around word2vec.

Chapter 6 - Resources
● This section points to further helpful resources:
1. The original papers and implementation.
2. Articles which explain the mathematical formulation.
3. A brief survey of some popular implementations of word2vec.

Example Code
● There is Python example code to go along with most chapters of this eBook. At the end of each chapter, you’ll find a summary of the corresponding example code.
○ We provide the code as Jupyter Notebooks (which are great for intermixing code with explanations).
○ We also include a read-only HTML version of each Notebook if you prefer to simply read them or to copy and paste the code to run in your favorite Python environment.
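As a quick preview of the similarity measurement that Chapter 1 discusses, here is a minimal sketch of cosine similarity, a standard way to score how alike two word vectors are. This is not the book's example code; the words, vectors, and values below are invented purely for illustration.

import numpy as np

def cosine_similarity(a, b):
    """Return a score in [-1, 1]; higher means the two vectors
    (and so, roughly, the two words) point in more similar directions."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Toy 3-dimensional vectors, made up for this example only.
# Real word2vec vectors typically have 100 to 300 dimensions.
king  = np.array([0.9, 0.8, 0.1])
queen = np.array([0.8, 0.9, 0.2])
apple = np.array([0.1, 0.2, 0.9])

print(cosine_similarity(king, queen))  # high score: similar meanings
print(cosine_similarity(king, apple))  # low score: unrelated meanings

Cosine similarity compares only the directions of two vectors, not their lengths, which is why it is such a common choice for comparing word vectors.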
Feedback
● I want your feedback! If you have questions, if you spot any mistakes, if you have suggestions for additional content, or if you have any other feedback, please drop a note in the comments here!
1. Word Vectors & Their Applications