Word Embeddings and Representations
Techniques for representing words and contexts in vector spaces.
Chapter 14: Static and Contextual Embeddings
- Static: Word2Vec (CBOW, Skip-gram), GloVe, FastText
- Contextual: ELMo, ULMFiT
- [Negative sampling, hierarchical softmax, transfer learning] (see the training sketch after this outline)

Chapter 15: Embedding Evaluation
- Intrinsic: Word similarity, analogy tasks
- Extrinsic: Downstream NLP performance
- [Cosine similarity, Spearman correlation, task-specific benchmarks] (see the evaluation sketch after this outline)
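To ground Chapter 14's key concepts, here is a minimal sketch of skip-gram training with negative sampling. It is not the original Word2Vec implementation: the toy corpus, the hyperparameters (16 dimensions, 3 negatives, window of 2), and the uniform negative sampler are all illustrative assumptions; production Word2Vec draws negatives from a unigram^0.75 distribution over a large corpus.

```python
# Toy skip-gram with negative sampling (illustrative sketch, not Word2Vec itself).
# Each (center, context) pair is pushed together; K random "negative" words
# are pushed apart, avoiding the full-vocabulary softmax.
import numpy as np

rng = np.random.default_rng(0)

corpus = "the quick brown fox jumps over the lazy dog".split()  # toy corpus
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D, K, lr = len(vocab), 16, 3, 0.05  # vocab size, dims, negatives, step size

W_in = rng.normal(scale=0.1, size=(V, D))   # center-word ("input") vectors
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word ("output") vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for epoch in range(200):
    for pos, word in enumerate(corpus):
        c = word2id[word]
        for off in (-2, -1, 1, 2):          # context window of radius 2
            if not 0 <= pos + off < len(corpus):
                continue
            o = word2id[corpus[pos + off]]
            # Positive pair: maximize log sigmoid(v_c . u_o).
            s_pos = sigmoid(W_in[c] @ W_out[o])
            g_in = (s_pos - 1.0) * W_out[o]            # grad w.r.t. v_c
            W_out[o] = W_out[o] - lr * (s_pos - 1.0) * W_in[c]
            # Negative pairs: minimize sigmoid(v_c . u_n). Real Word2Vec
            # samples from unigram^0.75 and skips the true context word.
            for n in rng.integers(0, V, size=K):
                s_neg = sigmoid(W_in[c] @ W_out[n])
                g_in = g_in + s_neg * W_out[n]
                W_out[n] = W_out[n] - lr * s_neg * W_in[c]
            W_in[c] = W_in[c] - lr * g_in

print("embedding for 'fox' (first 4 dims):", W_in[word2id["fox"]][:4])
```

Hierarchical softmax is the alternative Chapter 14 names for the same bottleneck: instead of sampling negatives, it scores a path through a binary tree over the vocabulary, costing O(log V) per update.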
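Chapter 15's intrinsic metrics are simple enough to sketch directly. The snippet below assumes hypothetical names throughout: `embeddings` stands in for any trained model (such as the one above, here replaced by random placeholder vectors), the human similarity ratings are made-up placeholders rather than a real benchmark, and `analogy` implements the standard 3CosAdd rule (b - a + c, nearest neighbor by cosine). It requires scipy for `spearmanr`.

```python
# Intrinsic evaluation sketch: cosine similarity, 3CosAdd analogies, and
# Spearman rank correlation against human similarity judgments.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
words = ["king", "queen", "man", "woman", "dog", "cat"]
embeddings = {w: rng.normal(size=16) for w in words}  # placeholder vectors

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy(a, b, c):
    """Solve a : b :: c : ? by 3CosAdd, excluding the query words."""
    target = embeddings[b] - embeddings[a] + embeddings[c]
    candidates = [w for w in words if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(embeddings[w], target))

# Word-similarity benchmark: rank-correlate model scores with human ratings.
pairs = [("king", "queen"), ("man", "woman"), ("dog", "cat"), ("king", "dog")]
human = [8.5, 8.3, 7.9, 2.1]  # placeholder ratings on a 0-10 scale
model = [cosine(embeddings[a], embeddings[b]) for a, b in pairs]
rho, _ = spearmanr(human, model)

print("king : queen :: man :", analogy("king", "queen", "man"))
print("Spearman rho vs. human ratings:", round(rho, 3))
```

With random vectors the outputs are meaningless; the point is the recipe. Extrinsic evaluation, by contrast, plugs the embeddings into a downstream task (tagging, classification) and measures task performance directly.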