Embeddings & Tokens (4/5) – the geometric interpretation of meaning.
Embeddings encode semantic relationships: "king − man + woman ≈ queen". Many analogies can be solved with simple vector addition and subtraction, taking the nearest neighbor of the result (and excluding the input words).
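The vector arithmetic can be sketched in a few lines. Note that the 2-D embeddings below are hand-built toy values for illustration, not vectors from a trained model, and the `analogy` helper is a hypothetical name:

```python
from math import sqrt

# Toy 2-D embeddings (hypothetical, hand-built for illustration):
# axis 0 ≈ "maleness", axis 1 ≈ "royalty".
EMBEDDINGS = {
    "king":  [1.0, 1.0],
    "queen": [-1.0, 1.0],
    "man":   [1.0, 0.0],
    "woman": [-1.0, 0.0],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def analogy(a, b, c):
    """Solve a - b + c ≈ ? by nearest cosine neighbor, excluding the inputs."""
    target = [x - y + z for x, y, z in zip(EMBEDDINGS[a], EMBEDDINGS[b], EMBEDDINGS[c])]
    candidates = {w: v for w, v in EMBEDDINGS.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(target, candidates[w]))

print(analogy("king", "man", "woman"))  # → queen
```

Excluding the input words from the candidate set matters: in real trained embeddings the raw result vector often lies closest to one of the inputs themselves.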
Analogy tests suggest that embeddings capture genuine semantic structure – not just word frequencies. This property underpins transfer learning and zero-shot generalization.