#ml
Somebody explain this to me: word embedding vectors not only place words geometrically close to each other based on meaning, but also encode concepts as directions, so that difference vectors between words are themselves semantically meaningful, the classic example being king − man + woman ≈ queen? Is that right, and how on earth does that happen?
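Here's the kind of thing I mean, as a quick sketch you can run yourself. It uses gensim's pretrained GloVe vectors via its downloader (the model name and the `most_similar` API are gensim's; the specific word picks are just illustrative, and the first run downloads the vectors):

```python
# Minimal sketch: probe a pretrained embedding space with gensim.
import gensim.downloader as api

# 50-dimensional GloVe vectors trained on Wikipedia + Gigaword.
vectors = api.load("glove-wiki-gigaword-50")

# Nearest neighbors cluster by meaning.
print(vectors.most_similar("france", topn=3))

# Difference vectors carry relations: king - man + woman ≈ queen.
print(vectors.most_similar(positive=["king", "woman"], negative=["man"], topn=3))

# The same trick with a different offset, roughly "capital of":
# paris - france + japan ≈ tokyo.
print(vectors.most_similar(positive=["paris", "japan"], negative=["france"], topn=3))
```

And it really does tend to spit out queen and tokyo near the top, which is what baffles me.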