#ml
Somebody explain this to me: word embedding vectors not only place words geometrically close to each other based on meaning, but also encode concepts as directions, so that difference vectors are semantically meaningful across the space? Is that right, and how on earth does that happen?
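Yes, that's roughly right, and the usual intuition is that these models are trained to predict co-occurrence statistics, which factor approximately linearly, so a consistent relation (like gender or royalty) ends up as a roughly constant offset vector. Here's a minimal, hand-built toy sketch (the 4-d vectors are invented for illustration, not from a real trained model) showing how parallel difference vectors give the classic king − man + woman ≈ queen arithmetic:

```python
import numpy as np

# Hypothetical toy "embeddings"; dimensions loosely stand for
# [royalty, gender, person-ness, plurality]. Real embeddings are
# learned, high-dimensional, and noisier than this.
emb = {
    "king":  np.array([0.9,  0.8, 1.0, 0.0]),
    "queen": np.array([0.9, -0.8, 1.0, 0.0]),
    "man":   np.array([0.1,  0.8, 1.0, 0.0]),
    "woman": np.array([0.1, -0.8, 1.0, 0.0]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# The "gender direction" recovered from two different word pairs
# points the same way -- the direction itself carries the concept.
d1 = emb["king"] - emb["queen"]
d2 = emb["man"] - emb["woman"]
print(cosine(d1, d2))

# Analogy arithmetic: king - man + woman lands near queen.
analogy = emb["king"] - emb["man"] + emb["woman"]
print(cosine(analogy, emb["queen"]))
```

In a real model (word2vec, GloVe) the offsets are only approximately parallel, which is why analogy retrieval uses nearest-neighbor search over cosine similarity rather than exact equality.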
[Attenborough voice] To disguise themselves, the young zoomers douse themselves in the sugar glazings of a Krispy Kreme. This confuses and dismays the jaguar.