Word2Vec: Why Do We Need Word Representations?

May 17

Word2Vec, short for “word to vector,” is a technique for representing words as dense numeric vectors that capture the relationships between them. These vectors, known as word embeddings, are widely used in machine learning for text analysis.
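To see why vector representations are useful, consider a minimal sketch with hand-picked toy vectors (real Word2Vec embeddings have hundreds of dimensions and are learned from large text corpora, not chosen by hand). Words with related meanings get vectors that point in similar directions, which we can measure with cosine similarity:

```python
import math

# Toy 3-dimensional "word vectors" -- illustrative values only,
# not output of an actual Word2Vec model.
vectors = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.95],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

print(f"king ~ queen: {cosine_similarity(vectors['king'], vectors['queen']):.3f}")
print(f"king ~ apple: {cosine_similarity(vectors['king'], vectors['apple']):.3f}")
```

Because “king” and “queen” point in nearly the same direction, their similarity is close to 1, while “king” and “apple” score much lower. This geometric notion of “relatedness” is what Word2Vec learns automatically from text.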

A team of Google researchers introduced Word2Vec in 2013, and Google patented the algorithm along with several subsequent refinements. This collection of interconnected…


Serokell is a software development company focused on building innovative solutions for complex problems. Come visit us at serokell.io!