Distributed Representations of Words
Get the word vectors of a word2vec model
Get document vectors based on a word2vec model
Prediction functionality for a word2vec model
Read a binary word2vec model from disk
Read word vectors from a word2vec model from disk
Clean text specifically for input to word2vec
Train a word2vec model on text
Similarity between word vectors as used in word2vec
Save a word2vec model to disk
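The similarity topic above compares word vectors. As a rough, package-independent sketch of the underlying measure (cosine similarity is one common choice; implementations may also offer plain dot products), the computation can be written as:

```python
from math import sqrt

def cosine_similarity(a, b):
    """Cosine of the angle between two word vectors.

    Returns a value in [-1, 1]; 1 means the vectors point the
    same way, 0 means they are orthogonal.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)
```

Nearest-neighbour lookups and the well-known analogy arithmetic (comparing vec("king") - vec("man") + vec("woman") against every word vector) rank candidates by exactly this kind of measure.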
Learn vector representations of words using the continuous bag-of-words and skip-gram implementations of the 'word2vec' algorithm. The techniques are detailed in the paper "Distributed Representations of Words and Phrases and their Compositionality" by Mikolov et al. (2013), available at <arXiv:1310.4546>.
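The description names the continuous bag-of-words and skip-gram architectures. As an illustration only, not the optimized implementation behind the topics above, a bare-bones skip-gram model with negative sampling (the variant emphasized in Mikolov et al., 2013) can be sketched in pure Python; the function name `train_skipgram` and all hyperparameter defaults here are invented for this example:

```python
import math
import random

def train_skipgram(sentences, dim=20, window=2, epochs=50,
                   lr=0.05, negatives=3, seed=1):
    """Toy skip-gram with negative sampling; returns {word: vector}."""
    rng = random.Random(seed)
    vocab = sorted({w for sent in sentences for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    # "Input" vectors (the embeddings we return) get a small random
    # init; "output" vectors start at zero, as in the original tool.
    w_in = [[(rng.random() - 0.5) / dim for _ in range(dim)] for _ in vocab]
    w_out = [[0.0] * dim for _ in vocab]

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, x))))

    for _ in range(epochs):
        for sent in sentences:
            ids = [idx[w] for w in sent]
            for pos, center in enumerate(ids):
                lo = max(0, pos - window)
                hi = min(len(ids), pos + window + 1)
                for j in range(lo, hi):
                    if j == pos:
                        continue
                    # One observed (positive) pair plus a few uniformly
                    # sampled negatives (real implementations sample by
                    # a smoothed unigram distribution instead).
                    pairs = [(ids[j], 1.0)] + [
                        (rng.randrange(len(vocab)), 0.0)
                        for _ in range(negatives)
                    ]
                    v = w_in[center]
                    grad_v = [0.0] * dim
                    for out, label in pairs:
                        u = w_out[out]
                        score = sum(a * b for a, b in zip(v, u))
                        g = lr * (label - sigmoid(score))
                        for k in range(dim):
                            grad_v[k] += g * u[k]   # uses u before update
                            u[k] += g * v[k]
                    for k in range(dim):
                        v[k] += grad_v[k]
    return {w: w_in[idx[w]] for w in vocab}
```

A typical call trains on a list of pre-tokenized sentences, e.g. `train_skipgram([["the", "cat", "sat"], ["the", "dog", "sat"]])`, after which words sharing contexts tend to end up with similar vectors. The production implementations documented above additionally use subsampling of frequent words, a decaying learning rate, and multithreaded C code.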