This is a bare-bones implementation of the word2vec algorithm proposed by Mikolov et al. (2013), without any of the usual optimization techniques (such as Hierarchical Softmax or Negative Sampling). It is implemented in plain NumPy, with both the forward and the backward passes computed manually. The original paper proposed two architectures for learning word embeddings: Continuous Bag-of-Words (CBOW) and Skip-Gram. This is an implementation of the latter.
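As a rough illustration of what one such manual forward/backward step might look like, here is a minimal NumPy sketch of full-softmax Skip-Gram training on a single (center, context) pair. All names, shapes, and hyperparameters below are illustrative assumptions, not the repository's actual code:

```python
import numpy as np

rng = np.random.default_rng(0)

V, D = 10, 4  # assumed vocabulary size and embedding dimension
W_in = rng.normal(scale=0.1, size=(V, D))   # input (center word) embeddings
W_out = rng.normal(scale=0.1, size=(D, V))  # output (context prediction) weights

def softmax(z):
    # Subtract the max for numerical stability before exponentiating
    e = np.exp(z - z.max())
    return e / e.sum()

def skipgram_step(W_in, W_out, center, context, lr=0.1):
    """One manual forward/backward pass for a single (center, context) pair.

    Updates W_in and W_out in place and returns the cross-entropy loss.
    """
    # Forward: the hidden layer is simply the center word's embedding row
    h = W_in[center]
    scores = h @ W_out          # unnormalized scores over the vocabulary
    probs = softmax(scores)     # predicted distribution over context words
    loss = -np.log(probs[context])

    # Backward: gradient of cross-entropy w.r.t. scores is (probs - one_hot)
    d_scores = probs.copy()
    d_scores[context] -= 1.0
    d_W_out = np.outer(h, d_scores)  # gradient for the output weights
    d_h = W_out @ d_scores           # gradient flowing back to the embedding

    # In-place SGD updates (augmented assignment mutates the arrays)
    W_out -= lr * d_W_out
    W_in[center] -= lr * d_h
    return loss
```

In a full training loop one would slide a context window over a corpus and call `skipgram_step` for every (center, context) pair; repeated steps on the same pair drive its loss toward zero.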
andreashhpetersen/word2vec