Title: Review on Word2Vec Word Embedding Neural Net
Abstract: The word2vec model supports a range of useful applications across NLP tasks. The semantic information captured in its vector representations of words has proven useful in machine-learning text classification, and the vectors are also employed in analogy detection and in syntactic and semantic analysis of words. Word2vec comes in two flavors, CBOW and Skip-Gram: given a context, CBOW predicts the word, while Skip-Gram does the reverse, predicting the context from a word. To improve training efficiency, two computational techniques were introduced, namely hierarchical softmax and negative sampling. This review focuses on introducing the models, the computational techniques, and the various fields in which word2vec is applied. Word2vec is compared on standard metrics, and its performance is evaluated against other existing models.
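The Skip-Gram training scheme with negative sampling described in the abstract can be illustrated with a minimal sketch. The toy corpus, window size, embedding dimension, and learning rate below are illustrative assumptions, not taken from the reviewed paper: each (center, context) pair is pushed together, while a few randomly drawn "negative" words are pushed apart, avoiding a full softmax over the vocabulary.

```python
import numpy as np

# Toy corpus and vocabulary (illustrative assumption, not from the paper).
corpus = ["the cat sat on the mat".split(), "the dog sat on the log".split()]
vocab = sorted({w for s in corpus for w in s})
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 16                 # vocabulary size, embedding dimension

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))     # "input" (center-word) embeddings
W_out = rng.normal(0, 0.1, (V, D))    # "output" (context-word) embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_pair(center, context, k=5, lr=0.05):
    """One SGNS update: one positive (center, context) pair plus k negatives."""
    targets = [idx[context]] + list(rng.integers(0, V, size=k))
    labels = [1.0] + [0.0] * k        # 1 = real context word, 0 = negative sample
    c = idx[center]
    for t, y in zip(targets, labels):
        score = sigmoid(W_in[c] @ W_out[t])
        grad = score - y              # gradient of the logistic loss w.r.t. score
        g_out = grad * W_in[c]        # cache gradients before updating either matrix
        g_in = grad * W_out[t]
        W_out[t] -= lr * g_out
        W_in[c] -= lr * g_in

# Slide a window of radius 2 over each sentence; each pair is one update.
for epoch in range(50):
    for sent in corpus:
        for i, center in enumerate(sent):
            for j in range(max(0, i - 2), min(len(sent), i + 3)):
                if j != i:
                    train_pair(center, sent[j])
```

After training, each row of `W_in` is the learned embedding for one vocabulary word; in practice, libraries such as gensim implement the same scheme with frequency-weighted negative sampling and subsampling of frequent words.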
Publication Year: 2020
Publication Date: 2020-09-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 35