Title: Context Representation with Word Embeddings for WSD
Abstract: Word embeddings obtained from recently developed neural language models can capture the semantic and syntactic behavior of words and effectively reveal relationships between them. Such word embeddings have been shown to be useful for various NLP tasks. In this paper, we develop a supervised method for word sense disambiguation (WSD) that employs word embeddings as local context features. Our experiments demonstrate the usefulness of word embeddings for the WSD task. We also compare methods with different vector representations and examine their effects on WSD performance.
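The following is a minimal sketch (not the authors' implementation) of the general idea of using word embeddings as local context features for supervised WSD: embeddings of words in a window around the ambiguous target are concatenated into a feature vector and fed to a per-lemma classifier. The `embeddings` dictionary, window size, and toy sense labels are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

DIM = 4  # toy embedding dimensionality for the example

# Hypothetical pre-trained embeddings (in practice, e.g. word2vec or GloVe vectors).
embeddings = {
    "deposit": np.array([0.9, 0.1, 0.0, 0.2]),
    "money":   np.array([0.8, 0.2, 0.1, 0.0]),
    "river":   np.array([0.0, 0.9, 0.8, 0.1]),
    "walked":  np.array([0.1, 0.3, 0.7, 0.5]),
    "along":   np.array([0.2, 0.4, 0.6, 0.3]),
    "the":     np.array([0.3, 0.3, 0.3, 0.3]),
}

def context_features(tokens, target_idx, window=2):
    """Concatenate embeddings of words in a fixed window around the target,
    zero-padding positions outside the sentence or missing from the vocabulary."""
    feats = []
    for offset in range(-window, window + 1):
        if offset == 0:
            continue  # skip the target word itself; only the local context is used
        pos = target_idx + offset
        if 0 <= pos < len(tokens):
            feats.append(embeddings.get(tokens[pos], np.zeros(DIM)))
        else:
            feats.append(np.zeros(DIM))
    return np.concatenate(feats)

# Toy sense-annotated instances for the ambiguous word "bank":
# (tokenized sentence, index of the target word, gold sense label).
train = [
    (["deposit", "money", "in", "the", "bank"], 4, "bank_financial"),
    (["walked", "along", "the", "river", "bank"], 4, "bank_river"),
]

X = np.array([context_features(toks, idx) for toks, idx, _ in train])
y = [label for _, _, label in train]

# One classifier per target lemma; any supervised learner could be substituted.
clf = LogisticRegression(max_iter=1000).fit(X, y)
print(clf.predict([context_features(["the", "river", "bank"], 2)]))
```

In practice the embeddings would come from a large pre-trained model and the classifier would be trained on a sense-annotated corpus; the sketch only illustrates how local context can be encoded with dense vectors instead of sparse surface features.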
Publication Year: 2016
Publication Date: 2016-01-01
Language: en
Type: book-chapter
Indexed In: ['crossref']
Access and Citation
Cited By Count: 20