Title: Word-Sense Disambiguation using maximum entropy model
Abstract: Natural languages are typically replete with homographs, words which have more than one meaning. Consequently, machine understanding of natural language sentences sometimes suffers from certain ambiguities in getting the correct sense of a word in a given sentence. In this work we present a trainable model for word sense disambiguation (WSD) for resolving this ambiguity. The proposed model applies concepts of information theory to find the appropriate sense of a word when the context is known. Given a training text tagged with the correct senses of a particular word, our model learns to classify each occurrence of the target word with its correct sense in the unseen text.
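The abstract describes a maximum entropy (multinomial logistic) classifier trained on sense-tagged occurrences of a target word, using the surrounding context to pick a sense. The following is a minimal sketch of that general approach, not the paper's actual implementation: the toy corpus, the binary context-word features, and the plain gradient-ascent training loop are all illustrative assumptions.

```python
import math
from collections import defaultdict

# Hypothetical sense-tagged training data for the homograph "bank":
# each item is (context words, correct sense). Purely illustrative.
TRAIN = [
    (["river", "water", "fish"], "shore"),
    (["money", "loan", "deposit"], "finance"),
    (["water", "mud", "shore"], "shore"),
    (["account", "money", "interest"], "finance"),
]

def train_maxent(data, epochs=200, lr=0.5):
    """Fit a maximum entropy classifier with binary context-word features
    by gradient ascent on the conditional log-likelihood."""
    senses = sorted({s for _, s in data})
    weights = defaultdict(float)  # (sense, context word) -> feature weight

    def probs(ctx):
        # p(sense | context) via a softmax over summed feature weights
        scores = {s: sum(weights[(s, w)] for w in ctx) for s in senses}
        m = max(scores.values())
        exps = {s: math.exp(v - m) for s, v in scores.items()}
        z = sum(exps.values())
        return {s: e / z for s, e in exps.items()}

    for _ in range(epochs):
        for ctx, gold in data:
            p = probs(ctx)
            for s in senses:
                # gradient: empirical feature count minus expected count
                grad = (1.0 if s == gold else 0.0) - p[s]
                for w in ctx:
                    weights[(s, w)] += lr * grad
    return probs

predict_probs = train_maxent(TRAIN)

def disambiguate(ctx):
    """Return the most probable sense of the target word given its context."""
    p = predict_probs(ctx)
    return max(p, key=p.get)

print(disambiguate(["river", "water"]))    # expected: shore
print(disambiguate(["money", "deposit"]))  # expected: finance
```

At prediction time, unseen context words simply contribute zero weight, so the classifier falls back on whichever context words it has seen during training.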
Publication Year: 2009
Publication Date: 2009-12-01
Language: en
Type: article
Indexed In: ['crossref']
Access and Citation
Cited By Count: 5