Title: Representation of Mutual Information Via Input Estimates
Abstract: A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information with the minimum mean-square error. This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
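Illustrative note (not from the paper itself): the Gaussian-channel relationship cited in the abstract is the I-MMSE formula, d/d(snr) I(X; sqrt(snr) X + N) = MMSE(snr) / 2 with mutual information in nats. The sketch below is a minimal Monte Carlo check of that identity, assuming an equiprobable BPSK input X in {-1, +1}, unit-variance Gaussian noise, NumPy, and a central finite difference over common random samples; it does not implement the paper's general-channel representation via conditional marginal input distributions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500_000
x = rng.choice([-1.0, 1.0], size=n)        # equiprobable BPSK input
noise = rng.standard_normal(n)             # unit-variance Gaussian noise, reused across snr values

def mi_and_mmse(snr):
    """Monte Carlo estimates of I(X;Y) in nats and the MMSE of estimating X
    from Y = sqrt(snr) * X + N, using the common samples drawn above."""
    y = np.sqrt(snr) * x + noise
    # Posterior P(X = +1 | Y = y) for equiprobable BPSK over AWGN
    p = 1.0 / (1.0 + np.exp(-2.0 * np.sqrt(snr) * y))
    p = np.clip(p, 1e-12, 1.0 - 1e-12)
    h_cond = -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))  # H(X | Y = y), nats
    mi = np.log(2.0) - h_cond.mean()
    xhat = np.tanh(np.sqrt(snr) * y)       # conditional mean estimate E[X | Y = y]
    mmse = np.mean((x - xhat) ** 2)
    return mi, mmse

snr, d = 1.0, 1e-3
i_plus, _ = mi_and_mmse(snr + d)
i_minus, _ = mi_and_mmse(snr - d)
_, mmse = mi_and_mmse(snr)
print("central-difference dI/dsnr:", (i_plus - i_minus) / (2.0 * d))
print("MMSE / 2                  :", mmse / 2.0)
```

Reusing the same input and noise samples at snr - d and snr + d keeps the finite-difference estimate of dI/dsnr from being swamped by Monte Carlo noise; the two printed values should agree to within sampling error.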
Publication Year: 2007
Publication Date: 2007-01-23
Language: en
Type: article
Indexed In: Crossref
Access and Citation
Cited By Count: 74