Abstract: We study the continuity of the generalized entropy as a functional of the probability distribution, defined with respect to an action space and a loss function. Upper and lower bounds on the entropy difference of two distributions are derived in terms of several commonly used f-divergences, the Wasserstein distance, and a distance that depends on the action space and the loss function. Among these bounds, we note a connection between the continuity of the Shannon/differential entropy in the distribution under KL divergence and the continuity of the Rényi entropy in the entropy order. For information-theoretic applications, we derive several new mutual information upper bounds based on the entropy difference bounds. The general results may find broader applications in estimation theory, statistical learning theory, and control theory.
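To make the quantities in the abstract concrete, the following is a minimal numerical sketch (not taken from the paper) that computes the Shannon entropy difference of two discrete distributions alongside their KL divergence, the two objects the paper's continuity bounds relate:

```python
import math

def entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q); assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two nearby distributions on a three-point alphabet (illustrative values).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

entropy_gap = abs(entropy(p) - entropy(q))
dkl = kl_divergence(p, q)
print(entropy_gap, dkl)
```

As the distributions approach each other, both the entropy gap and the divergence shrink; the paper's contribution is to bound the former explicitly in terms of divergences such as the latter.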
Publication Date: 2020-06-01
Language: en
Type: article
Indexed In: Crossref