Title: Topics in Information Theory and Machine Learning
Abstract: Information theory underlies the design of codes. Claude E. Shannon founded the field with his seminal 1948 article, in which he defined a measure of information: the entropy. In this chapter, we introduce essential concepts in information theory: entropy, optimal coding, cross entropy, and perplexity. Entropy is a versatile measure of the average information content of symbol sequences, and we will explore how it can help us design efficient encodings.
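As a minimal illustration of the concepts the abstract names (a sketch, not code from the chapter itself), the empirical entropy of a symbol sequence and the corresponding perplexity can be computed as:

```python
from collections import Counter
from math import log2

def entropy(sequence):
    """Shannon entropy, in bits per symbol, of the empirical
    distribution of symbols in the sequence."""
    counts = Counter(sequence)
    total = len(sequence)
    # H = -sum_i p_i * log2(p_i), summed over observed symbols
    return -sum((c / total) * log2(c / total) for c in counts.values())

h = entropy("abracadabra")
perplexity = 2 ** h  # effective alphabet size under this distribution
```

Entropy also lower-bounds the average code length of any lossless encoding, which is why it guides the design of efficient codes.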
Publication Year: 2014
Publication Date: 2014-01-01
Language: en
Type: book-chapter
Indexed In: ['crossref']
Access and Citation
Cited By Count: 1