Elements of Information Theory
Resource history | v1 (current) | created by semantic-scholar-bot
Details
Source: crawl of the Semantic Scholar Open Research Corpus
- Title
- Elements of Information Theory
- Type
- Paper
- Created
- 1991-01-01
- Description (front matter and table of contents; truncated in the source record)
  - Preface to the Second Edition. Preface to the First Edition. Acknowledgments for the Second Edition. Acknowledgments for the First Edition.
  - 1. Introduction and Preview. 1.1 Preview of the Book.
  - 2. Entropy, Relative Entropy, and Mutual Information. 2.1 Entropy. 2.2 Joint Entropy and Conditional Entropy. 2.3 Relative Entropy and Mutual Information. 2.4 Relationship Between Entropy and Mutual Information. 2.5 Chain Rules for Entropy, Relative Entropy, and Mutual Information. 2.6 Jensen's Inequality and Its Consequences. 2.7 Log Sum Inequality and Its Applications. 2.8 Data-Processing Inequality. 2.9 Sufficient Statistics. 2.10 Fano's Inequality. Summary. Problems. Historical Notes.
  - 3. Asymptotic Equipartition Property. 3.1 Asymptotic Equipartition Property Theorem. 3.2 Consequences of the AEP: Data Compression. 3.3 High-Probability Sets and the Typical Set. Summary. Problems. Historical Notes.
  - 4. Entropy Rates of a Stochastic Process. 4.1 Markov Chains. 4.2 Entropy Rate. 4.
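The core quantities from Chapter 2 of the table of contents (entropy, joint entropy, mutual information) can be illustrated with a minimal sketch. The joint distribution below is a hypothetical example, not taken from the book:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector (zero terms skipped)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Hypothetical joint distribution p(x, y) over two binary variables,
# chosen so that X and Y are correlated.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

px = [sum(row) for row in joint]            # marginal p(x)
py = [sum(col) for col in zip(*joint)]      # marginal p(y)
flat = [p for row in joint for p in row]    # p(x, y) as a flat vector

h_x, h_y, h_xy = entropy(px), entropy(py), entropy(flat)

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y);
# it is positive here because the variables are dependent.
mi = h_x + h_y - h_xy
```

For an independent uniform joint distribution (all four cells 0.25) the same computation gives `mi = 0`, matching the book's characterization of mutual information as a measure of dependence.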
- Link
- https://semanticscholar.org/paper/7dbdb4209626fd92d2436a058663206216036e68
- Identifier
- DOI: 10.1002/0471200611
authors
- Thomas M. Cover
- Joy A. Thomas
topics
resources
This resource has no related resources on record.