Information theory



Details

Title
Information theory
Description
Information theory is the scientific study of the quantification, storage, and communication of digital information. The field was fundamentally established by the works of Harry Nyquist and Ralph Hartley in the 1920s and of Claude Shannon in the 1940s. It lies at the intersection of probability theory, statistics, computer science, statistical mechanics, information engineering, and electrical engineering. A key measure in information theory is entropy, which quantifies the amount of uncertainty involved in the value of a random variable or the outcome of a random process. For example, identifying the outcome of a fair coin flip (with two equally likely outcomes) provides less information (lower entropy) than identifying the outcome of a die roll (with six equally likely outcomes); the sketch below makes this comparison concrete. Other important measures in information theory include mutual information, channel capacity, error exponents, and relative entropy.
Link
https://en.wikipedia.org/?curid=14773
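
To make the coin-versus-die comparison concrete, here is a minimal Python sketch of Shannon entropy, H(X) = -Σ p(x) log₂ p(x), measured in bits. The entropy helper below is purely illustrative and not tied to any library mentioned on this page.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy H(X) = -sum(p * log2(p)), in bits."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin has two equally likely outcomes -> exactly 1 bit.
coin = [1/2, 1/2]
# A fair die has six equally likely outcomes -> log2(6) ≈ 2.585 bits.
die = [1/6] * 6

print(f"coin: {entropy(coin):.3f} bits")  # coin: 1.000 bits
print(f"die:  {entropy(die):.3f} bits")   # die:  2.585 bits
```

The die's higher entropy reflects greater uncertainty: with more equally likely outcomes, learning which one occurred conveys more information.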

resources

treated in Elements of Information Theory
has official A mathematical theory of communication

authors

No related authors have been attached to this topic.

topics

important for | used by Data compression
created by Claude Shannon