Tuesday, 25 August 2015

Information Theory & Coding

Information Theory (I.T.) was born in the context of the statistical theory of communications. Its methods, mainly mathematical, are useful for evaluating the performance of a digital communications system. It deals with the most fundamental aspects of communications systems, starting only from probabilistic models of the physical sources and channels.
- 1928, Hartley: first attempt at a scientific definition of a quantitative measure of information.
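
Hartley's measure can be stated in one line: a message of n symbols, each drawn from an alphabet of size s, carries information equal to the logarithm of the number of possible messages (a standard statement of his 1928 result, paraphrased here):

    H = \log s^{n} = n \log s
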
Claude Elwood Shannon, the father of information theory
- 1948, Claude Shannon: introduced the new concept of a "quantitative measure of information" in a mathematical way, and deduced its main consequences about the fundamental limits on compressing and reliably communicating data: the real beginning of information theory. For this he is called the father of information theory.
Initial paper: Claude E. Shannon, A Mathematical Theory of Communication.

In Shannon's revolutionary and groundbreaking paper, the work for which had been substantially completed at Bell Labs by the end of 1944, Shannon for the first time introduced the qualitative and quantitative model of communication as a statistical process underlying information theory, opening with the assertion that

"The fundamental problem of communication is that of reproducing at one point, either exactly or approximately, a message selected at another point."
With it came the ideas of:

1. The information entropy and redundancy of a source, and its relevance through the source coding theorem (see the Python sketch after this list).

2. The mutual information and the channel capacity of a noisy channel, including the promise of perfect loss-free communication given by the noisy-channel coding theorem (illustrated for the binary symmetric channel below).

3. The practical result of the Shannon–Hartley law for the channel capacity of a Gaussian channel (stated and worked through below).
4. The bit, a new way of seeing the most fundamental unit of information.
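
To make the entropy idea concrete, here is a minimal Python sketch; the function name and the coin examples are our illustration, not anything from Shannon's paper:

    import math

    def entropy(probs):
        # Shannon entropy H(X) = -sum(p * log2(p)), measured in bits.
        # Terms with p = 0 contribute nothing, so they are skipped.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))  # 1.0: a fair coin carries exactly one bit per toss
    print(entropy([0.9, 0.1]))  # ~0.47: a biased coin is predictable, hence redundant

The second result is what the source coding theorem exploits: a source with entropy H can be compressed to about H bits per symbol, and no further.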

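In the same spirit, the channel capacity of the simplest noisy channel, a binary symmetric channel that flips each bit with probability p, works out to C = 1 - H(p); this textbook special case (not spelled out in the post above) fits in a few lines:

    import math

    def bsc_capacity(p):
        # Capacity of a binary symmetric channel in bits per channel use:
        # C = 1 - H(p), where H(p) is the binary entropy of the flip probability.
        if p in (0.0, 1.0):
            return 1.0  # a channel that never (or always) flips is noiseless
        h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
        return 1.0 - h

    print(bsc_capacity(0.0))   # 1.0: every transmitted bit arrives intact
    print(bsc_capacity(0.11))  # ~0.5: only about half a bit per use gets through

The noisy-channel coding theorem says rates below this capacity are achievable with vanishing error probability, which is the "perfect loss-free communication" promised in item 2.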

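Item 3 is compact enough to state and check by hand. For a channel of bandwidth B hertz with signal-to-noise power ratio S/N, the Shannon–Hartley law gives the capacity in bits per second as

    C = B \log_2\left(1 + \frac{S}{N}\right)

For example (our numbers, chosen for illustration): a 3000 Hz telephone-grade channel with a 30 dB signal-to-noise ratio (S/N = 1000) has C = 3000 · log2(1001) ≈ 29,900 bit/s, i.e. roughly 30 kbit/s.
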
For more study, click on the download link below.

        Download
