In a famous 1948 paper, Claude Shannon
defined the information communicated to a receiver as the reduction in the
receiver's uncertainty about the possible states of the world. He showed how
to compute this uncertainty (entropy) before and after the communication: the
information communicated is the entropy before the communication minus the
entropy afterwards. This definition of information has become the foundation
of a vast theoretical and practical enterprise in physics and engineering.
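The definition above can be sketched numerically. This is a minimal illustration, not from the source: the probabilities and the scenario (four equally likely states, a message ruling out two of them) are hypothetical, chosen only to show entropy-before minus entropy-after.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical example: the receiver initially considers 4 states
# equally likely, so uncertainty is 2 bits.
prior = [0.25, 0.25, 0.25, 0.25]

# After the message, two states are ruled out; the remaining two
# stay equally likely, so uncertainty is 1 bit.
posterior = [0.5, 0.5]

# Information communicated = entropy before - entropy after.
info = entropy(prior) - entropy(posterior)
print(info)  # 1.0 bit
```

Measured this way, a message that halves the number of equally likely possibilities always conveys exactly one bit.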