Saturday, March 2, 2013

Claude E. Shannon

The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages. The system must be designed to operate for each possible selection, not just the one which will actually be chosen since this is unknown at the time of design.
By a communication system we will mean a system of the type indicated schematically in Fig. 1. It consists of essentially five parts:
  1. An information source which produces a message or sequence of messages (letters, functions of time and space, etc.) to be communicated to the receiving terminal.
  2. A transmitter which operates on the message in some way (change, encode, sample, compress, quantize, interleave, construct, etc.) to produce a signal suitable for transmission.
  3. The channel is merely the medium used to transmit the signal from transmitter to receiver (a pair of wires, a coaxial cable, a band of radio frequencies, a beam of light, etc.).
  4. The receiver ordinarily performs the inverse operation of that done by the transmitter, reconstructing the message from the signal.
  5. The destination is the person (or thing) for whom the message is intended.
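The five parts above can be sketched as a minimal pipeline. This is a toy illustration, not anything from Shannon's paper: the repetition code and the bit-flip channel model are assumptions chosen only to make each of the five roles concrete.

```python
import random

# 1. Information source: one message selected from a set of possible messages.
message = "HELLO"

# 2. Transmitter: operate on the message to produce a signal suitable for
#    transmission (here, ASCII bits protected by a 3x repetition code --
#    an illustrative choice, not Shannon's).
def transmit(msg):
    bits = [int(b) for ch in msg for b in format(ord(ch), "08b")]
    return [b for b in bits for _ in range(3)]  # repeat each bit 3 times

# 3. Channel: the medium; modeled as a binary symmetric channel that
#    flips each bit independently with probability p.
def channel(signal, p=0.02, seed=0):
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in signal]

# 4. Receiver: invert the transmitter's operation -- majority-vote each
#    triple of bits, then reassemble the characters of the message.
def receive(signal):
    bits = [int(sum(signal[i:i + 3]) >= 2) for i in range(0, len(signal), 3)]
    return "".join(chr(int("".join(map(str, bits[i:i + 8])), 2))
                   for i in range(0, len(bits), 8))

# 5. Destination: whoever (or whatever) the reconstructed message is for.
received = receive(channel(transmit(message)))
```

The repetition code lets the receiver correct any single flipped bit per triple, which is the simplest way to see why the transmitter/receiver pair is designed around the whole set of possible messages and channel behaviors, not one particular message.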
The full article

3 comments:

  1. A Mathematical Theory of Communication

    by C. E. Shannon

    http://cm.bell-labs.com/cm/ms/what/shannonday/shannon1948.pdf

    (Reprinted with corrections from The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July, October, 1948.)

  2. In 1948, Shannon wrote the following:

    If the number of messages in the set is finite then this number or any monotonic function of this number can be regarded as a measure of the information produced when one message is chosen from the set, all choices being equally likely.
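For N equally likely messages, the logarithm is the monotonic function Shannon settles on, because it makes information additive over independent choices. A quick numeric sketch (the set size 64 is an arbitrary assumption):

```python
import math

N = 64                # size of a finite set of equally likely messages
bits = math.log2(N)   # information produced by one selection, in bits

# The logarithm makes information additive: two independent selections
# from the set amount to one selection from a set of N * N messages.
assert math.log2(N * N) == 2 * bits
```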

    In the case of a discrete source of information we were able to determine a definite rate of generating information, namely the entropy of the underlying stochastic process. With a continuous source the situation is considerably more involved. In the first place a continuously variable quantity can assume an infinite number of values and requires, therefore, an infinite number of binary digits for exact specification. This means that to transmit the output of a continuous source with exact recovery at the receiving point requires, in general, a channel of infinite capacity (in bits per second). Since, ordinarily, channels have a certain amount of noise, and therefore a finite capacity, exact transmission is impossible.
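The infinite-precision point can be made concrete: quantizing a continuous value to n binary digits bounds the error by 2^-n, and no finite n reproduces the value exactly. A small sketch under assumed values (the sample and the precisions are arbitrary):

```python
x = 0.7234567891234567  # one sample from a continuous source, in [0, 1)

def quantize(value, n):
    """Represent value in [0, 1) with n binary digits; error <= 2**-n."""
    return round(value * 2**n) / 2**n

for n in (4, 8, 16, 32):
    err = abs(x - quantize(x, n))
    assert err <= 2**-n  # more bits per sample, smaller error -- but
                         # exact recovery would need infinitely many bits
```

Since a noisy channel has finite capacity in bits per second, only an approximation of the continuous source's output can ever be delivered, which is exactly the situation the quoted passage describes.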

  3. The Information

    by James Gleick

    http://www.kushima.org/is/?p=1921

    http://www.kushima.org/is/?p=1553
