
In the Beginning was Information

A1 The Statistical View of Information

A1.1 Shannon's Theory of Information

Claude E. Shannon (born 1916), in his well-known paper "A Mathematical Theory of Communication" [S7, 1948], was the first person to formulate a mathematical definition of information. His measure of information, the "bit" (binary digit), had the advantage that quantitative properties of strings of symbols could be formulated. The disadvantage is just as plain: Shannon's definition of information covers only one minor aspect of the nature of information, as we will discuss at length. This special aspect is of value only for purposes of transmission and storage. Questions of meaning, comprehensibility, correctness, worth, or worthlessness are not considered at all. The important questions about the origin of the information (the sender) and for whom it is intended (the recipient) are also ignored. For Shannon's concept of information it is completely immaterial whether a sequence of symbols represents an extremely important and meaningful text or whether it was produced by a random process. It may sound paradoxical, but in this theory a random sequence of symbols represents the maximum value of information content; the corresponding value for a meaningful text of the same length is smaller.

Shannon's concept: His definition of information is based on a communications problem, namely to determine the optimal transmission speed. For technical purposes the meaning and import of a message are of no concern, so these aspects were not considered. Shannon restricted himself to information that expresses something new; briefly, information content = measure of newness. Here "newness" does not refer to a new idea, a new thought, or fresh news, which would have encompassed an aspect of meaning; it concerns only the surprise effect produced by a rarely occurring symbol. Shannon regards a message as information only if it cannot be completely ascertained beforehand, so that information is a measure of the unlikeliness of an event. An extremely unlikely message is thus accorded a high information content. The news that one certain person out of two million participants has drawn the winning ticket is for him more "meaningful" than if every tenth person stood a chance, because the first event is much more improbable.

Before a discrete source of symbols (N.B.: not an information source!) delivers one symbol (Figure 31), there is a certain doubt as to which symbol a_i of the available set of symbols (e.g. an alphabet with N letters a_1, a_2, a_3, ..., a_N) it will be. After it has been delivered, the previous uncertainty is resolved. Shannon's method can thus be formulated as the degree of uncertainty that is resolved by the delivery of a symbol.
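The excerpt breaks off just as the quantitative definition is introduced. As a minimal sketch of the quantities described above, the short Python program below uses Shannon's standard formulas: an event with probability p carries -log2(p) bits of information, and a source emitting symbols with probabilities p_1, ..., p_N has an average information content (entropy) of H = -(p_1·log2(p_1) + ... + p_N·log2(p_N)). The lottery probabilities and the sample phrase are illustrative assumptions, not data from the book.

```python
import math
from collections import Counter

def surprisal_bits(p: float) -> float:
    """Shannon's information content of an event with probability p, in bits."""
    return -math.log2(p)

def entropy_bits(probabilities) -> float:
    """Average information content H = -sum(p * log2(p)) of a symbol source."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# The lottery example from the text: one winner among two million
# participants versus a one-in-ten chance of winning.
print(surprisal_bits(1 / 2_000_000))  # ~20.93 bits: very improbable, high information
print(surprisal_bits(1 / 10))         # ~3.32 bits: far more probable, low information

# The "paradox" described above: a source emitting 26 letters completely
# at random (uniform probabilities) attains the maximum entropy log2(26).
print(entropy_bits([1 / 26] * 26))    # ~4.70 bits per symbol

# A meaningful text, whose letter frequencies are uneven, scores lower
# on Shannon's scale despite carrying meaning.
text = "in the beginning was information"
counts = Counter(c for c in text if c.isalpha())
total = sum(counts.values())
print(entropy_bits(n / total for n in counts.values()))  # below 4.70 bits per symbol
```

Note how the purely random source attains the maximum value of about 4.70 bits per symbol, while the meaningful phrase scores lower: precisely the paradox described above, since Shannon's measure counts surprise, not meaning.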
