of uncertainty which will be resolved when the next symbol arrives. When the next symbol is a "surprise", it is accorded a greater information value than when it is expected with a definite "certainty". The reader who is mathematically inclined may be interested in the derivation of some of Shannon's basic formulas; this may contribute to a better understanding of his line of reasoning.

1. The information content of a sequence of symbols: Shannon was only interested in the probability of the appearance of the various symbols, as should now become clearer. He thus concerned himself only with the statistical dimension of information, and reduced the information concept to something without any meaning. If one assumes that the probabilities of the appearance of the various symbols are independent of one another (e.g. "q" is not necessarily followed by "u") and that all N symbols have an equal probability of appearing, then the probability of any chosen symbol x_i arriving is given by p_i = 1/N. Information content is then defined by Shannon in such a way that three conditions have to be met:

i) If there are k independent messages [21] (symbols or sequences of symbols), then the total information content is given by I_tot = I_1 + I_2 + ... + I_k. This summation condition regards information as quantifiable.

ii) The information content ascribed to a message increases when the element of surprise is greater. The surprise effect of the seldom used "z" (low probability) is greater than that of "e", which appears more frequently (high probability). It follows that the information value of a symbol x_i increases when its probability p_i decreases. This is expressed mathematically as an inverse proportion: I ~ 1/p_i.

iii) In the simplest symmetrical case, where there are only two different symbols (e.g. "0" and "1") which occur equally frequently (p_1 = 0.5 and p_2 = 0.5), the information content I of such a symbol is exactly one bit.

According to the laws of probability, the probability of two independent events (e.g. throwing two dice) is equal to the product of the single probabilities:

    p = p_1 · p_2    (1)

[21] Message: In Shannon's theory a message is not necessarily meaningful; it refers to a symbol (e.g. a letter) or a sequence of symbols (e.g. a word). In this sense the concept of a "message" is even included in the DIN standards system, where it is encoded as DIN 44 300: "Symbols and continuous functions employed for the purpose of transmission, which represent information according to known or supposed conventions."
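The three conditions above single out a logarithmic measure, I = log2(1/p_i) = -log2(p_i), since the logarithm turns the product in equation (1) into the sum required by the summation condition. As a minimal sketch of this (the function name is illustrative, not taken from the text), the conditions can be checked numerically in Python:

```python
import math

def information_content(p: float) -> float:
    """Shannon information of a symbol with probability p, in bits:
    I = log2(1/p) = -log2(p)."""
    return -math.log2(p)

# iii) Two equally frequent symbols (p = 0.5) carry exactly one bit each.
assert information_content(0.5) == 1.0

# ii) Rarer symbols carry more information: a rare "z" vs. a frequent "e".
assert information_content(0.001) > information_content(0.1)

# i) Summation condition: for independent symbols, equation (1) gives
#    p = p_1 * p_2, and the logarithm turns this product into a sum,
#    so I_tot = I_1 + I_2.
p1, p2 = 0.25, 0.125
assert math.isclose(information_content(p1 * p2),
                    information_content(p1) + information_content(p2))

# Equally probable alphabet of N symbols: p_i = 1/N, so each symbol
# carries log2(N) bits (e.g. N = 32 gives 5 bits per symbol).
N = 32
print(information_content(1 / N))  # 5.0
```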
