In the Beginning was Information

b) Expectation value of the information content of a symbol: H can also be regarded as the expectation value of the information content of a symbol arriving from a continuously transmitting source.

c) The mean decision content per symbol: H can also be regarded as the mean decision content of a symbol. It is always possible to encode the symbols transmitted by a source of messages into a sequence of binary symbols (0 and 1). If we regard the binary code of one symbol as a binary word, then H can also be interpreted as follows (note that binary words do not necessarily have the same length): it is the average word length of the code required for the source of the messages. If, for instance, we want to encode the four letters of the genetic code for a computer investigation and the storage requirements have to be minimised, then H will be lb 4 = 2 binary positions (e.g. 00 = A, 01 = C, 10 = G, and 11 = T).

The exceptional case of symbols having equal probabilities: This is an important case, namely that all N symbols of the alphabet or some other set of elements occur with the same probability p(x_i) = 1/N. To find the mean information content of a single symbol, we have to divide the right side of equation (8) by n:

H ≡ I_ave(x) ≡ i = lb N    (10)

We now formulate this statement as a special theorem:

Theorem A1: In the case of symbol sequences of equal probability (e.g. the digits generated by a random number generator) the average information content of a symbol is equal to the information content of each and every individual symbol.
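The two claims above can be checked numerically: a small sketch (the function and sequence names here are illustrative, not from the text) that computes H for the four equiprobable letters of the genetic code, confirming H = lb 4 = 2, and applies the 2-bit encoding 00 = A, 01 = C, 10 = G, 11 = T mentioned in the text.

```python
import math

def mean_information_content(probs):
    """Shannon's H = -sum p(x_i) * lb p(x_i), in bits (lb = log base 2).
    For N equally probable symbols, p = 1/N, this reduces to lb N."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 4  # the four letters of the genetic code, assumed equally probable
H = mean_information_content([1 / N] * N)
print(H)  # lb 4 = 2.0 bits per symbol

# Minimal binary encoding from the text: 00 = A, 01 = C, 10 = G, 11 = T
code = {"A": "00", "C": "01", "G": "10", "T": "11"}
sequence = "GATTACA"  # an arbitrary example sequence
encoded = "".join(code[s] for s in sequence)
print(encoded)  # 7 symbols * 2 binary positions = 14 bits
```

Since every code word has the same length here, the average word length equals H exactly, which is the equal-probability case singled out by Theorem A1.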
