
Engineering application 28.6

Information content of a binary data stream

Suppose that a computer generates a binary stream of data and that 1s and 0s occur with equal probability, that is P(0) = P(1) = 0.5. Calculate the information per binary digit generated.

Solution

Here, p = 0.5 whether the binary digit is 0 or 1, so

$$ I = -\log_2(0.5) = -\frac{\log_{10}(0.5)}{\log_{10} 2} = 1\ \text{bit} $$
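As a quick numerical check, the single-digit calculation can be reproduced in a few lines of Python; this is an illustrative sketch added here, and the helper name `information` is not part of the original text.

```python
import math

def information(p):
    """Information I = -log2(p), in bits, of an event with probability p."""
    return -math.log2(p)

# One binary digit with P(0) = P(1) = 0.5 carries 1 bit.
print(information(0.5))                      # 1.0
# The change-of-base form used in the worked solution gives the same result.
print(-math.log10(0.5) / math.log10(2))      # 1.0
```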

Engineering application 28.7

Information content of an alphabetic character stream

Suppose that a system generates a stream of upper-case alphabetic characters and that the probability of a character occurring is the same for all characters. Calculate

(a) the information associated with the character G occurring
(b) the information associated with any single character.

Solution

(a) P(G occurring) = 1/26, so

$$ I = -\log_2\left(\frac{1}{26}\right) = -\frac{\log_{10}(1/26)}{\log_{10} 2} = 4.70\ \text{bits} $$

(b) All characters are equally likely to occur and so the information associated with any single character is the same:

$$ I = -\log_2\left(\frac{1}{26}\right) = 4.70\ \text{bits} $$
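The same formula can be checked numerically for the 26 equally likely characters; a minimal sketch, not taken from the text:

```python
import math

p = 1 / 26                    # probability of any one upper-case character
I = -math.log2(p)             # information in bits
print(round(I, 2))            # 4.7, matching the 4.70 bits found above
```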

Often a series of events may occur that do not have the same probability. For example, if a stream of alphabetic characters is being generated then it is likely that some characters will occur more frequently than others and so have a higher probability associated with them. For this situation it becomes convenient to introduce the concept of average information. Given a source producing a set of events

E_1, E_2, E_3, ..., E_n

with probabilities

p_1, p_2, p_3, ..., p_n

then for a long series of events the average information per event is given by

$$ H = -\sum_{i=1}^{n} p_i \log_2 p_i \quad \text{bits} $$

H is also termed the entropy.
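A minimal sketch of the entropy formula in Python follows; the function name `entropy` and the example distributions are illustrative assumptions, not part of the text.

```python
import math

def entropy(probabilities):
    """Average information H = -sum(p_i * log2(p_i)) in bits per event.
    Terms with p_i = 0 contribute nothing and are skipped."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Equally likely binary digits: 1 bit per digit, as in Engineering application 28.6.
print(entropy([0.5, 0.5]))                # 1.0

# 26 equally likely characters: log2(26) bits, as in Engineering application 28.7.
print(round(entropy([1 / 26] * 26), 2))   # 4.7

# An unequal distribution carries less average information per event.
print(round(entropy([0.9, 0.1]), 3))      # 0.469
```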
