
28.5 Concepts from communication theory 917

Engineering application 28.8

Entropy of a signal consisting of three characters

A source produces messages consisting of three characters, A, B and C. The probabilities of each of these characters occurring are P(A) = 0.2, P(B) = 0.5, P(C) = 0.3.

Calculate the entropy of the signal.

Solution

H = −0.2 log₂(0.2) − 0.5 log₂(0.5) − 0.3 log₂(0.3) = 1.49 bits
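As a check on the arithmetic, the entropy formula H = −Σ pᵢ log₂(pᵢ) can be sketched in a few lines of Python (the code is illustrative, not part of the text; the function name `entropy` is our own):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i)).

    Terms with p_i = 0 are skipped, since p * log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Three-character source from Engineering application 28.8
H = entropy([0.2, 0.5, 0.3])
print(round(H, 2))  # 1.49
```

The same function handles a source alphabet of any size, so it can be reused for the binary examples that follow.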

Engineering application 28.9

Entropy of a binary data stream

A source generates binary digits 0, 1, with probabilities P(0) = 0.3 and P(1) = 0.7.

Calculate the entropy of the signal.

Solution

H = −0.3 log₂(0.3) − 0.7 log₂(0.7) = 0.881 bits

Note that in Engineering application 28.9, on average, each binary digit only carries 0.881 bits of information. In fact the maximum average amount of information that can be carried by a binary digit occurs when P(0) = P(1) = 0.5, as seen in Engineering application 28.6. For this case H = 1.

For any data stream, the maximum average amount of information that can be carried by a digit occurs when all digits have equal probability, that is H is maximized when p₁ = p₂ = p₃ = ··· = pₙ.

H is maximized when p₁ = p₂ = p₃ = ··· = pₙ. The maximum value of H is denoted H_max.

When the probabilities are not the same, one way of viewing the reduction in H is to think of the more likely event being allocated too much of the signalling time relative to its lower information content. It is interesting to explore the two limiting cases, that is (a) P(0) = 0, P(1) = 1; (b) P(0) = 1, P(1) = 0. In both cases it can be shown that H = 0. On examination this is reasonable, because a continuous stream of 1s does not relay any useful information to the recipient, and neither does a continuous stream of 0s.
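The behaviour of H for a binary source over the full range of P(1) can be sketched numerically (again an illustrative snippet, not from the text): H is zero at both limiting cases, reaches its maximum of 1 bit at P(0) = P(1) = 0.5, and takes the value 0.881 bits at the probabilities of Engineering application 28.9.

```python
import math

def binary_entropy(p):
    """Entropy in bits of a binary source with P(1) = p and P(0) = 1 - p."""
    h = 0.0
    for q in (p, 1 - p):
        if q > 0:  # p * log2(p) -> 0 as p -> 0
            h -= q * math.log2(q)
    return h

# Sweep across the limiting cases, the textbook example, and the maximum
for p in (0.0, 0.3, 0.5, 0.7, 1.0):
    print(p, round(binary_entropy(p), 3))
```

The symmetry binary_entropy(0.3) = binary_entropy(0.7) reflects the fact that relabelling the two symbols does not change the information content.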

The fact that some streams of symbols do not contain as much information as other streams of the same symbols leads to the concept of redundancy. This allows the efficiency with which information is being sent to be quantified and is defined as

redundancy = (maximum entropy − actual entropy) / maximum entropy

A low value of redundancy corresponds to efficient transmission of information.
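The redundancy formula can be sketched directly in Python. This assumes the standard result that, for n equally likely symbols, H_max = log₂(n); the function names are our own:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """(H_max - H) / H_max, with H_max = log2(n) for n symbols."""
    h_max = math.log2(len(probs))
    return (h_max - entropy(probs)) / h_max

# Binary stream of Engineering application 28.9: H = 0.881, H_max = 1
print(round(redundancy([0.3, 0.7]), 3))  # 0.119

# Equiprobable symbols give zero redundancy (maximally efficient)
print(redundancy([0.5, 0.5]))  # 0.0
```

For the binary stream of Engineering application 28.9, roughly 12% of the signalling capacity is "wasted" relative to an equiprobable source.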
