SARAJEVO JOURNAL OF MATHEMATICS
Vol. 3 (15) (2007), 137–143

SOME NOISELESS CODING THEOREMS OF INACCURACY MEASURE OF ORDER α AND TYPE β

M. A. K. BAIG AND RAYEES AHMAD DAR

Abstract. In this paper, we propose a parametric 'useful' mean code length which is weighted by utilities and generalizes some well-known mean code lengths available in the literature. The object of this paper is to establish some results on noiseless coding theorems for the proposed parametric 'useful' mean code length in terms of a generalized 'useful' inaccuracy measure of order α and type β.

1991 Mathematics Subject Classification. 94A17, 94A24.
Key words and phrases. Generalized inaccuracy measures, mean code word length, Hölder's inequality.

1. Introduction

Consider the model given below for a finite random experiment scheme having (x_1, x_2, \ldots, x_n) as the complete system of events, happening with respective probabilities P = (p_1, p_2, \ldots, p_n) and credited with utilities U = (u_1, u_2, \ldots, u_n), u_i > 0, i = 1, 2, \ldots, n. Denote

    χ = \begin{bmatrix} x_1 & x_2 & \ldots & x_n \\ p_1 & p_2 & \ldots & p_n \\ u_1 & u_2 & \ldots & u_n \end{bmatrix}.    (1.1)

We call (1.1) the utility information scheme.

Let Q = (q_1, q_2, \ldots, q_n) be the predicted distribution having the utility distribution U = (u_1, u_2, \ldots, u_n), u_i > 0, i = 1, 2, \ldots, n. Taneja and Tuteja [19] have suggested and characterized the 'useful' inaccuracy measure

    I(P, Q; U) = -\sum_{i=1}^{n} u_i p_i \log q_i.    (1.2)
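The measure (1.2) is straightforward to evaluate numerically. The sketch below is illustrative only, not part of the paper: the function name is ours, and the logarithm base, which the paper later ties to the code alphabet size D, is taken as 2 by default.

```python
import math

def useful_inaccuracy(p, q, u, base=2):
    """Taneja-Tuteja 'useful' inaccuracy measure, eq. (1.2):
    I(P, Q; U) = -sum_i u_i * p_i * log(q_i)."""
    return -sum(ui * pi * math.log(qi, base) for pi, qi, ui in zip(p, q, u))
```

With all utilities u_i = 1 and Q = P, this reduces to Shannon entropy; for a fair binary source it evaluates to 1 bit.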


By considering the weighted mean code word length [6]

    L(U) = \frac{\sum_{i=1}^{n} u_i p_i l_i}{\sum_{i=1}^{n} u_i p_i},    (1.3)

where l_1, l_2, \ldots, l_n are the code lengths of x_1, x_2, \ldots, x_n respectively, Taneja and Tuteja [19] derived lower and upper bounds on L(U) in terms of I(P, Q; U).

Bhatia [3] defined the 'useful' average code length of order t as

    L^t(U) = \frac{1}{t} \log_D \left[ \frac{\sum_{i=1}^{n} u_i^{t+1} p_i D^{t l_i}}{\left( \sum_{i=1}^{n} u_i p_i \right)^{t+1}} \right],    -1 < t < ∞,    (1.4)

where D is the size of the code alphabet. He also derived bounds for this length in terms of the generalized 'useful' inaccuracy measure of order α, given by

    I_α(P, Q; U) = \frac{1}{1-α} \log \left[ \frac{\sum_{i=1}^{n} u_i p_i q_i^{α-1}}{\sum_{i=1}^{n} u_i p_i} \right],    α > 0 (≠ 1),    (1.5)

under the condition

    \sum_{i=1}^{n} p_i q_i^{-1} D^{-l_i} ≤ 1.    (1.6)

Inequality (1.6) is the generalized Kraft inequality [5]; a code satisfying it is termed a personal probability code.

Longo [12], Gurdial and Pessoa [7], Autar and Khan [1], Jain and Tuteja [9], Taneja et al. [20], Hooda and Bhaker [8], Bhatia [2], and Singh, Kumar and Tuteja [18] considered the problem of 'useful' information measures and used them to study noiseless coding theorems for sources involving utilities.

In this paper, we study some coding theorems by considering a new function depending on the parameters α and β and a utility function. Our motivation for studying this function is that it generalizes some information measures already existing in the literature.
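As a numeric illustration of the weighted mean length (1.3) and the generalized Kraft condition (1.6) — the function names and the base-2 default below are ours, not the paper's:

```python
import math

def weighted_mean_length(p, u, lengths):
    """Weighted mean code word length of Guiasu and Picard, eq. (1.3)."""
    return (sum(ui * pi * li for pi, ui, li in zip(p, u, lengths))
            / sum(ui * pi for pi, ui in zip(p, u)))

def is_personal_probability_code(p, q, lengths, D=2):
    """Check the generalized Kraft inequality (1.6):
    sum_i p_i * q_i**(-1) * D**(-l_i) <= 1."""
    return sum(pi / qi * D ** (-li)
               for pi, qi, li in zip(p, q, lengths)) <= 1 + 1e-12
```

For instance, lengths (1, 2, 2) for P = Q = (0.5, 0.25, 0.25) satisfy (1.6) with equality and give a weighted mean length of 1.5 when all utilities equal 1.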


2. Coding theorems

Consider the function

    I_α^β(P, Q; U) = \frac{1}{1-α} \log \left[ \frac{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}}{\sum_{i=1}^{n} u_i p_i^β} \right],    α > 0 (≠ 1), β > 0.    (2.1)

(i) When β = 1, (2.1) reduces to the 'useful' information measure of order α due to Bhatia [3].
(ii) When β = 1 and u_i = 1 for all i = 1, 2, \ldots, n, (2.1) reduces to the inaccuracy measure given by Nath [13]; it further reduces to Renyi's [14] entropy on taking p_i = q_i for all i = 1, 2, \ldots, n.
(iii) When β = 1, u_i = 1 for all i = 1, 2, \ldots, n, and α → 1, (2.1) reduces to the measure due to Kerridge [10].
(iv) When u_i = 1 and p_i = q_i for all i = 1, 2, \ldots, n, the measure (2.1) becomes the entropy of the β-power distribution derived from P, studied by Roy [15].

We call I_α^β(P, Q; U) in (2.1) the generalized 'useful' inaccuracy measure of order α and type β.

Further, consider

    L_β^t(U) = \frac{1}{t} \log \left[ \frac{\sum_{i=1}^{n} u_i^{t+1} p_i^β D^{t l_i}}{\left( \sum_{i=1}^{n} u_i p_i^β \right)^{t+1}} \right],    -1 < t < ∞.    (2.2)

(i) For β = 1, L_β^t(U) in (2.2) reduces to the 'useful' mean length L^t(U) of the code given by Bhatia [3].
(ii) For β = 1 and u_i = 1 for all i = 1, 2, \ldots, n, L_β^t(U) reduces to the mean length given by Campbell [4].
(iii) For β = 1, u_i = 1 for all i = 1, 2, \ldots, n, and α → 1, L_β^t(U) reduces to the optimal code length identical to Shannon's [16].
(iv) For u_i = 1 for all i = 1, 2, \ldots, n, L_β^t(U) reduces to the mean length given by Khan and Haseen Ahmad [11].

Now we find bounds for L_β^t(U) in terms of I_α^β(P, Q; U) under the condition

    \sum_{i=1}^{n} p_i^β q_i^{-β} D^{-l_i} ≤ 1,    (2.3)

where D is the size of the code alphabet.
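The two definitions can be checked against their stated special cases numerically. A minimal sketch (the function names are ours; base-D logarithms with D = 2 assumed):

```python
import math

def inaccuracy_alpha_beta(p, q, u, alpha, beta, D=2):
    """Generalized 'useful' inaccuracy of order alpha and type beta, eq. (2.1)."""
    num = sum(ui * pi**beta * qi**(beta * (alpha - 1)) for pi, qi, ui in zip(p, q, u))
    den = sum(ui * pi**beta for pi, ui in zip(p, u))
    return (1 / (1 - alpha)) * math.log(num / den, D)

def length_t_beta(p, u, lengths, t, beta, D=2):
    """'Useful' mean code length of order t and type beta, eq. (2.2)."""
    num = sum(ui**(t + 1) * pi**beta * D**(t * li)
              for pi, ui, li in zip(p, u, lengths))
    den = sum(ui * pi**beta for pi, ui in zip(p, u)) ** (t + 1)
    return (1 / t) * math.log(num / den, D)
```

For example, with β = 1, u_i = 1 and p_i = q_i, inaccuracy_alpha_beta reproduces Renyi's entropy of order α, as reduction (ii) asserts; on a fair binary source both quantities equal 1 bit.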


Theorem 2.1. For every code whose lengths l_1, l_2, \ldots, l_n satisfy (2.3), the average code length satisfies

    L_β^t(U) ≥ I_α^β(P, Q; U),    (2.4)

where α = 1/(1+t); equality occurs if and only if

    l_i = -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}}.    (2.5)

Proof. By Hölder's inequality [17],

    \sum_{i=1}^{n} x_i y_i ≥ \left( \sum_{i=1}^{n} x_i^p \right)^{1/p} \left( \sum_{i=1}^{n} y_i^q \right)^{1/q},    (2.6)

where 1/p + 1/q = 1 with either p < 1 (≠ 0) and q < 0, or q < 1 (≠ 0) and p < 0, and x_i, y_i > 0, i = 1, 2, \ldots, n. Equality holds if and only if there exists a positive constant c such that

    x_i^p = c y_i^q.    (2.7)

Making the substitutions

    p = -t,    q = \frac{t}{1+t},

    x_i = u_i^{-\frac{t+1}{t}} p_i^{-\frac{β}{t}} D^{-l_i} \left( \sum_{i=1}^{n} u_i p_i^β \right)^{\frac{t+1}{t}},

    y_i = u_i^{\frac{t+1}{t}} p_i^{\frac{β(t+1)}{t}} q_i^{-β} \left( \sum_{i=1}^{n} u_i p_i^β \right)^{-\frac{t+1}{t}}

in (2.6), noting that x_i y_i = p_i^β q_i^{-β} D^{-l_i}, and using (2.3), we get

    \left[ \frac{\sum_{i=1}^{n} u_i^{t+1} p_i^β D^{t l_i}}{\left( \sum_{i=1}^{n} u_i p_i^β \right)^{t+1}} \right]^{1/t} ≥ \left[ \frac{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}}{\sum_{i=1}^{n} u_i p_i^β} \right]^{\frac{1+t}{t}},

where α = 1/(1+t). Taking logarithms of both sides with base D, we obtain (2.4). □

Theorem 2.2. For every code whose lengths l_1, l_2, \ldots, l_n satisfy (2.3), L_β^t(U) can be made to satisfy the inequality

    L_β^t(U) < I_α^β(P, Q; U) + 1.    (2.8)
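Both theorems can be sanity-checked numerically on small examples. The following self-contained sketch (all names are ours; it assumes t > 0 and the relation α = 1/(1+t)) chooses each l_i as the smallest admissible integer, confirms the generalized Kraft inequality (2.3), and checks the two-sided bound I_α^β ≤ L_β^t < I_α^β + 1:

```python
import math

def check_coding_theorems(p, q, u, t, beta, D=2):
    """Numeric check of Theorems 2.1 and 2.2 (illustrative sketch).
    Assumes t > 0; alpha is tied to t by alpha = 1/(1+t)."""
    alpha = 1 / (1 + t)
    # the two sums appearing in I_alpha^beta(P, Q; U), eq. (2.1)
    S = sum(ui * pi**beta * qi**(beta * (alpha - 1)) for pi, qi, ui in zip(p, q, u))
    T = sum(ui * pi**beta for pi, ui in zip(p, u))
    # code lengths: the unique integer in each unit interval of (2.9)
    lengths = [math.ceil(-math.log(ui * qi**(alpha * beta) / S, D))
               for qi, ui in zip(q, u)]
    # generalized Kraft inequality (2.3)
    kraft = sum(pi**beta * qi**(-beta) * D**(-li)
                for pi, qi, li in zip(p, q, lengths))
    I = (1 / (1 - alpha)) * math.log(S / T, D)   # eq. (2.1)
    num = sum(ui**(t + 1) * pi**beta * D**(t * li)
              for pi, ui, li in zip(p, u, lengths))
    L = (1 / t) * math.log(num / T**(t + 1), D)  # eq. (2.2)
    return kraft <= 1 + 1e-9 and I <= L < I + 1
```

On, for example, P = Q = (0.5, 0.25, 0.25) with utilities (1, 2, 1), t = 1 and β = 1, the check passes.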


Proof. Let l_i be the positive integer satisfying

    -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} ≤ l_i < -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} + 1.    (2.9)

Consider the interval

    δ_i = \left[ -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}},\ -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} + 1 \right]    (2.10)

of length 1. In every δ_i there is exactly one positive integer l_i such that

    0 < -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} ≤ l_i < -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} + 1.    (2.11)

We first show that the sequence {l_1, l_2, \ldots, l_n} thus defined satisfies (2.3). From the first inequality in (2.11) we have

    -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} ≤ l_i,

i.e.,

    \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} ≥ D^{-l_i}.

Multiplying both sides by p_i^β q_i^{-β} and summing over i = 1, 2, \ldots, n, we get (2.3), since αβ - β = β(α - 1). The last inequality in (2.11) gives

    l_i < -\log_D \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} + 1,

i.e.,

    D^{t l_i} < D^t \left[ \frac{u_i q_i^{αβ}}{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}} \right]^{-t}.


Multiplying both sides by \frac{u_i^{t+1} p_i^β}{\left( \sum_{i=1}^{n} u_i p_i^β \right)^{t+1}} and summing over i = 1, 2, \ldots, n, we get

    \frac{\sum_{i=1}^{n} u_i^{t+1} p_i^β D^{t l_i}}{\left( \sum_{i=1}^{n} u_i p_i^β \right)^{t+1}} < \left[ \frac{\sum_{i=1}^{n} u_i p_i^β q_i^{β(α-1)}}{\sum_{i=1}^{n} u_i p_i^β} \right]^{t+1} D^t.

Taking logarithms of both sides with base D and then dividing both sides by t, we obtain (2.8). □

References

[1] R. Autar and A. B. Khan, On generalized useful information for incomplete distribution, J. Comb. Inf. Syst. Sci., 14 (4) (1989), 187–191.
[2] P. K. Bhatia, On a generalized 'useful' inaccuracy for incomplete probability distribution, Soochow J. Math., 25 (2) (1999), 131–135.
[3] P. K. Bhatia, Useful inaccuracy of order α and 1.1 coding, Soochow J. Math., 21 (1) (1995), 81–87.
[4] L. L. Campbell, A coding theorem and Renyi's entropy, Inf. Control, 8 (1965), 423–429.
[5] A. Feinstein, Foundations of Information Theory, McGraw-Hill, New York, 1958.
[6] S. Guiasu and C. F. Picard, Borne inférieure de la longueur utile de certains codes, C. R. Acad. Sci. Paris, Sér. A–B, 273 (1971), 248–251.
[7] Gurdial and F. Pessoa, On 'useful' information of order α, J. Comb. Inf. Syst. Sci., 2 (1977), 30–35.
[8] D. S. Hooda and U. S. Bhaker, A generalized 'useful' information measure and coding theorem, Soochow J. Math., 23 (1) (1997), 53–62.
[9] P. Jain and R. K. Tuteja, On coding theorem connected with 'useful' entropy of order β, Int. J. Math. Math. Sci., 12 (1) (1989), 193–198.
[10] D. F. Kerridge, Inaccuracy and inference, J. R. Stat. Soc., Ser. B, 23 (1961), 184–194.
[11] A. B. Khan and Haseen Ahmad, Some noiseless coding theorems of entropy of order α of the power distribution P^β, Metron, 39 (3–4) (1981), 87–94.
[12] G. Longo, Quantitative-Qualitative Measure of Information, Springer-Verlag, New York, 1972.
[13] P. Nath, An axiomatic characterization of inaccuracy for discrete generalized probability distribution, Opsearch, 7 (1970), 115–133.
[14] A. Renyi, On measures of entropy and information, Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, (1961), 547–561.
[15] L. K. Roy, Comparison of Renyi's entropy of power distribution, ZAMM, 56 (1976), 217–218.
[16] C. E. Shannon, A mathematical theory of communication, Bell Syst. Tech. J., 27 (1948), 379–423, 623–656.
[17] O. Shisha, Inequalities, Academic Press, New York, 1967.
[18] R. P. Singh, R. Kumar and R. K. Tuteja, Applications of Hölder's inequality in information theory, Inf. Sci., 152 (2003), 135–154.


[19] H. C. Taneja and R. K. Tuteja, Characterization of a quantitative-qualitative measure of inaccuracy, Kybernetika, 2 (1985), 393–402.
[20] H. C. Taneja, D. S. Hooda and R. K. Tuteja, Coding theorems on a generalized 'useful' information, Soochow J. Math., 11 (1985), 123–131.

(Received: September 28, 2006)
(Revised: December 6, 2006)

Department of Statistics
University of Kashmir
Srinagar-190006
India
E-mail: baigmak@yahoo.co.in
E-mail: rayees_stats@yahoo.com
