Some Inequalities in Information Theory Using Tsallis Entropy

We present a relation between Tsallis entropy and generalized Kerridge inaccuracy, called the generalized Shannon inequality, which is a well-known generalization in information theory, and then give its application in coding theory. The objective of the paper is to establish a noiseless coding theorem for the proposed mean code length in terms of the generalized information measure of order ξ.

i.e., H_ξ(P; P) = H_ξ(P), where

\[
H_\xi(P) = \frac{1}{\xi - 1}\left(1 - \sum_{i=1}^{n} p_i^{\xi}\right), \qquad \xi > 0,\ \xi \neq 1. \tag{3}
\]

Entropy (3) was first characterized by Havrda and Charvát [1]. Later, Daróczy [2] and Behara and Nath [3] studied this entropy. Vajda [4] also characterized this entropy for finite discrete generalized probability distributions. Sharma and Mittal [5] generalized this measure to what is known as the entropy of order α and type β. Pereira and Gur Dial [6] and Gur Dial [7] also studied the Sharma–Mittal entropy, proving a generalization of the Shannon inequality and giving its applications in coding theory. Kumar and Choudhary [8] also gave its application in coding theory. Recently, Wondie and Kumar [9] gave a joint representation of Rényi's and Tsallis entropy. Tsallis [10] gave its applications in physics. For P ∈ Δ_n^* and ξ → 1, H_ξ(P) reduces to the Shannon [11] entropy. Inequality (6) has been generalized in the case of Rényi's entropy.
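As a numerical aside (our illustration, not part of the original paper), the following Python sketch evaluates entropy (3) and checks the limit just stated: as ξ → 1, the Tsallis entropy of a fixed distribution approaches its Shannon entropy (in nats). The function names and the sample distribution are our own.

```python
import math

def tsallis_entropy(p, xi):
    """Entropy (3): H_xi(P) = (1 - sum p_i**xi) / (xi - 1), xi > 0, xi != 1."""
    return (1.0 - sum(pi ** xi for pi in p)) / (xi - 1.0)

def shannon_entropy(p):
    """Shannon entropy in nats: H(P) = -sum p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
for xi in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"xi = {xi}: H_xi = {tsallis_entropy(p, xi):.6f}")
print(f"Shannon limit: {shannon_entropy(p):.6f}")  # approached as xi -> 1
```

For this distribution the printed values rise from 0.625000 (ξ = 2) toward the Shannon entropy 1.039721 as ξ → 1, consistent with the stated reduction.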

Mean Codeword Length and Its Bounds
We will now give an application of inequality (6) in coding theory. Let a finite set of n input symbols be encoded using an alphabet of D symbols. It has been shown by Feinstein [13] that there is a uniquely decipherable code with lengths n_1, n_2, . . . , n_n if and only if the Kraft inequality holds, that is,

\[
\sum_{i=1}^{n} D^{-n_i} \le 1, \tag{16}
\]

where D is the size of the code alphabet. Furthermore, if

\[
L = \sum_{i=1}^{n} p_i n_i \tag{17}
\]

is the average codeword length, then for a code satisfying (16) the inequality L ≥ H(P) is also fulfilled, with equality if and only if n_i = −log_D p_i for every i; by suitably encoding long sequences of source symbols, the average length can be made arbitrarily close to H(P) (see Feinstein [13]). This is Shannon's noiseless coding theorem. By considering Rényi's entropy (see, e.g., [14]), a coding theorem analogous to the above noiseless coding theorem was established by Campbell [15], who obtained bounds for his mean length in terms of Rényi's entropy. Kieffer [16] defined a class of rules and showed that Campbell's mean length is the best decision rule for deciding which of two sources can be coded with the expected cost of sequences of length N when N → ∞, where the cost of encoding a sequence is assumed to be a function of its length only. Further, Jelinek [17] showed that coding with respect to Campbell's mean length is useful in minimizing the problem of buffer overflow, which occurs when source symbols are produced at a fixed rate and the code words are stored temporarily in a finite buffer. Concerning Campbell's mean length the reader may consult [15]. The mean codeword length (17) has been generalized parametrically by Campbell [15] and its bounds studied in terms of generalized measures of entropy. Here we give another generalization of (17) and study its bounds in terms of the generalized entropy of order ξ.
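To make these classical quantities concrete, here is a short Python sketch (our own illustration with hypothetical function names, not code from the paper): it verifies the Kraft inequality (16), compares the average codeword length (17) with the Shannon entropy, and also evaluates Campbell's exponentiated mean length, the kind of parametric generalization discussed above.

```python
import math

def kraft_sum(lengths, D=2):
    """Left-hand side of the Kraft inequality (16): sum of D**(-n_i)."""
    return sum(D ** -n for n in lengths)

def average_length(p, lengths):
    """Mean codeword length (17): L = sum of p_i * n_i."""
    return sum(pi * n for pi, n in zip(p, lengths))

def shannon_entropy(p, D=2):
    """Shannon entropy in base D: H(P) = -sum p_i log_D p_i."""
    return -sum(pi * math.log(pi, D) for pi in p if pi > 0)

def campbell_length(p, lengths, t, D=2):
    """Campbell's mean length of order t: (1/t) log_D sum p_i D**(t n_i)."""
    return math.log(sum(pi * D ** (t * n) for pi, n in zip(p, lengths)), D) / t

p = [0.5, 0.25, 0.125, 0.125]
lengths = [math.ceil(-math.log(pi, 2)) for pi in p]   # Shannon code lengths
assert kraft_sum(lengths) <= 1.0   # a uniquely decipherable code exists
print(average_length(p, lengths), shannon_entropy(p))  # L = H(P) here
print(campbell_length(p, lengths, t=0.5))              # Campbell's mean length
```

For this dyadic distribution the Shannon code lengths meet the lower bound exactly (L = H(P) = 1.75 bits), illustrating the equality case of the noiseless coding theorem.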
Generalized coding theorems obtained by considering different information measures under the condition of unique decipherability were investigated by several authors; see, for instance, the papers [6, 13, 18].
An investigation is carried out concerning discrete memoryless sources possessing an additional parameter ξ, which seems to be significant in problems of storage and transmission (see [9, 16–18]).
In this section we study a coding theorem by considering a new information measure depending on a parameter. Our motivation is, among others, that this quantity generalizes some information measures already existing in the literature, such as the Arndt [19] entropy, which is used in physics.

Definition 1. Let n ∈ N and ξ > 0 (ξ ≠ 1) be arbitrarily fixed; then the mean length L_ξ(P) corresponding to the generalized information measure H_ξ(P) is given by formula (21), where P = (p_1, . . . , p_n) ∈ Δ_n^* and D, n_1, n_2, . . . , n_n are positive integers satisfying (22). Since (22) reduces to the Kraft inequality (16) when ξ = 1, it is called the generalized Kraft inequality, and codes obtained under this generalized inequality are called personal codes.
From the left inequality of (32), multiplying both sides by p_i^{ξ−1} and then taking the sum over i, we get the generalized Kraft inequality (22). So there exists a generalized code with code lengths n_i, i = 1, . . . , n.
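In the classical case (ξ = 1) this existence claim is constructive: from any lengths satisfying the Kraft inequality one can build a prefix code explicitly. The sketch below implements that standard construction for a binary alphabet; it is our illustration of the underlying existence argument, not code specific to the generalized (personal) codes of this paper.

```python
def binary_prefix_code(lengths):
    """Kraft construction: given lengths n_i with sum of 2**(-n_i) <= 1,
    assign binary codewords so that no codeword is a prefix of another."""
    assert sum(2 ** -n for n in lengths) <= 1.0, "Kraft inequality violated"
    order = sorted(range(len(lengths)), key=lambda i: lengths[i])
    codes = [None] * len(lengths)
    code, prev_len = 0, 0
    for i in order:                          # shortest lengths first
        code <<= lengths[i] - prev_len       # descend to depth n_i in the code tree
        codes[i] = format(code, "0{}b".format(lengths[i]))
        code += 1                            # advance to the next free node
        prev_len = lengths[i]
    return codes

print(binary_prefix_code([1, 2, 3, 3]))  # ['0', '10', '110', '111']
```

The running integer `code`, read at width n_i, tracks the cumulative Kraft sum, so the inequality guarantees that a free codeword exists at every step.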

Conclusion
In this paper we prove a generalization of Shannon's inequality for the entropy of order ξ with the help of the Hölder inequality, and a noiseless coding theorem is proved. Considering Theorem 2, we remark that the optimal code lengths depend on ξ, in contrast with the optimal code lengths of Shannon, which do not depend on a parameter. However, it is possible to prove a coding theorem with respect to (3) such that the optimal code lengths are identical to those of Shannon.