Comparative Study of Generalized Quantitative-Qualitative Inaccuracy Fuzzy Measures for Noiseless Coding Theorem and 1:1 Codes

In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is pursued in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We have to consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. In this paper, we introduce a mean codeword length of order α and type β for 1:1 codes and analyze the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, a noiseless coding theorem associated with the fuzzy information measure is established.


Introduction
In the 1940s, a new branch of mathematics, Information Theory, was introduced. Information Theory considers the problems of how to process, store, and retrieve information, and hence how to make decisions. It deals with the study of how information can be transmitted over communication channels. In 1924 and 1928, Nyquist [1, 2] first studied information measures, and Hartley [3] observed that information measures are logarithmic in nature. New properties of information sources and communication channels were published by Shannon [4] in his research paper "A Mathematical Theory of Communication." Wiener [5] independently obtained results similar to Shannon's. A large number of applications of Information Theory have been developed in various fields of the social, physical, and biological sciences, for example, economics, statistics, accounting, language, psychology, ecology, pattern recognition, computer science, and fuzzy sets. Later, Rényi [6] introduced the entropy of order α, of which Shannon entropy is the limiting case as α → 1.
Fuzzy set theory was introduced by Zadeh [7]. It relates to the uncertainty occurring in human cognitive processes. Fuzzy set theory has found applications in various fields of engineering, the biological sciences, and business. Zadeh introduced fuzzy entropy, a measure of fuzzy information based on Shannon's entropy.
Let A be a fuzzy set in the universe of discourse Ω. We define the membership function μ_A : Ω → [0, 1], where μ_A(x) is the degree of membership of each x ∈ Ω in A.
The following properties, which fuzzy entropy must satisfy, were given by De Luca and Termini [8].
(1) Fuzzy entropy is minimum if and only if the set is crisp.
(2) Fuzzy entropy is maximum when every membership value is 0.5.
(3) Fuzzy entropy decreases if the set is sharpened.
(4) Fuzzy entropy of a set is the same as that of its complement.
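The four properties above can be checked numerically with the standard De Luca-Termini entropy H(A) = −∑ [μ_A(x_i) log μ_A(x_i) + (1 − μ_A(x_i)) log(1 − μ_A(x_i))]; the following is a minimal sketch (function name and sample sets are ours, not from the paper):

```python
import math

def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy of a fuzzy set given by its
    membership values mu (base-2 logarithm)."""
    h = 0.0
    for m in mu:
        for p in (m, 1.0 - m):
            if p > 0.0:           # 0 * log 0 is taken as 0
                h -= p * math.log2(p)
    return h

crisp = [0.0, 1.0, 1.0, 0.0]          # crisp set
maximal = [0.5, 0.5, 0.5, 0.5]        # maximally fuzzy set
fuzzy = [0.3, 0.8, 0.6, 0.1]
sharpened = [0.2, 0.9, 0.7, 0.05]     # each mu pushed toward 0 or 1
complement = [1.0 - m for m in fuzzy]

print(fuzzy_entropy(crisp))            # 0.0           (property 1)
print(fuzzy_entropy(maximal))          # 4.0, maximum  (property 2)
print(fuzzy_entropy(sharpened) < fuzzy_entropy(fuzzy))              # True (3)
print(math.isclose(fuzzy_entropy(complement), fuzzy_entropy(fuzzy)))  # True (4)
```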
In coding theory, we study various properties of codes for application in data compression, cryptography, error correction, and network coding. The study of codes is pursued in Information Theory, electrical engineering, mathematics, and computer science for the reliable and efficient transmission of data. We have to consider how messages can be encoded efficiently so that the maximum number of messages can be sent over a noiseless channel in a given time. Thus, the minimum value of the mean codeword length subject to a given constraint on codeword lengths has to be found. Let X be a random variable taking values x_i, i = 1, 2, ..., n, that generates the messages to be transmitted, and let {a_1, a_2, ..., a_D} be the code alphabet; the finite sequence of alphabet symbols assigned to each x_i is called its codeword. The codeword length l_i associated with x_i satisfies Kraft's inequality

∑_{i=1}^{n} D^{−l_i} ≤ 1,

where D is the size of the alphabet. Further, codes are chosen to minimize the average codeword length

L = ∑_{i=1}^{n} p_i l_i,

where p_i is the probability of occurrence of x_i. Now, Shannon's [4] noiseless coding theorem for uniquely decipherable codes is obtained by finding the lower bound on L in terms of Shannon's entropy H(P), namely H(P) ≤ L < H(P) + 1.
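Kraft's inequality and the average codeword length can be evaluated directly; a minimal sketch (function names are ours) for a binary prefix code that attains Shannon's lower bound:

```python
import math

def kraft_sum(lengths, D=2):
    """Left-hand side of Kraft's inequality: sum of D**(-l_i)."""
    return sum(D ** (-l) for l in lengths)

def mean_length(probs, lengths):
    """Average codeword length L = sum of p_i * l_i."""
    return sum(p * l for p, l in zip(probs, lengths))

def shannon_entropy(probs, D=2):
    """Shannon entropy H(P) in base D."""
    return -sum(p * math.log(p, D) for p in probs if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]                # lengths of a binary prefix code

print(kraft_sum(lengths))             # 1.0 -> Kraft's inequality holds
print(mean_length(probs, lengths))    # 1.75
print(shannon_entropy(probs))         # ~1.75: L attains the lower bound H(P)
```

For dyadic probabilities, as here, the optimal lengths l_i = −log_D p_i are integers and L equals H(P) exactly.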
Campbell [9] proposed a special exponentiated mean codeword length of order α for uniquely decipherable codes, given by

L_α = (α / (1 − α)) log_D ( ∑_{i=1}^{n} p_i D^{((1−α)/α) l_i} ), α > 0, α ≠ 1,

and proved a noiseless coding theorem: for an optimal code, L_α lies between H_α(P) and H_α(P) + 1, where

H_α(P) = (1 / (1 − α)) log_D ( ∑_{i=1}^{n} p_i^α )

is Rényi's measure of entropy of order α. As α → 1, it is easily shown that L_α → L and H_α(P) approaches H(P).
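Campbell's length and Rényi's entropy, in the standard forms just given, can be compared numerically; a sketch (function names are ours), reusing the dyadic example above:

```python
import math

def renyi_entropy(probs, alpha, D=2):
    """Renyi entropy of order alpha (alpha > 0, alpha != 1), base D."""
    return math.log(sum(p ** alpha for p in probs), D) / (1.0 - alpha)

def campbell_length(probs, lengths, alpha, D=2):
    """Campbell's exponentiated mean codeword length of order alpha:
    (alpha/(1-alpha)) * log_D sum(p_i * D**(((1-alpha)/alpha) * l_i))."""
    t = (1.0 - alpha) / alpha
    return math.log(sum(p * D ** (t * l) for p, l in zip(probs, lengths)), D) / t

probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]
alpha = 0.5

L_a = campbell_length(probs, lengths, alpha)
H_a = renyi_entropy(probs, alpha)
print(H_a <= L_a < H_a + 1)           # True: Campbell's bounds hold here

# As alpha -> 1 the length approaches the ordinary mean length L = 1.75.
print(campbell_length(probs, lengths, 0.999999))
```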
For uniquely decipherable codes, a weighted average codeword length was given by Guiasu and Picard [10]:

L_u = ( ∑_{i=1}^{n} u_i p_i l_i ) / ( ∑_{i=1}^{n} u_i p_i ).

This is defined as the average cost of transmitting the letters x_i with probabilities p_i and utilities u_i.
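The weighted length above is a ratio of utility-weighted sums; a brief sketch (function name and sample utilities are ours):

```python
def useful_mean_length(probs, utils, lengths):
    """Guiasu-Picard weighted (useful) mean codeword length:
    sum(u_i * p_i * l_i) / sum(u_i * p_i)."""
    num = sum(u * p * l for u, p, l in zip(utils, probs, lengths))
    den = sum(u * p for u, p in zip(utils, probs))
    return num / den

probs = [0.5, 0.25, 0.125, 0.125]
utils = [1.0, 2.0, 1.0, 0.5]     # utility attached to each message
lengths = [1, 2, 3, 3]

# With equal utilities the measure reduces to the ordinary mean length L.
print(useful_mean_length(probs, [1.0] * 4, lengths))   # 1.75
print(useful_mean_length(probs, utils, lengths))       # pulled toward l_2 = 2
```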
A quantitative-qualitative measure of entropy was defined by Belis and Guiasu [11], for which Longo [12] gave the lower bound of the useful mean codeword length. Further, a noiseless coding theorem was verified by Guiasu and Picard by introducing a lower bound for the useful mean codeword length, and Gurdial and Pessoa [13] proved a noiseless coding theorem by giving lower bounds for useful mean codeword lengths of order α in terms of useful measures of information of order α. The information measure introduced by Belis and Guiasu is

H(P; U) = − ∑_{i=1}^{n} u_i p_i log p_i.

Similar quantitative-qualitative information measures were given by Taneja and Tuteja and by Bhaker and Hooda. Now, Baig and Dar [14, 15] proposed a new fuzzy information measure of order α and type β, and for this information measure they introduced a corresponding mean codeword length of order α and type β. Choudhary and Kumar [16] proved some noiseless coding theorems on generalized R-Norm entropy. Also, Choudhary and Kumar [17] proposed some coding theorems on generalized Havrda-Charvát and Tsallis entropy. Baig and Dar [14, 15] introduced a few coding theorems on a fuzzy entropy function depending upon the parameters R and V and gave a fuzzy coding theorem on a generalized fuzzy cost measure. Taneja and Bhatia [18] proposed a generalized mean codeword length for the best 1:1 code, and Parkash and Sharma [19, 20] proved some noiseless coding theorems corresponding to fuzzy entropies and introduced a new class of fuzzy coding theorems. Parkash [21] introduced a new parametric measure of fuzzy entropy. Gupta et al. [22] proposed 1:1 codes for a generalized quantitative-qualitative measure of inaccuracy.
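The widely quoted Belis-Guiasu form H(P; U) = −∑ u_i p_i log p_i can be sketched as follows (function name is ours; this is an illustration, not code from the paper):

```python
import math

def belis_guiasu(probs, utils, D=2):
    """'Useful' information measure H(P; U) = -sum(u_i * p_i * log_D p_i),
    the widely quoted Belis-Guiasu quantitative-qualitative form."""
    return -sum(u * p * math.log(p, D)
                for p, u in zip(probs, utils) if p > 0)

probs = [0.5, 0.25, 0.125, 0.125]
utils = [1.0, 2.0, 1.0, 0.5]

# With all utilities equal to 1 the measure reduces to Shannon entropy.
print(belis_guiasu(probs, [1.0] * 4))   # ~1.75
print(belis_guiasu(probs, utils))       # ~2.0625, weighting the useful messages
```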
Jain and Tuteja [23] introduced a coding theorem connected with useful entropy of order α and type β. Tuli [24] introduced mean codeword lengths and their correspondence with entropy measures. Tuli and Sharma [25] proved some new coding theorems and consequently developed some new weighted fuzzy mean codeword lengths corresponding to the well-known measures of weighted fuzzy entropy.
In the next section, we prove the fuzzy noiseless coding theorem for 1:1 codes of binary size and hence show that 1:1 codes are less constrained than uniquely decipherable codes.
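That 1:1 codes are less constrained can be seen from the standard construction of the best binary 1:1 code, which assigns the i-th most probable message the i-th shortest binary string; the resulting lengths violate Kraft's inequality, and the mean length can fall below the Shannon entropy bound that binds uniquely decipherable codes. A sketch (function name is ours):

```python
import math

def best_one_to_one_lengths(n):
    """Codeword lengths of the best binary 1:1 code: there are 2**l binary
    strings of length l, so the i-th shortest string has length
    l_i = floor(log2(i + 1))."""
    return [math.floor(math.log2(i + 1)) for i in range(1, n + 1)]

probs = sorted([0.5, 0.25, 0.125, 0.125], reverse=True)
lengths = best_one_to_one_lengths(len(probs))
print(lengths)                                     # [1, 1, 2, 2]

L = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
kraft = sum(2.0 ** (-l) for l in lengths)
print(kraft > 1.0)   # True: Kraft's inequality is violated
print(L < H)         # True: mean 1:1 length lies below the entropy bound
```

Here L = 1.25 while H(P) = 1.75, so the 1:1 code beats the lower bound that holds for every uniquely decipherable code.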

Conclusion
In modern data transmission and storage systems, the key ingredients that help in achieving a high degree of reliability are error correcting codes. The noiseless channel poses the problem of efficient coding of messages, and hence we have to maximize the number of messages that can be sent through a channel in a given time. Thus, we have introduced a mean codeword length of order α and type β for 1:1 codes and analyzed the relationship between the average codeword length and fuzzy information measures for binary 1:1 codes. Further, noiseless coding theorems associated with the fuzzy information measure have been established.

Future Research Endeavors
Decision-making problems in general utilize information from the knowledge base of experts in view of their perception of the system. Besides mathematical studies, algorithms for application in decision-making can also be discussed as an extension of the above research. Other fuzzy information measures can likewise be examined in the light of the fuzzy noiseless theorem and 1:1 codes of binary size, and the resulting theorems proved and compared. Further, this fuzzy noiseless coding theorem can be utilized for characterizing and applying R-Norm entropy measures.