Refinements of Jensen’s Inequality via Majorization Results with Applications in the Information Theory

In this study, we present some new refinements of the Jensen inequality with the help of majorization results. We use the concept of convexity along with the theory of majorization and obtain refinements of the Jensen inequality. Moreover, as consequences of the refined Jensen inequality, we derive some bounds for power means and quasiarithmetic means. Furthermore, as applications of the refined Jensen inequality, we give some bounds for divergences, Shannon entropy, and various distances associated with probability distributions.


Introduction
The Jensen inequality is one of the most significant and fundamental inequalities in the current literature of mathematical inequalities. The Jensen inequality owes much of its importance to the fact that many inequalities, such as the Hölder inequality, Minkowski's inequality, Ky Fan's inequality, Levinson's inequality, and the Hermite-Hadamard inequality, can be obtained from it. Also, many problems arising in different fields of science can be solved with the help of the Jensen inequality [1][2][3]. Moreover, several applications of the Jensen inequality have been observed in the estimation of several divergences and the well-known Zipf-Mandelbrot entropy [4][5][6][7][8][9]. Due to its wide significance, many researchers have given extensions, improvements, generalizations, and refinements of this inequality. Furthermore, this inequality has also been used as a foundation for exploring generalized convex functions such as the s-convex, η-convex, coordinate convex, and h-convex functions. The discrete form of the Jensen inequality is given in the following theorem.

Theorem 1. Assume that Ψ: I ⟶ R is a convex function (I is an arbitrary interval), ζ_k ∈ I and η_k ≥ 0 for all k ∈ {1, 2, ..., n}, with η = ∑_{k=1}^{n} η_k > 0; then

Ψ((1/η) ∑_{k=1}^{n} η_k ζ_k) ≤ (1/η) ∑_{k=1}^{n} η_k Ψ(ζ_k). (1)

If the function Ψ is concave, then (1) is valid in the contrary direction.
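A small numerical check makes the statement of Theorem 1 concrete. The following sketch (the helper name jensen_gap is ours, not from the paper) compares Ψ of the weighted mean with the weighted mean of Ψ for the convex function Ψ(x) = e^x:

```python
import math

def jensen_gap(psi, zeta, eta):
    """Return (psi(weighted mean), weighted mean of psi) for weights eta."""
    H = sum(eta)
    mean = sum(w * z for w, z in zip(eta, zeta)) / H
    lhs = psi(mean)                                       # Ψ of the weighted mean
    rhs = sum(w * psi(z) for w, z in zip(eta, zeta)) / H  # weighted mean of Ψ
    return lhs, rhs

# Ψ(x) = e^x is convex, so Ψ of the mean must not exceed the mean of Ψ
lhs, rhs = jensen_gap(math.exp, [0.5, 1.0, 2.0], [1.0, 2.0, 3.0])
assert lhs <= rhs
```

For a concave Ψ (e.g. math.log), the same helper exhibits the reversed inequality, as noted after (1).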
One of the most attractive features of the Jensen inequality is that it generalizes the definition of the convex function. Recently, many researchers have devoted their attention to the Jensen inequality and contributed to it in various directions. In 2006, Dragomir [10] presented new refinements of the Jensen inequality by using a convex function defined on a linear space. He also gave some applications of the obtained results in information theory. Pečarić and Perić [11] established some improvements of the converse of the Jensen inequality for linear functionals, and some applications of the obtained results were also presented therein. Xiao et al. [12] obtained a new extension and refinement of the Jensen inequality for midconvex functions by using the arithmetic means of the given function.
In the remaining portion of the present section, we introduce some fundamental theory and essential results of majorization, which will help us in obtaining our new results.
Majorization is a relation between two vectors which expresses that the components of one vector are "more nearly equal" than the components of the other. Due to this property, the concept of majorization has become a focal point for researchers in different fields. Many mathematical inequalities have been established with the help of majorization [13]. Over the years, the concept of majorization has been applied widely as a powerful tool in pure and applied mathematics, as well as in some other related fields. Majorization also plays an extremely significant role in the theory of matrices. Several matrix inequalities and norm inequalities have been deduced from the majorization relations among the eigenvalues and singular values of matrices [14]. For some literature related to majorization theory, refer to [15][16][17][18].
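The majorization relation s₂ ≺ s₁ described above can be tested numerically: sort both tuples in decreasing order, require equal totals, and require every partial sum of s₁ to dominate the corresponding partial sum of s₂. A minimal sketch (the helper name majorizes is ours):

```python
def majorizes(x, y, tol=1e-12):
    """Return True if x majorizes y (i.e., y ≺ x): equal totals, and the
    descending partial sums of x dominate those of y."""
    xs = sorted(x, reverse=True)
    ys = sorted(y, reverse=True)
    if len(xs) != len(ys) or abs(sum(xs) - sum(ys)) > tol:
        return False
    px = py = 0.0
    for a, b in zip(xs, ys):
        px += a
        py += b
        if px < py - tol:
            return False
    return True

# (3, 1, 0) majorizes (2, 1, 1): the components of (2, 1, 1) are "more nearly equal"
assert majorizes([3, 1, 0], [2, 1, 1])
assert not majorizes([2, 1, 1], [3, 1, 0])
```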

Theorem 4.
Suppose that a function Ψ: I ⟶ R is convex, s_1 = (ζ_1, ζ_2, ..., ζ_n) and s_2 = (ϑ_1, ϑ_2, ..., ϑ_n) are any two n-tuples such that ζ_k, ϑ_k ∈ I (k = 1, 2, ..., n), and s_3 = (η_1, η_2, ..., η_n) is a nonnegative real n-tuple. Also, suppose that condition (7) is satisfied. If s_2 and s_1 − s_2 are monotonic in the same direction, then inequality (9) holds.

Remark 1. Dragomir also obtained inequality (9) under the stricter assumption that Ψ is an increasing convex function, while using the relaxed condition ∑_{k=1}^{n} η_k ϑ_k ≤ ∑_{k=1}^{n} η_k ζ_k instead of (7).

The majorization inequalities have been generalized, extended, and refined in many directions. Adil Khan et al. [22] extended the majorization-type inequalities from convex functions defined on intervals to convex functions defined on rectangles. They used the main result to obtain some extensions of the weighted Favard's inequality. Wu et al. [23] obtained some refinements of majorization-type inequalities with the help of Taylor's theorem with the mean value form of the remainder and also presented an application. Ullah et al. [24] established some improvements and generalizations of majorization-type inequalities with the help of strongly convex functions and also gave some applications of the obtained results.

The aim of this study is to establish some refinements of the famous Jensen inequality with the help of majorization theory. We use some basic and fundamental results of majorization to obtain new refinements of the Jensen inequality. We also establish refined inequalities for different means. As applications, we obtain new bounds for the Csiszár divergence, Kullback-Leibler divergence, and Shannon entropy. Furthermore, we also establish bounds for various distances, the Bhattacharyya coefficient, and the triangular discrimination.
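Since conditions (7) and (9) are not reproduced above, the classical (unweighted) conclusion behind results of this kind can still be illustrated: whenever (ϑ_1, ..., ϑ_n) ≺ (ζ_1, ..., ζ_n) and Ψ is convex, ∑ Ψ(ϑ_k) ≤ ∑ Ψ(ζ_k). The sketch below (helper name ours) checks this Hardy-Littlewood-Pólya form with Ψ(t) = t²:

```python
def majorization_inequality_holds(psi, zeta, vartheta):
    """Check the Hardy-Littlewood-Polya conclusion
    sum psi(vartheta_k) <= sum psi(zeta_k), expected when vartheta ≺ zeta."""
    return sum(psi(v) for v in vartheta) <= sum(psi(z) for z in zeta)

psi = lambda t: t * t  # a convex function
# (2, 1, 1) ≺ (3, 1, 0): the sum of squares may only grow toward (3, 1, 0)
assert majorization_inequality_holds(psi, [3, 1, 0], [2, 1, 1])
```

Weighted versions such as Theorem 4 replace the plain sums with η-weighted sums under condition (7).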

Main Results
We begin this section with the following result, in which we obtain a refinement of the Jensen inequality with the help of Theorem 2.
Theorem 5. Let Ψ: I ⟶ R be a convex function, and let s_2 ≺ s_1; then the chain of inequalities (10) holds. If Ψ is a concave function, then the inequalities given in (10) are valid in the opposite direction.
Proof. Since s_2 ≺ s_1, using the second condition of majorization, we have (11). Now, applying Jensen's inequality to the right-hand side of (11), we get (12). Applying the definition of the convex function to the right-hand side of (12), we obtain (13). Next, applying Jensen's inequality to the first term on the right-hand side of (13), we acquire (14). As s_2 ≺ s_1, using the majorization theorem, we have (15). Now, using (15) in (14), we obtain (16). From (12)-(14) and (16), we obtain (10).
Replacing Ψ with −Ψ, we deduce the reversed inequalities for the case where Ψ is concave. This completes the proof of Theorem 5.
Proof. By using (7), we may write (20). Using the Jensen inequality on the right-hand side of (20), we obtain (21). Now, applying the definition of the convex function to the right-hand side of (21), we get (22). Next, applying the Jensen inequality to the first term on the right-hand side of (22), we acquire (23). Since all of the conditions of Theorem 3 are satisfied under the present assumptions, applying Theorem 3 gives (24). Now, using (24), we obtain (25). From (21)-(23) and (25), inequality (19) follows.
Replacing Ψ with −Ψ, we deduce the reversed inequalities for the case where Ψ is concave. The proof of Theorem 6 is complete.

Applications to Inequalities Involving Special Means
In this section, we give some applications of Theorem 6 for power means and quasiarithmetic means.

Proof.
Using (19) for this implies that which is equivalent to (48).

Applications to Information Theory
Information theory is the mathematical treatment of the concepts, framework, and rules governing the transmission of messages through communication systems. Information theory has been used very extensively in telecommunication systems, and it also plays a great role in linguistics, for instance in the study of reading speed, word length, and word frequency. In this section, we give some meaningful applications of our main results in the area of information theory. We establish bounds for different divergences and the Shannon entropy. We also establish bounds for various distances, the Bhattacharyya coefficient, and the triangular discrimination.
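The quantities bounded in this section admit short direct implementations. In the sketch below (helper names ours), the Csiszár f-divergence is ∑ q_k f(p_k/q_k) for positive tuples p and q; the Kullback-Leibler divergence arises from the generator f(t) = t log t, and its nonnegativity is precisely an instance of the Jensen inequality:

```python
import math

def csiszar_divergence(f, p, q):
    """Csiszár f-divergence: sum q_k f(p_k / q_k), for positive tuples p, q."""
    return sum(qk * f(pk / qk) for pk, qk in zip(p, q))

def kl_divergence(p, q):
    # Kullback-Leibler divergence is the Csiszár divergence with f(t) = t log t
    return csiszar_divergence(lambda t: t * math.log(t), p, q)

def shannon_entropy(p):
    return -sum(pk * math.log(pk) for pk in p)

def bhattacharyya_coefficient(p, q):
    return sum(math.sqrt(pk * qk) for pk, qk in zip(p, q))

def triangular_discrimination(p, q):
    return sum((pk - qk) ** 2 / (pk + qk) for pk, qk in zip(p, q))

p, q = [0.2, 0.3, 0.5], [0.3, 0.3, 0.4]
assert kl_divergence(p, q) >= 0                      # Jensen: D(p || q) >= 0
assert abs(kl_divergence(p, p)) < 1e-12              # zero at p = q
assert bhattacharyya_coefficient(p, q) <= 1 + 1e-12  # Cauchy-Schwarz bound
```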