Improvements of Jensen-Type Inequalities for Diamond-α Integrals

We give further improvements of the Jensen inequality and its converse on time scales, also allowing negative weights. These results generalize the Jensen inequality and its converse for both discrete and continuous cases. Further, we investigate the exponential and logarithmic convexity of the differences between the left-hand side and the right-hand side of these inequalities and present several families of functions for which these results can be applied.


Preliminaries
The combined dynamic derivative, also called the diamond-α (⬦_α) dynamic derivative (α ∈ [0, 1]), was introduced as a linear convex combination of the well-known delta and nabla dynamic derivatives on time scales. By a time scale T we mean any nonempty closed subset of the real numbers. Using the delta and nabla derivatives, the notions of delta and nabla integrals were defined (see [1]). We assume throughout this paper that the basic notions of time scales are well known and understood.
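For the reader's convenience, we recall the combination formulas behind these notions, standard in the time-scales literature (see [1]): for α ∈ [0, 1],

```latex
f^{\diamondsuit_\alpha}(t) = \alpha f^{\Delta}(t) + (1-\alpha) f^{\nabla}(t),
\qquad
\int_a^b f(t)\,\diamondsuit_\alpha t
  = \alpha \int_a^b f(t)\,\Delta t + (1-\alpha) \int_a^b f(t)\,\nabla t .
```

For α = 1 one recovers the delta case, and for α = 0 the nabla case.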
In the following lemma, Dinu [4] gives a characterization of α-SP weights for a nondecreasing function on time scales.
As shown in [4], all positive weights are α-SP weights, for any continuous function and every α ∈ [0, 1]. However, some α-SP weights are allowed to take negative values. The Jensen inequality on time scales, in which the weight function is allowed to take some negative values, is given in the following theorem.
Remark 7. Let the underlying function be nondecreasing. If T = N, then Theorem 6 is equivalent to the Jensen-Steffensen inequality given by Steffensen in [5] (see also [6, page 57]). On the other hand, if we take T = R in Theorem 6, we obtain the integral version of the Jensen-Steffensen inequality given by Boas [7] (see also [6, page 59]).
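To make the discrete case (T = N) of Remark 7 concrete, the following sketch checks the Jensen-Steffensen inequality for an illustrative nondecreasing sequence and weights with a negative entry; the helper name, data, and convex function are ours, for illustration, and are not taken from the paper.

```python
# Numerical check of the discrete Jensen-Steffensen inequality:
# if x is nondecreasing and the partial sums of w satisfy
# 0 <= w_1 + ... + w_k <= W := w_1 + ... + w_n for all k (with W > 0),
# then for every convex phi:
#   phi(sum(w_i x_i) / W) <= sum(w_i phi(x_i)) / W.

def jensen_steffensen_sides(w, x, phi):
    """Return (lhs, rhs) of the Jensen-Steffensen inequality."""
    W = sum(w)
    # Steffensen condition on the weights: partial sums stay in [0, W].
    partial = 0.0
    for wi in w:
        partial += wi
        assert 0 <= partial <= W, "weights violate the Steffensen condition"
    mean = sum(wi * xi for wi, xi in zip(w, x)) / W
    lhs = phi(mean)
    rhs = sum(wi * phi(xi) for wi, xi in zip(w, x)) / W
    return lhs, rhs

# Illustrative data: x nondecreasing, the middle weight negative.
w = [2.0, -1.0, 2.0]   # partial sums: 2, 1, 3 -- all in [0, 3]
x = [1.0, 2.0, 3.0]
lhs, rhs = jensen_steffensen_sides(w, x, phi=lambda t: t * t)
print(lhs, rhs)        # lhs = 4.0, rhs = 16/3 = 5.333..., so lhs <= rhs
```

Note that the negative weight is admissible precisely because the partial sums stay between 0 and the total weight.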
Considering the converse of the Jensen inequality, Dinu [4] gives the following definition of the α-Hermite-Hadamard (α-HH) weight, its characterization for a nondecreasing function on time scales, and an improvement of the converse of the Jensen inequality for some negative weights.
In the next result Dinu [4] gives the connection between these two classes of weights on a time scale.
In the following two sections of our paper we give some further generalizations of the Jensen-type inequalities on time scales allowing negative weights, and we also give mean value theorems of the Lagrange and Cauchy type for the functionals obtained by taking the difference of the left-hand side and the right-hand side of these new inequalities. These results also generalize the results given in [8] for the continuous and discrete cases. Section 4 of our paper deals with the exponential convexity and logarithmic convexity of the functionals obtained in the two previous sections. Finally, in Section 5 we present several families of exponentially convex functions which fulfil the conditions of our results. The results from Sections 4 and 5 generalize the results given in [9] for the continuous and discrete cases.

Improvement of the Jensen Inequality on Time Scales
Let a, b ∈ R, where a ≠ b. Consider the Green function G defined in (15). The function G is convex and continuous with respect to both of its variables.
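The display carrying (15) is not reproduced in this copy; in this line of research the Green function is usually the one associated with the two-point boundary value problem, so a plausible reconstruction of (15) reads

```latex
G(t,s)=
\begin{cases}
\dfrac{(t-b)(s-a)}{b-a}, & a \le s \le t,\\[2mm]
\dfrac{(s-b)(t-a)}{b-a}, & t \le s \le b,
\end{cases}
\qquad t,s\in[a,b].
```

For fixed t this function is piecewise linear in s with nondecreasing slope, hence convex, and it is continuous in both variables.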
In the following theorem, we give the generalization of the Jensen inequality on time scales, where negative weights are also allowed.
If the function Φ is also convex, then Φ''(s) ≥ 0 for all s ∈ [a, b], and hence it follows that inequality (17) holds for every convex function Φ ∈ C²([a, b], R). Moreover, it is not necessary to demand the existence of the second derivative of the function Φ (see [6, page 172]): the differentiability condition can be eliminated by using the fact that a continuous convex function can be uniformly approximated by convex polynomials. The last part of our theorem can be proved analogously. Moreover, statements (i) and (ii) are also equivalent if we change the sign of inequality in both statements (i) and (ii).
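The role of Φ'' in the argument above comes from an integral identity of the form Φ(x) = ((b−x)Φ(a) + (x−a)Φ(b))/(b−a) + ∫_a^b G(x,s) Φ''(s) ds, standard for the two-point Green function, which makes the sign of Φ'' visible. The snippet below (our reconstruction of G and illustrative data, not quoted from the paper) verifies this identity numerically.

```python
# Numerical check (midpoint rule) of the identity
#   phi(x) = ((b-x)*phi(a) + (x-a)*phi(b))/(b-a) + int_a^b G(x,s) phi''(s) ds
# for the standard two-point Green function G (assumed form of (15)).

def G(t, s, a, b):
    """Two-point Green function on [a, b] (nonpositive, convex in each variable)."""
    if s <= t:
        return (t - b) * (s - a) / (b - a)
    return (s - b) * (t - a) / (b - a)

def identity_rhs(phi, phi2, x, a, b, n=100_000):
    """Right-hand side of the identity, integral done by the midpoint rule."""
    linear = ((b - x) * phi(a) + (x - a) * phi(b)) / (b - a)
    h = (b - a) / n
    integral = sum(G(x, a + (k + 0.5) * h, a, b) * phi2(a + (k + 0.5) * h)
                   for k in range(n)) * h
    return linear + integral

a, b, x = 0.0, 1.0, 0.5
phi = lambda t: t ** 3    # convex on [0, 1]
phi2 = lambda t: 6.0 * t  # its second derivative
print(phi(x), identity_rhs(phi, phi2, x, a, b))  # both are approximately 0.125
```

Since G ≤ 0 here, the sign conventions of the resulting inequality depend on how (17) is stated; the identity itself is what the numerical check confirms.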
Remark 13. Consider (19). Suppose that the underlying function is nondecreasing and has a first derivative. Making the corresponding substitution in (19), and using that the function is nondecreasing, we conclude that if the resulting condition holds for all points of [a, b]_T, then inequality (17) holds for every continuous convex function Φ.
Combining the result from Theorem 11 with Theorem 6 and Lemma 5, we get the following two corollaries, in which the stated inequality holds for all points of [a, b] and G is defined in (15).
The stated conditions hold for all points of [a, b]_T if and only if the corresponding inequality holds for all points of [a, b], where G is defined in (15).
To shorten the notation in the sequel, under the assumptions of Theorem 11 we define the functional J₁(w, Φ) by (26), where the function Φ is defined on [a, b]. Clearly, if Φ is continuous and convex, then J₁(w, Φ) is nonnegative.
Theorem 16. Let J₁ be defined as in (26). Then there exists ξ ∈ [a, b] such that (27) holds. Proof. Since the function Φ'' is continuous and the remaining factor in (19) does not change sign on [a, b], applying the integral mean value theorem to (19) we get that there exists ξ ∈ [a, b] for which (28) holds. As in [12], it can be easily checked that (29) holds. Calculating the integral on the right-hand side of (29), we get the claim, and the proof is completed.
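Writing w for the weight argument of J₁, the conclusion above is a Lagrange-type mean value statement; since displays (27)-(30) are not reproduced in this copy, a plausible form, consistent with Remark 20 below (where Ψ(t) = t²/2 in Theorem 18 recovers this result), is

```latex
\mathcal{J}_1(w,\Phi) \;=\; \Phi''(\xi)\,
\mathcal{J}_1\!\left(w,\; t \mapsto \tfrac{t^2}{2}\right)
\qquad \text{for some } \xi \in [a,b].
```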
Remark 17. Theorem 16 can also be proved by using the following two convex functions. Since Φ₁ and Φ₂ are continuous and convex, the corresponding values of J₁ are nonnegative, and the conclusion of Theorem 16 follows. This also implies that (36) holds, provided that the denominator on the left-hand side of (36) is nonzero.
Proof. Let h be defined as the linear combination of the functions Φ and Ψ given by (37). Then h ∈ C²([a, b], R). Applying Theorem 16 to h, it follows that there exists ξ ∈ [a, b] such that (38) holds. After a short calculation we get that J₁(w, h) = 0, which is equivalent to (36).
Remark 19. In Theorem 18, if the inverse of the function Φ''/Ψ'' exists, then (36) gives (39).

Remark 20. Note that setting Ψ(t) = t²/2 in Theorem 18, we get the statement of Theorem 16.
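For orientation, with w denoting the weight argument of J₁, the Cauchy-type statement (36) of Theorem 18 and the mean of Remark 19 presumably take the familiar form (a reconstruction, since the displays are not reproduced in this copy):

```latex
\frac{\mathcal{J}_1(w,\Phi)}{\mathcal{J}_1(w,\Psi)}
  = \frac{\Phi''(\xi)}{\Psi''(\xi)},
\qquad
\xi = \left(\frac{\Phi''}{\Psi''}\right)^{-1}
      \!\left(\frac{\mathcal{J}_1(w,\Phi)}{\mathcal{J}_1(w,\Psi)}\right),
```

the latter defining a Cauchy-type mean whenever the inverse function exists.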

In particular, under the assumptions above there exists a point such that (36) holds.

Improvement of the Converse of the Jensen Inequality on Time Scales
Using a method similar to that of the previous section, in the following theorem we obtain a generalization of the converse of the Jensen inequality on time scales, where negative weights are also allowed.
Then, the following two statements are equivalent.

Furthermore, statements (i) and (ii) are also equivalent if we change the sign of inequality in both statements (i) and (ii).
Proof. The idea of the proof is very similar to the proof of Theorem 11.
If the function Φ is also convex, then Φ''(s) ≥ 0 for all s ∈ [a, b], and hence it follows that inequality (41) holds for every convex function Φ ∈ C²([a, b], R). Moreover, it is not necessary to demand the existence of the second derivative of the function Φ (see [6, page 172]): the differentiability condition can be eliminated by using the fact that a continuous convex function can be uniformly approximated by convex polynomials.
The last part of our theorem can be proved analogously.
Remark 24. Let the conditions of Theorem 23 hold. Then the following two statements are equivalent.
Suppose that the ⬦_α-integral of the weight over [a, b] is positive, that the underlying function is nondecreasing, and that it has a first derivative. Now, similarly as in [4], we derive the result from Lemma 9: making the corresponding substitution, and using that the function is nondecreasing, we obtain the stated condition. Proof. The proof follows directly from Theorem 10 and Corollary 26.
Under the assumptions of Theorem 23, we define the functional J₂(w, Φ) by (50), where the function Φ is defined on [a, b]. Clearly, if Φ is continuous and convex, then J₂(w, Φ) is nonnegative.

International Scholarly Research Notices
Calculating the integral on the right-hand side of (53), we get (54), and statement (51) of our theorem follows.
Remark 30. Note that (54) can also be expressed as (55).

Let J₂ be defined as in (50). Then there exists ξ ∈ [a, b] such that (56) holds, provided that the denominator on the left-hand side of (56) is nonzero.
Proof. The proof is very similar to the proof of Theorem 18.

Exponential and Logarithmic Convexity
First we recall some definitions and facts about exponentially convex and logarithmically convex functions (see, e.g., [13, 14] or [9]) which we need for our results. The following lemma is equivalent to the definition of a convex function (see [6, page 2]).
Lemma 38. If x₁, x₂, x₃ ∈ I are such that x₁ < x₂ < x₃, then the function f : I → R is convex if and only if the following inequality holds. We will also need the following result (see, e.g., [6]).
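The inequality of Lemma 38 is the classical three-point convexity criterion; presumably (the display is not reproduced in this copy) it reads (x₃ − x₂)f(x₁) + (x₁ − x₃)f(x₂) + (x₂ − x₁)f(x₃) ≥ 0. A quick numerical sketch, with a helper name and data of our own choosing:

```python
import math

def three_point_criterion(f, x1, x2, x3):
    """Return True iff f satisfies the convexity inequality of Lemma 38
    at points x1 < x2 < x3, i.e.
    (x3 - x2) f(x1) + (x1 - x3) f(x2) + (x2 - x1) f(x3) >= 0."""
    assert x1 < x2 < x3
    return (x3 - x2) * f(x1) + (x1 - x3) * f(x2) + (x2 - x1) * f(x3) >= 0

# exp is convex: the criterion holds at any increasing triple of points.
print(three_point_criterion(math.exp, 0.0, 1.0, 2.5))                 # True
# -exp is concave: the criterion fails at the same points.
print(three_point_criterion(lambda t: -math.exp(t), 0.0, 1.0, 2.5))   # False
```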

Proposition 39. If f : I → R is a convex function and x₁, x₂, y₁, y₂ ∈ I are such that x₁ ≤ y₁, x₂ ≤ y₂, x₁ ≠ x₂, and y₁ ≠ y₂, then the following inequality is valid. When dealing with functions with different degrees of smoothness, divided differences turn out to be very useful.
Definition 40. The second-order divided difference of a function f : I → R at mutually different points y₀, y₁, y₂ ∈ I is defined recursively by (61).

Remark 41. The value [y₀, y₁, y₂; f] is independent of the order of the points y₀, y₁, and y₂. This definition may be extended to include the case in which some or all of the points coincide (see [6, page 16]). Namely, taking the limit y₁ → y₀ in (61), we obtain (62), provided that f′ exists; furthermore, taking the limits yᵢ → y₀, i = 1, 2, in (61), we obtain (63), provided that f″ exists. A function f : I → R is convex if and only if [y₀, y₁, y₂; f] ≥ 0 holds for every choice of three mutually different points y₀, y₁, y₂ ∈ I.

We now use an idea from [15] to give an elegant method of producing n-exponentially convex and exponentially convex functions by applying the functionals J₁ and J₂ to a given family of functions with the same property.
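The recursion of Definition 40 is standard: [y₀, y₁; f] = (f(y₁) − f(y₀))/(y₁ − y₀) and [y₀, y₁, y₂; f] = ([y₁, y₂; f] − [y₀, y₁; f])/(y₂ − y₀). The following sketch (helper names are ours) checks the two properties stated in Remark 41 numerically:

```python
import math

def divdiff1(f, y0, y1):
    """First-order divided difference [y0, y1; f]."""
    return (f(y1) - f(y0)) / (y1 - y0)

def divdiff2(f, y0, y1, y2):
    """Second-order divided difference [y0, y1, y2; f] (recursive form)."""
    return (divdiff1(f, y1, y2) - divdiff1(f, y0, y1)) / (y2 - y0)

# Symmetry: the value does not depend on the order of the points.
a = divdiff2(math.exp, 0.0, 1.0, 3.0)
b = divdiff2(math.exp, 3.0, 0.0, 1.0)
print(abs(a - b) < 1e-12)   # True

# Nonnegativity characterizes convexity: exp is convex, so [., ., .; exp] >= 0.
print(a >= 0)               # True
```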
The following corollary is an immediate consequence of Theorem 42. (ii) If p ↦ J(w, Φ_p) is continuous on the parameter interval, then it is also 2-exponentially convex there. If p ↦ J(w, Φ_p) is additionally strictly positive, then it is also log-convex, and for parameters r, s, t in the interval such that r < s < t, we have (67) for Φ_r, Φ_t ∈ Ω.
Proof. Statement (i) and the first part of (ii) are immediate consequences of Theorem 42. If p ↦ J(w, Φ_p) is continuous and strictly positive, its log-convexity is an immediate consequence of Remark 37. Now, applying Lemma 38 to the function f(p) = log J(w, Φ_p) and to r, s, t (r < s < t), we obtain an inequality which is equivalent to inequality (67).
To prove (iii), let p ↦ J(w, Φ_p) be strictly positive and differentiable, and therefore continuous too. By (ii), the function p ↦ J(w, Φ_p) is log-convex on the parameter interval; that is, the function p ↦ log J(w, Φ_p) is convex there, and by Proposition 39 we obtain (70) for p ≤ u, q ≤ v, p ≠ q, and u ≠ v, concluding that (71) holds. The cases p = u and q = v follow from (71) as limit cases.
Remark 45. Note that the results from Theorem 42, Corollary 43, and Corollary 44 still hold when two of the points y₀, y₁, y₂ ∈ [a, b] coincide, for a family of differentiable functions Φ_p such that the function p ↦ [y₀, y₁, y₂; Φ_p] is n-exponentially convex in the Jensen sense (exponentially convex in the Jensen sense, log-convex in the Jensen sense); furthermore, they still hold when all three points coincide, for a family of twice differentiable functions with the same property. The proofs are obtained by recalling Remark 41 and a suitable characterization of convexity.

Examples
In this section we vary the choice of a family Ω = {Φ_p : p ∈ I} in order to construct different examples of exponentially convex functions and to construct some means.
Example 46. Consider a family of functions defined as follows. We have (d²/dt²)Φ_p(t) = e^{pt} > 0, which shows that Φ_p is convex on R for every p ∈ R. From Remark 36 it follows that p ↦ (d²/dt²)Φ_p(t) is exponentially convex. Therefore, p ↦ [y₀, y₁, y₂; Φ_p] is exponentially convex (see [15]), and so exponentially convex in the Jensen sense. Now, using Corollary 43, we conclude that the mappings p ↦ Jᵢ(w, Φ_p) (i = 1, 2) are exponentially convex in the Jensen sense. It is easy to verify that these mappings are continuous, so they are exponentially convex.
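The decisive property in Example 46, exponential convexity of p ↦ e^{pt}, can be made concrete: in the Jensen sense it amounts to nonnegativity of the quadratic form Σ_{i,j} ξ_i ξ_j e^{((p_i+p_j)/2)t}, which collapses to the perfect square (Σ_i ξ_i e^{p_i t/2})². An illustrative check (helper name and data are ours):

```python
import math

def exp_convexity_form(t, ps, xis):
    """Quadratic form sum_{i,j} xi_i xi_j exp(((p_i + p_j)/2) * t),
    whose nonnegativity for all choices of (p_i), (xi_i) is exponential
    convexity of p -> exp(p*t) in the Jensen sense."""
    return sum(xi * xj * math.exp(0.5 * (pi + pj) * t)
               for pi, xi in zip(ps, xis)
               for pj, xj in zip(ps, xis))

# The form collapses to a perfect square, hence is nonnegative:
t = 0.7
ps = [-1.0, 0.5, 2.0]
xis = [1.0, -3.0, 1.5]
q = exp_convexity_form(t, ps, xis)
square = sum(xi * math.exp(0.5 * p * t) for p, xi in zip(ps, xis)) ** 2
print(q >= 0, abs(q - square) < 1e-9)   # True True
```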
For this family of functions, M_{p,q}(w, Jᵢ, Ω) from (69) becomes the corresponding expression, and using (68) we conclude that it is monotonic in the parameters p and q.
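For completeness, the means M_{p,q}(w, Jᵢ, Ω) typically read as follows in this framework (a reconstruction, since display (69) is not reproduced in this copy): for p ≠ q,

```latex
M_{p,q}(w,\mathcal{J}_i,\Omega)
  = \left(\frac{\mathcal{J}_i(w,\Phi_p)}{\mathcal{J}_i(w,\Phi_q)}\right)^{\frac{1}{p-q}},
\qquad i = 1, 2,
```

with p = q handled by the usual limit; monotonicity in p and q then follows from the log-convexity inequality of Corollary 43.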