Three-Parameter Logarithm and Entropy

A three-parameter logarithmic function is derived using the notion of the q-analogue and an ansatz technique. The derived three-parameter logarithm is shown to generalize the two-parameter logarithmic function of Schwämmle and Tsallis, the latter being the limit of the former as the added parameter goes to 1. The inverse of the three-parameter logarithm and other important properties are also proved. A three-parameter entropic function is then defined and is shown to be analytic, and hence Lesche-stable, as well as concave and convex in certain ranges of the parameters.


Introduction
The concept of entropy provides deep insight into the direction of spontaneous change for many everyday phenomena. For example, a block of ice placed on a hot stove surely melts, while the stove grows cooler. Such a process is called irreversible because no slight change will cause the melted water to turn back into ice while the stove grows hotter [7]. The concept of entropy was first introduced by the German physicist Rudolf Clausius as a precise way of expressing the second law of thermodynamics.

The Boltzmann equation for entropy is
\[ S = k_B \ln \omega, \]
where $k_B$ is the Boltzmann constant [10] and $\omega$ is the number of different ways, or microstates, in which the energy of the molecules in a system can be arranged on energy levels [9]. The Boltzmann entropy plays a crucial role in the foundation of statistical mechanics and other branches of science [5].
The Boltzmann–Gibbs–Shannon entropy [13,14] is given by
\[ S = -k \sum_{i=1}^{\omega} p_i \ln p_i, \]
where $\sum_{i=1}^{\omega} p_i = 1$. Systems presenting long-range interactions and/or long-duration memory have been shown to be poorly described by Boltzmann–Gibbs statistics. Examples may be found in gravitational systems, Lévy flights, fractals, turbulence physics, and economics. In an attempt to deal with such systems, Tsallis [15] postulated a nonextensive entropy which generalizes the Boltzmann–Gibbs entropy through an entropic index $q$ [3]. Another generalization was suggested by Rényi [11], and Abe [1] proposed a method of generating entropy functionals.
Tsallis' q-entropy [15] is given by
\[ S_q = k\, \frac{1 - \sum_{i=1}^{\omega} p_i^q}{q - 1} = k \sum_{i=1}^{\omega} p_i \ln_q \frac{1}{p_i}, \]
where
\[ \ln_q x = \frac{x^{1-q} - 1}{1 - q}, \qquad (1.6) \]
which is referred to as the q-logarithm. If $p_i = \frac{1}{\omega}$ for all $i$, then
\[ S_q = k \ln_q \omega. \qquad (1.7) \]
The inverse of the q-logarithm is the q-exponential
\[ e_q^x = \left[ 1 + (1-q)x \right]_+^{\frac{1}{1-q}}, \qquad (1.8) \]
where $[\cdots]_+$ is zero if its argument is nonpositive.
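As a quick numerical illustration of the definitions (1.6) and (1.8), the q-logarithm and q-exponential can be sketched in a few lines of Python (the function names `ln_q` and `exp_q` are ours, not from the paper):

```python
import math

def ln_q(x, q):
    """q-logarithm (1.6): (x^(1-q) - 1)/(1-q); the ordinary log at q = 1."""
    if q == 1:
        return math.log(x)
    return (x**(1 - q) - 1) / (1 - q)

def exp_q(x, q):
    """q-exponential (1.8), the inverse of ln_q; [.]_+ clips to zero."""
    if q == 1:
        return math.exp(x)
    base = 1 + (1 - q) * x
    if base <= 0:
        return 0.0
    return base**(1 / (1 - q))

# the q-exponential inverts the q-logarithm
x, q = 2.5, 0.7
assert abs(exp_q(ln_q(x, q), q) - x) < 1e-12
```

The clipping `[.]_+` matters only when the argument of the q-exponential leaves the deformed domain; for the round-trip above it never triggers.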
A q-sum and a q-product, and their calculus, were studied in [4] (and also mentioned in [13]); they are defined, respectively, as
\[ x \oplus_q y = x + y + (1-q)xy, \]
\[ x \otimes_q y = \left[ x^{1-q} + y^{1-q} - 1 \right]_+^{\frac{1}{1-q}}. \]
The q-logarithm satisfies the following properties:
\[ \ln_q(xy) = \ln_q x \oplus_q \ln_q y, \qquad (1.11) \]
\[ \ln_q(x \otimes_q y) = \ln_q x + \ln_q y. \qquad (1.12) \]
A two-parameter logarithm, together with a two-parameter entropy, was then introduced in [13]. It was defined as
\[ \ln_{q,q'} x = \frac{1}{1-q'}\left\{ \exp\!\left[ (1-q') \ln_q x \right] - 1 \right\}. \qquad (1.13) \]
The above doubly deformed logarithm satisfies
\[ \ln_{q,q'}(x \otimes_q y) = \ln_{q,q'} x \oplus_{q'} \ln_{q,q'} y. \qquad (1.14) \]
Properties of the two-parameter logarithm and of the two-parameter entropy were proved in [13]. The probability distribution in the canonical ensemble of the two-parameter entropy was obtained in [2], while applications were discussed in [6].
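The deformed algebra and the properties (1.11)–(1.12) can be verified numerically. A minimal Python sketch (function names are illustrative):

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

def q_sum(a, b, q):
    # x (+)_q y = x + y + (1-q) x y
    return a + b + (1 - q) * a * b

def q_product(a, b, q):
    # x (x)_q y = [x^(1-q) + y^(1-q) - 1]_+^(1/(1-q))
    base = a**(1 - q) + b**(1 - q) - 1
    return base**(1 / (1 - q)) if base > 0 else 0.0

q, x, y = 1.3, 2.0, 3.0
# (1.11): ln_q(x y) = ln_q x (+)_q ln_q y
assert abs(ln_q(x * y, q) - q_sum(ln_q(x, q), ln_q(y, q), q)) < 1e-12
# (1.12): ln_q(x (x)_q y) = ln_q x + ln_q y
assert abs(ln_q(q_product(x, y, q), q) - (ln_q(x, q) + ln_q(y, q))) < 1e-12
```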
In section 2 of the present paper, a three-parameter logarithm $\ln_{q,q',r} x$, where $q, q', r \in \mathbb{R}$, is derived using q-analogues and an ansatz technique. In section 3, the inverse of the three-parameter logarithm is derived and some of its properties are proved. A three-parameter entropy and its properties are presented in section 4, and the conclusion is given in section 5.

Three-Parameter Logarithm
As $x = e^{\ln x}$, a q-analogue of $x$ is defined by
\[ [x]_q = e^{\ln_q x}, \]
where $\ln_q x$ is defined in (1.6). Similarly, the q'-analogue of $[x]_q$ is defined by
\[ [x]_{q,q'} = e^{\ln_{q'} [x]_q} = e^{\ln_{q,q'} x}, \]
where $\ln_{q,q'} x$ is as defined in (1.13), which can thus be written as
\[ \ln_{q,q'} x = \ln_{q'} [x]_q. \]
The three-parameter logarithm is then defined as
\[ \ln_{q,q',r} x = \ln_r [x]_{q,q'}. \qquad (2.4) \]
To obtain a property similar to that in (1.14), define $x \otimes_{q,q'} y$ as the q'-analogue of $x \otimes_q y$, that is, the product for which
\[ \ln_{q,q'}(x \otimes_{q,q'} y) = \ln_{q,q'} x + \ln_{q,q'} y. \]
Then, from (2.4) and the property (1.11) of $\ln_r$,
\[ \ln_{q,q',r}(x \otimes_{q,q'} y) = \ln_r\!\left( e^{\ln_{q,q'} x}\, e^{\ln_{q,q'} y} \right) = \ln_r e^{\ln_{q,q'} x} \oplus_r \ln_r e^{\ln_{q,q'} y} = \ln_{q,q',r} x \oplus_r \ln_{q,q',r} y, \qquad (2.10) \]
which is the desired relation analogous to (1.14). One can also derive the explicit form of $\ln_{q,q',r} x$ using an ansatz. To do this, let $x = y$ in (2.10). Then
\[ \ln_{q,q',r}(x \otimes_{q,q'} x) = \ln_{q,q',r} x \oplus_r \ln_{q,q',r} x. \]
Writing $u = \ln_{q,q'} x$ and $f(u) = \ln_{q,q',r} x$, and noting that $\ln_{q,q'}(x \otimes_{q,q'} x) = 2u$, the relation above becomes the functional equation
\[ f(2u) = 2 f(u) + (1-r) f(u)^2. \]
The ansatz
\[ f(u) = \frac{1}{1-r}\left( b^u - 1 \right) \]
solves this equation for any base $b > 0$, since $2(b^u - 1) + (b^u - 1)^2 = b^{2u} - 1$. Using the property that
\[ \left. \frac{d}{dx} \ln_{q,q',r} x \right|_{x=1} = 1, \]
which is a natural property of a logarithmic function, it is determined that $b = e^{1-r}$. Consequently,
\[ \ln_{q,q',r} x = \frac{1}{1-r}\left[ e^{(1-r) \ln_{q,q'} x} - 1 \right]. \qquad (2.17) \]
Explicitly,
\[ \ln_{q,q',r} x = \frac{1}{1-r}\left( \exp\!\left\{ \frac{1-r}{1-q'}\left[ \exp\!\left( \frac{1-q'}{1-q}\left( x^{1-q} - 1 \right) \right) - 1 \right] \right\} - 1 \right), \]
and in the limit $r \to 1$,
\[ \lim_{r \to 1} \ln_{q,q',r} x = \ln_{q,q'} x. \qquad (2.20) \]
Graphs of $\ln_{q,q',r} x$ for $q = q' = r$ are shown in Figure 1, while graphs of $\ln_{q,q',r} x$ with one fixed parameter are shown in Figure 2.
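Formula (2.17), the limit (2.20), and the normalization of the derivative at $x = 1$ can all be checked numerically. A Python sketch (our function names; `qp` stands for $q'$):

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

def ln_qq(x, q, qp):
    # two-parameter logarithm (1.13); reduces to ln_q at q' = 1
    return ln_q(x, q) if qp == 1 else (math.exp((1 - qp) * ln_q(x, q)) - 1) / (1 - qp)

def ln_qqr(x, q, qp, r):
    # three-parameter logarithm (2.17); reduces to ln_{q,q'} at r = 1
    return ln_qq(x, q, qp) if r == 1 else (math.exp((1 - r) * ln_qq(x, q, qp)) - 1) / (1 - r)

q, qp, x = 0.8, 1.2, 3.0
# (2.20): the limit r -> 1 recovers the two-parameter logarithm
assert abs(ln_qqr(x, q, qp, 1 + 1e-9) - ln_qq(x, q, qp)) < 1e-6
# the derivative at x = 1 is 1, the normalization used to fix b = e^(1-r)
h = 1e-6
deriv = (ln_qqr(1 + h, q, qp, 1.5) - ln_qqr(1 - h, q, qp, 1.5)) / (2 * h)
assert abs(deriv - 1) < 1e-5
```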

Properties
In this section the inverse of the three-parameter logarithmic function is derived. It is also verified that the derivative of this logarithm at $x = 1$ is 1 and that the value of the function at $x = 1$ is zero. Moreover, it is shown that the following equality holds:
\[ \ln_{q,q',r} \frac{1}{x} = -\ln_{2-q,2-q',2-r} x. \]
It follows from (2.4) that the three-parameter logarithmic function is an increasing function of $x$. Thus, a unique inverse function exists. To find the inverse function, let $y = \ln_{q,q',r} x$ and solve for $x$. From (2.17),
\[ \ln_{q,q'} x = \frac{1}{1-r} \ln\!\left[ 1 + (1-r) y \right], \]
from which
\[ \ln_q x = \frac{1}{1-q'} \ln\!\left\{ 1 + \frac{1-q'}{1-r} \ln\!\left[ 1 + (1-r) y \right] \right\}. \qquad (3.2) \]
Thus, the inverse function is given by
\[ e_{q,q',r}^{\,y} = \exp_{q,q',r} y = e_q^{\frac{1}{1-q'} \ln\left\{ 1 + \frac{1-q'}{1-r} \ln\left[ 1 + (1-r) y \right] \right\}}, \]
where the q-exponential $e_q^x$ is defined in (1.8).
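The inverse can be implemented by unwinding the three deformations in turn, which also makes the round trip and the $2-q$ duality easy to spot-check numerically (a Python sketch with our naming; `qp` is $q'$):

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

def ln_qq(x, q, qp):
    return ln_q(x, q) if qp == 1 else (math.exp((1 - qp) * ln_q(x, q)) - 1) / (1 - qp)

def ln_qqr(x, q, qp, r):
    return ln_qq(x, q, qp) if r == 1 else (math.exp((1 - r) * ln_qq(x, q, qp)) - 1) / (1 - r)

def exp_qqr(y, q, qp, r):
    """Inverse of ln_qqr: unwind the r-, then q'-, then q-deformation."""
    u = y if r == 1 else math.log(1 + (1 - r) * y) / (1 - r)      # recovers ln_{q,q'} x
    v = u if qp == 1 else math.log(1 + (1 - qp) * u) / (1 - qp)   # recovers ln_q x
    if q == 1:
        return math.exp(v)
    return max(1 + (1 - q) * v, 0.0)**(1 / (1 - q))               # q-exponential (1.8)

# round trip: exp_{q,q',r}(ln_{q,q',r}(x)) = x
q, qp, r, x = 0.9, 1.1, 1.4, 2.0
assert abs(exp_qqr(ln_qqr(x, q, qp, r), q, qp, r) - x) < 1e-10
# duality: ln_{q,q',r}(1/x) = -ln_{2-q,2-q',2-r}(x)
assert abs(ln_qqr(1 / 3.0, 0.8, 1.3, 0.7) + ln_qqr(3.0, 1.2, 0.7, 1.3)) < 1e-10
```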

Three-Parameter Entropy
A three-parameter generalization of the Boltzmann–Gibbs–Shannon entropy is constructed here and its properties are proved. Based on the three-parameter logarithm, the entropic function is defined as
\[ S_{q,q',r} = k \sum_{i=1}^{w} p_i \ln_{q,q',r} \frac{1}{p_i}, \]
where $w$ is the number of states.

Lesche-stability (or experimental robustness). The functional form of $\ln_{q,q',r} x$ given in the previous section is analytic in $x$, as $\ln_{q,q'} x$ is analytic in $x$. Consequently, $S_{q,q',r}$ is Lesche-stable.
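Assuming the entropic form $S_{q,q',r} = k \sum_i p_i \ln_{q,q',r}(1/p_i)$, the natural extension of the two-parameter entropy of [13], the definition can be sketched and checked against the equal-probability case, in analogy with (1.7) (Python; names are ours):

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

def ln_qq(x, q, qp):
    return ln_q(x, q) if qp == 1 else (math.exp((1 - qp) * ln_q(x, q)) - 1) / (1 - qp)

def ln_qqr(x, q, qp, r):
    return ln_qq(x, q, qp) if r == 1 else (math.exp((1 - r) * ln_qq(x, q, qp)) - 1) / (1 - r)

def S_qqr(p, q, qp, r, k=1.0):
    """Three-parameter entropy, assumed form: S = k * sum_i p_i ln_{q,q',r}(1/p_i)."""
    return k * sum(pi * ln_qqr(1 / pi, q, qp, r) for pi in p if pi > 0)

# equal probabilities p_i = 1/w collapse the sum to k ln_{q,q',r}(w)
w, q, qp, r = 4, 1.2, 0.9, 1.1
assert abs(S_qqr([1 / w] * w, q, qp, r) - ln_qqr(w, q, qp, r)) < 1e-12
```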
Expansibility. An entropic function $S$ satisfies this condition if a zero-probability state ($p_i = 0$) does not contribute to the entropy. That is,
\[ S(p_1, p_2, \ldots, p_w, 0) = S(p_1, p_2, \ldots, p_w) \]
for any distribution $\{p_i\}$. Observe that in the limit $p_i \to 0$, $\ln_{q,q',r} \frac{1}{p_i}$ is finite if one of $q, q', r$ is greater than 1. Consequently,
\[ S_{q,q',r}(p_1, p_2, \ldots, p_w, 0) = S_{q,q',r}(p_1, p_2, \ldots, p_w) \qquad (4.3) \]
provided that one of $q, q', r$ is greater than 1.
Concavity. Concavity of the entropic function $S_{q,q',r}$ is assured if
\[ \frac{\partial^2 S_{q,q',r}}{\partial p_i^2} < 0 \]
in the interval $0 \le p_i \le 1$. By manual calculation (which is somewhat tedious), checked using a derivative calculator, the second derivative (4.5) is obtained. In the limit $p_i \to 1$, the second derivative given in (4.5) is less than zero if $q + q' + r > 2$. Thus, concavity of $S_{q,q',r}$ is guaranteed if $q + q' + r > 2$. In the limit $p_i \to 0$, concavity is guaranteed if $r > 1$; if $r < 1$, concavity holds if $q > 1$.
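As a numerical spot-check (not a proof, and assuming the sum form of the entropy with $k = 1$), the second derivative of a two-state entropy at an interior point is negative for parameters satisfying $q + q' + r > 2$:

```python
import math

def ln_q(x, q):
    return math.log(x) if q == 1 else (x**(1 - q) - 1) / (1 - q)

def ln_qq(x, q, qp):
    return ln_q(x, q) if qp == 1 else (math.exp((1 - qp) * ln_q(x, q)) - 1) / (1 - qp)

def ln_qqr(x, q, qp, r):
    return ln_qq(x, q, qp) if r == 1 else (math.exp((1 - r) * ln_qq(x, q, qp)) - 1) / (1 - r)

def S2(p, q, qp, r):
    """Two-state entropy S_{q,q',r}(p, 1-p), assumed sum form with k = 1."""
    return p * ln_qqr(1 / p, q, qp, r) + (1 - p) * ln_qqr(1 / (1 - p), q, qp, r)

q = qp = r = 1.1   # q + q' + r = 3.3 > 2
h, p = 1e-4, 0.5
# central second difference approximates the second derivative at p = 0.5
d2 = (S2(p + h, q, qp, r) - 2 * S2(p, q, qp, r) + S2(p - h, q, qp, r)) / h**2
assert d2 < 0
```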

Convexity.
A twice-differentiable function of a single variable is convex if and only if its second derivative is nonnegative on its entire domain. The analysis of the convexity of $S_{q,q',r}$ is analogous to that of its concavity. In the limit $p_i \to 1$, convexity is guaranteed if $q + q' + r \le 2$. In the limit $p_i \to 0$, convexity is assured if $q, r < 1$.
Concavity of $S_{q,q',r}$ is illustrated in Figure 3 (A), while convexity is illustrated in Figure 3 (B).
Composability. An entropic function $S$ is said to be composable if, for independent events $A$ and $B$,
\[ S(A \cup B) = \Phi\big( S(A), S(B) \big), \]
where $\Phi$ is some single-valued function [13]. The Boltzmann–Gibbs–Shannon entropy satisfies
\[ S(A \cup B) = S(A) + S(B); \]
hence it is composable and additive. The one-parameter entropy $S_q$, for $q \ne 1$, is also composable, as it satisfies
\[ S_q(A \cup B) = S_q(A) + S_q(B) + \frac{1-q}{k}\, S_q(A) S_q(B). \]
The two-parameter entropy $S_{q,q'}$ [13] satisfies an analogous relation in the microcanonical ensemble (i.e., for equal probabilities). However, this does not hold true for arbitrary distributions $\{p_i\}$, which means $S_{q,q'}$ is not composable in general. For the three-parameter entropy $S_{q,q',r}$, a property similar to that of (4.7) is obtained.

Conclusion
It is shown that the two-parameter logarithm of Schwämmle and Tsallis [13] can be generalized to a three-parameter logarithm using q-analogues. Consequently, a three-parameter entropic function is defined and its properties are proved. It will be interesting to study the applicability of the three-parameter entropy to adiabatic ensembles [6] and other ensembles [12], and how these applications relate to the generalized Lambert W function.