ON STOCHASTIC COMPARISONS OF ENERGY FUNCTIONS WITH APPLICATIONS

© Hindawi Publishing Corp.

(Received 1 March 2001)

Abstract. We develop simple methods for the stochastic comparison of informational energy functions. Modified informational energy functions and uncertainty-of-parameter functions are introduced for models with realistic parameter spaces. We present inequalities, comparisons, and applications, including procedures for testing the equality of informational energy functions. Some illustrative examples are also presented.

procedures based on informational energy functions. Some applications and examples are given in Section 5. The paper concludes with a discussion in Section 6.
2. Some definitions, utility notions, and comparisons. In this section, we present some definitions and useful notions. Let Ᏺ be the set of absolutely continuous distribution functions satisfying condition (2.1). Note that if the mean of a random variable in Ᏺ is finite, it is positive. The informational energy associated with P_θ is given by

e(θ) = ∫ p_θ(x)² dλ(x).  (2.2)
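As a numerical illustration (an addition, not part of the original text), the informational energy of a normal density can be checked against the closed form e = 1/(2σ√π); the function names below are ours:

```python
import math

def informational_energy(pdf, lo, hi, n=200_000):
    """Approximate e = integral of pdf(x)^2 dx by the midpoint rule on [lo, hi]."""
    h = (hi - lo) / n
    return sum(pdf(lo + (i + 0.5) * h) ** 2 for i in range(n)) * h

def normal_pdf(mu, sigma):
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return lambda x: c * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

sigma = 1.5
e_num = informational_energy(normal_pdf(0.0, sigma), -12.0, 12.0)
e_exact = 1.0 / (2.0 * sigma * math.sqrt(math.pi))  # closed form for the normal
```

The agreement of `e_num` with the closed form illustrates that e(θ) depends on the normal family only through the scale σ, a fact used repeatedly in Section 5.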
Definition 2.1. Let f and g be two probability density functions. The distance between f and g is

d(f, g) = ∫ (f(x) − g(x))² dλ(x).  (2.3)

The Hellinger-type integral of order 1/2 is given by

H(f, g) = ∫ √(f(x) g(x)) dλ(x).  (2.4)

Note that

d(f, g) = e(f) + e(g) − 2 ∫ f(x) g(x) dλ(x).  (2.5)

Definition 2.2. Let u and v be two nonnegative bounded real functions on R. We say that u is exponentially dominated by v if, for each ε ∈ (0, 1), there exists A(ε) < ∞ such that (2.6) holds. If u and v are exponentially dominated by each other, they are said to be exponentially equivalent.
The usefulness of the above definition lies in the comparison of small values of the bounded nonnegative functions u and v.
Let f and g be two nonnegative functions, possibly probability density functions, that are integrable with respect to a σ-finite measure λ, and define

e(f, g) = ∫ max(f(x)², g(x)²) dλ(x).
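As a quick numerical check (ours, not from the original), e(f, g) dominates both e(f) and e(g) and is bounded by their sum, since max(f², g²) ≤ f² + g² pointwise; the normal densities below are illustrative:

```python
import math

def npdf(sigma):
    c = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return lambda x: c * math.exp(-0.5 * (x / sigma) ** 2)

def integrate(fun, lo=-15.0, hi=15.0, n=100_000):
    """Midpoint-rule quadrature over [lo, hi]."""
    step = (hi - lo) / n
    return sum(fun(lo + (i + 0.5) * step) for i in range(n)) * step

f, g = npdf(1.0), npdf(2.0)
e_f = integrate(lambda x: f(x) ** 2)
e_g = integrate(lambda x: g(x) ** 2)
e_fg = integrate(lambda x: max(f(x) ** 2, g(x) ** 2))  # e(f, g)
```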
Theorem 2.3. Let f and g be probability density functions (pdf); then the inequality in (2.8) holds, where (2.9). Then we have (2.10), and the result follows.
The next result compares the informational energies e(f) and e(g).

(2) Note that {f_n^*}_{n≥1} and {g_n^*}_{n≥1} are bounded sequences, so there exists a convergent subsequence {n_j} such that e(f) = lim_{j→∞} e(f_{n_j}) and e(g) = lim_{j→∞} e(g_{n_j}).

3. Informational energy and likelihood.
In this section, the connection between the likelihood function and the informational energy function is established. Consider the function given by

Also, let D_θ be the set of all compact sets A ⊂ Θ containing θ in their interior. Furthermore, we assume that for every θ ∈ Θ there exists A ∈ D_θ such that, for at least one n, the set energy rate e_n(A) is finite. Note that e_n(A) ≤ e(θ) < ∞ for every θ ∈ A.
Clearly, e(A) is the informational energy about the unknown parameter in the set A.
It is clear that these results can be formulated to give the set entropy function. Consider the log-likelihood function; then

where H_n(A) is the set entropy function defined for an open or compact set A ⊂ Θ, and

where H(A) is the uncertainty as to whether the unknown parameters lie in the set A.

4. Test procedures based on informational energy.
In this section, statistical inference via the informational energy function is developed. Estimates and test procedures are presented. Let X_{i1}, X_{i2}, ..., X_{in_i}, i = 1, 2, be independent random samples with distribution functions F_i, i = 1, 2, respectively. An estimate of the informational energy function e(F_j) = ∫ f_j(x)² dx proposed by Bhattacharyya and Roussas [3] is given by

where h_j is a bandwidth and K is a known symmetric and bounded probability density function such that lim_{y→∞} yK(y) = 0. Ahmad [1] proposed the estimate

Bhattacharyya and Roussas' estimate [3] is a special case of Ahmad's estimate [1], since

where K^{(2)}(y) is the convolution of K(y) with itself. See Ahmad and Kochar [2] for details.
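The two families of estimates can be sketched as follows. This is a simplified illustration assuming a Gaussian kernel; `e_hat_plugin` and `e_hat_loo` are our names for a plug-in form (using the convolution K^{(2)}) and a leave-one-out form, and the exact definitions and regularity conditions are those of [1, 3], not this sketch:

```python
import math
import random

def K(u):
    """Standard normal kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2.0 * math.pi)

def K2(u):
    """K convolved with itself, i.e., the N(0, 2) density."""
    return math.exp(-0.25 * u * u) / (2.0 * math.sqrt(math.pi))

def e_hat_plugin(x, h):
    """Plug-in estimate of int f^2: integral of fhat^2 via the convolution identity."""
    n = len(x)
    return sum(K2((a - b) / h) for a in x for b in x) / (n * n * h)

def e_hat_loo(x, h):
    """Leave-one-out (U-statistic) form of the estimate."""
    n = len(x)
    return sum(K((x[i] - x[j]) / h)
               for i in range(n) for j in range(n) if i != j) / (n * (n - 1) * h)

random.seed(0)
x = [random.gauss(0.0, 1.0) for _ in range(500)]  # true value 1/(2*sqrt(pi)) ≈ 0.2821
e1 = e_hat_plugin(x, h=0.3)
e2 = e_hat_loo(x, h=0.3)
```

Both forms are consistent for ∫ f² under standard bandwidth conditions; the plug-in form exhibits the K^{(2)} convolution relation mentioned above.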
A test statistic for testing H_0: e(F_1) = e(F_2) is given by

In the case h_1 = h_2 = h, f_1 = f is a fixed probability density function, and g is a function such that f_2 = f + γg is a probability density function for sufficiently small |γ|, the α-level test rejects H_0 if T(F) = T(f) > t_f, where

and t_f is the α-level critical point of the distribution of T(F) under the null hypothesis.

Theorem 4.1. Let θ̂ be the maximum likelihood estimator of θ. Then
as n → ∞, where I_F(θ) is the Fisher information matrix.

Proof. By the asymptotic normality of √n(θ̂ − θ) and a Taylor expansion of e(θ̂) around θ, we obtain the desired result.

The results above can be used for statistical inference. Now consider the hypothesis H_0: e(θ) = e(θ_0) against H_1: e(θ) > e(θ_0), where e(θ_0) is a specified value of the population informational energy. An appropriate test statistic for testing this hypothesis is given by

The statistic T has in the limit the standard normal distribution, so that T² has a chi-square distribution with one degree of freedom.
A size-α test will reject H_0 if T* > χ²_{1,2α}. This follows from the fact that P(T² > C) = P(χ²(1) > C) if C > 0, and 1 if C ≤ 0, where χ²(1) denotes a random variable having a chi-square distribution with one degree of freedom.
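A sketch of this one-sample procedure for the normal model follows. It assumes the standard facts Var(σ̂) ≈ σ²/(2n) and e(θ) = 1/(2σ√π); the name `energy_test` and the delta-method standard error are our illustrative choices, not the paper's notation:

```python
import math
import random
import statistics

def energy_normal(sigma):
    """Informational energy e(theta) = 1/(2*sigma*sqrt(pi)) for N(mu, sigma^2)."""
    return 1.0 / (2.0 * sigma * math.sqrt(math.pi))

def energy_test(sample, e0, z_alpha=1.645):
    """One-sided test of H0: e(theta) = e0 against H1: e(theta) > e0.

    Delta method: with Var(sigma_hat) ~ sigma^2/(2n) and
    e'(sigma) = -1/(2*sigma^2*sqrt(pi)), Var(e(sigma_hat)) ~ 1/(8*pi*sigma^2*n).
    """
    n = len(sample)
    mu_hat = statistics.fmean(sample)
    sigma_hat = math.sqrt(sum((x - mu_hat) ** 2 for x in sample) / n)  # MLE
    se = math.sqrt(1.0 / (8.0 * math.pi * sigma_hat ** 2 * n))
    T = (energy_normal(sigma_hat) - e0) / se
    return T, T > z_alpha

random.seed(1)
sample = [random.gauss(0.0, 1.0) for _ in range(2000)]  # true energy ≈ 0.2821

T_null, reject_null = energy_test(sample, e0=energy_normal(1.0))  # H0 true
T_alt, reject_alt = energy_test(sample, e0=energy_normal(2.0))    # e(theta) > e0
```

Rejecting when T > z_α is equivalent to the T* > χ²_{1,2α} rule above, since the square of a one-sided normal critical point is the corresponding chi-square point.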
A test of the equality of several informational energies, that is, of H_0: e(θ_1) = ··· = e(θ_k), rejects H_0 when

where

The statistic T_1 has, in the limit as n = Σ_{i=1}^k n_i goes to infinity, the chi-square distribution with k − 1 degrees of freedom. Consequently, the null hypothesis is rejected at level α if T_1 > χ²_{k−1;α}.
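The statistic T_1 is not displayed above; a standard construction with this limiting chi-square law is an inverse-variance-weighted sum of squared deviations from the pooled estimate. The sketch below shows that construction and is our assumption, not necessarily the paper's exact form:

```python
def homogeneity_statistic(estimates, variances):
    """T1 = sum_i w_i * (e_i - e_bar)^2 with w_i = 1/Var(e_i) and
    e_bar the inverse-variance-weighted mean; asymptotically
    chi-square with k-1 degrees of freedom under H0."""
    w = [1.0 / v for v in variances]
    e_bar = sum(wi * ei for wi, ei in zip(w, estimates)) / sum(w)
    return sum(wi * (ei - e_bar) ** 2 for wi, ei in zip(w, estimates))

# Three nearly equal estimated energies: small T1.
T1_close = homogeneity_statistic([0.281, 0.285, 0.279], [1e-4, 1e-4, 1e-4])
# One discrepant energy: large T1 (the chi-square(2) 5% point is 5.991).
T1_far = homogeneity_statistic([0.281, 0.285, 0.180], [1e-4, 1e-4, 1e-4])
```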

5. Applications.
In this section, we present some applications and examples of the results of the earlier sections. Let e(f) and e(g) be the informational energies associated with the distribution functions F and G, respectively.
Confidence intervals for e(θ) can be readily obtained: an asymptotic (1 − α)-level interval is given by

e(θ̂) ± c_{α/2} σ(θ̂)/√n,

and a nonconservative sample size for a prescribed error d and a risk α is

n = [c²_{α/2} σ²(θ̂)/d²] + 1,

where σ²(θ̂) is given in Section 4, c_{α/2} is the critical point of the standard normal distribution at significance level α/2, and [·] denotes the greatest integer function.
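These two formulas can be evaluated directly; the function names and the numerical inputs below are illustrative assumptions:

```python
import math

def energy_ci(e_hat, sigma_hat, n, c=1.96):
    """Asymptotic (1 - alpha) interval: e_hat +/- c_{alpha/2} * sigma_hat / sqrt(n)."""
    half = c * sigma_hat / math.sqrt(n)
    return e_hat - half, e_hat + half

def sample_size(sigma_hat, error, c=1.96):
    """Nonconservative n: greatest integer in (c * sigma_hat / error)^2, plus 1."""
    return math.floor((c * sigma_hat / error) ** 2) + 1

lo, hi = energy_ci(e_hat=0.282, sigma_hat=0.20, n=400, c=1.96)
n_req = sample_size(sigma_hat=0.20, error=0.01, c=1.96)
```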
(1) Normal distribution. The informational energy for the normal distribution with standard deviation σ is given by

e(f) = 1/(2σ√π).  (5.3)

Clearly, e(f) is a bijective function of σ. If σ_F and σ_G are the standard deviations of the distribution functions F and G, respectively, then e(f) ≥ e(g) if and only if σ_F ≤ σ_G.

(2) The corresponding weighted pdf g is given by

On applying Theorem 2.4, for β ≥ c ≥ 1, we obtain e(cf, g) ≤ e(cg, f).
(3) The following result establishes the relation between informational energy and the dispersive ordering of distributions. Let X and Y be two random variables with distribution functions F and G, respectively, and corresponding quantile functions F^{-1} and G^{-1}. The distribution function F is said to be less dispersive than G, written F <_d G (Parzen [8]), if

F^{-1}(u) − F^{-1}(v) ≤ G^{-1}(u) − G^{-1}(v)  for 0 < v < u < 1.

When F^{-1} and G^{-1} are differentiable, this definition is equivalent to

f(F^{-1}(u)) ≥ g(G^{-1}(u))  for 0 < u < 1.  (5.7)

Consequently, since e(f) = ∫₀¹ f(F^{-1}(u)) du, F <_d G implies that e(f) ≥ e(g), whenever the densities exist.
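The implication can be checked numerically for two normal distributions, which are dispersively ordered by their standard deviations; `statistics.NormalDist` supplies the quantile and density functions, and the grid below is a rough quadrature:

```python
from statistics import NormalDist

F = NormalDist(mu=0.0, sigma=1.0)  # less dispersive
G = NormalDist(mu=0.0, sigma=2.0)  # more dispersive

grid = [i / 100 for i in range(1, 100)]
qF = [F.inv_cdf(u) for u in grid]
qG = [G.inv_cdf(u) for u in grid]

# F <_d G: quantile differences of F never exceed those of G.
dispersive = all(qF[i] - qF[j] <= qG[i] - qG[j] + 1e-12
                 for i in range(len(grid)) for j in range(i))

# e(f) = int_0^1 f(F^{-1}(u)) du, approximated on the same grid.
step = 0.01
e_f = sum(F.pdf(q) for q in qF) * step
e_g = sum(G.pdf(q) for q in qG) * step
```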
6. Discussion. In this paper, inequalities and the use of informational energy for statistical comparisons and inference, in terms of the uncertainty of parameters and parameter sets, are developed. For the purpose of comparison, an intuitive grasp of the notions involving informational energy functions follows by noting that the scale parameters of the distributions are ordered. Nonparametric and parametric estimates are presented; see the cited references for related work. Procedures for testing the homogeneity of informational energies are obtained and implemented.
In the discrete setting, where X and Y are random variables with joint probability distribution p_{ij}, i = 1, 2, ..., r, j = 1, 2, ..., c, the informational energy and mutual information of order γ concerning X and Y are given by

e_γ(X, Y) =

where p_{i+} and p_{+j} are the marginal distributions of X and Y, respectively. Comparisons of these informational functions and statistical inference concerning parameters and parameter sets can be obtained for both discrete and continuous distributions.