Journal of Probability and Statistics, Hindawi Publishing Corporation, Volume 2015, Article ID 159710, doi:10.1155/2015/159710 (ISSN 1687-9538, 1687-952X).

Research Article: Residual and Past Entropy for Concomitants of Ordered Random Variables of Morgenstern Family

M. M. Mohie EL-Din (1), M. M. Amein (1,2), Nahed S. A. Ali (3), and M. S. Mohamed (3). Academic Editor: Chunsheng Ma.

(1) Department of Mathematics, Faculty of Science, Al-Azhar University, Cairo 11884, Egypt. (2) Department of Mathematics and Statistics, Faculty of Science, Taif University, Hawia 888, Saudi Arabia. (3) Department of Mathematics, Faculty of Education, Ain Shams University, Cairo 11341, Egypt.

Received 25 June 2015; Accepted 7 September 2015; Published 27 September 2015.

Copyright © 2015 M. M. Mohie EL-Din et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

For a system observed at time $t$, the residual and past entropies measure the uncertainty about its remaining and past lifetime, respectively. In this paper, we present the residual and past entropy of the Morgenstern family based on concomitants of the different types of generalized order statistics (gos) and give the linear transformation of this model. Characterization results for these dynamic entropies for concomitants of ordered random variables are also considered.

1. Introduction

The Morgenstern family provides a flexible family of bivariate distributions that can be used in such contexts. It is specified by the distribution function (df) and the probability density function (pdf), respectively, as follows:

(1) $F_{X,Y}(x,y) = F_X(x) F_Y(y) \left[ 1 + \alpha \left(1 - F_X(x)\right) \left(1 - F_Y(y)\right) \right]$,

(2) $f_{X,Y}(x,y) = f_X(x) f_Y(y) \left[ 1 + \alpha \left(2 F_X(x) - 1\right) \left(2 F_Y(y) - 1\right) \right]$,

where $-1 \le \alpha \le 1$, and $f_X(x)$, $f_Y(y)$ and $F_X(x)$, $F_Y(y)$ are the marginal pdfs and dfs of $X$ and $Y$, respectively. The parameter $\alpha$ is known as the dependence parameter of the random variables $X$ and $Y$. If $\alpha$ is zero, then $X$ and $Y$ are independent.
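As a quick numerical illustration of (1)-(2) (not part of the original text; all function names are ours), marginalizing the joint pdf over $y$ must recover $f_X(x)$ for every admissible $\alpha$, since the dependence term integrates to zero. A minimal sketch assuming uniform$(0,1)$ marginals and a midpoint quadrature rule:

```python
def fgm_pdf(x, y, alpha,
            fx=lambda t: 1.0, Fx=lambda t: t,
            fy=lambda t: 1.0, Fy=lambda t: t):
    # Morgenstern (FGM) joint pdf, eq. (2); defaults give uniform(0,1) marginals
    return fx(x) * fy(y) * (1.0 + alpha * (2.0*Fx(x) - 1.0) * (2.0*Fy(y) - 1.0))

def marginal_x(x, alpha, steps=2000):
    # midpoint-rule integral of the joint pdf over y in (0,1); should return f_X(x) = 1
    h = 1.0 / steps
    return sum(fgm_pdf(x, (j + 0.5)*h, alpha) for j in range(steps)) * h
```

For any $x$ and any $\alpha \in [-1,1]$, `marginal_x` returns 1 up to quadrature error, confirming that $\alpha$ controls dependence without disturbing the marginals.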

The general theory of concomitants of order statistics was originally studied by David et al. Let $(X_i, Y_i)$, $i = 1, 2, \ldots, n$, be $n$ pairs of independent random variables from some bivariate population with df $F(x,y)$. Let $X_{(r:n)}$ be the $r$th order statistic; then the $Y$-value associated with $X_{(r:n)}$ is called the concomitant of the $r$th order statistic and is denoted by $Y_{[r:n]}$. The pdf of $Y_{[r:n]}$ is given by

(3) $g_{[r:n]}(y) = g_{Y_{[r:n]}}(y) = \int_{-\infty}^{\infty} f_{Y|X}(y \mid x)\, f_{(r:n)}(x)\, dx$,

where $f_{(r:n)}(x)$ is the pdf of $X_{(r:n)}$.
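For ordinary order statistics with uniform$(0,1)$ marginals, (3) can be evaluated numerically and compared with the closed form $g_{[r:n]}(y) = 1 + \alpha\,\frac{2r-n-1}{n+1}\,(2y-1)$, which follows from $E[F_X(X_{(r:n)})] = r/(n+1)$. A sketch (the function names are ours, not from the paper):

```python
import math

def f_os(x, r, n):
    # pdf of the r-th order statistic of n iid uniform(0,1) random variables
    c = math.factorial(n) / (math.factorial(r - 1) * math.factorial(n - r))
    return c * x**(r - 1) * (1.0 - x)**(n - r)

def concomitant_pdf(y, r, n, alpha, steps=4000):
    # eq. (3): integrate f_{Y|X}(y|x) * f_{(r:n)}(x) over x, FGM with uniform marginals
    h = 1.0 / steps
    total = 0.0
    for j in range(steps):
        x = (j + 0.5) * h
        f_cond = 1.0 + alpha * (2.0*x - 1.0) * (2.0*y - 1.0)  # FGM conditional pdf
        total += f_cond * f_os(x, r, n) * h
    return total
```

Note that for $r = (n+1)/2$ (the median for odd $n$) the factor $2r-n-1$ vanishes, so the concomitant of the sample median is distributed like $Y$ itself.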

The concept of gos was introduced by Kamps; we refer to it as case-I of gos. Kamps and Cramer introduced a second model of gos, which we refer to as case-II of gos. The concept of lower gos was given by Pawlas and Szynal; later, Burkschat et al. introduced it as dual generalized order statistics (dgos) to enable a common approach to descendingly ordered random variables such as reversed order statistics and lower records models.

For the Morgenstern family with pdf given by (2), the pdf of the concomitant of case-I of gos $Y_{[r,n,m,k]}$, $1 \le r \le n$, is given by Beg and Ahsanullah as follows:

(4) $g_{[r,n,m,k]}(y) = f_Y(y)\left[1 + \alpha\, C(r,n,m,k)\left(2F_Y(y) - 1\right)\right]$,

and the pdf of the concomitant of case-I of dgos $Y_{d[r,n,m,k]}$, $1 \le r \le n$, is given by Nayabuddin as follows:

(5) $g_{d[r,n,m,k]}(y) = f_Y(y)\left[1 - \alpha\, C(r,n,m,k)\left(2F_Y(y) - 1\right)\right]$,

where $C(r,n,m,k) = 1 - 2\prod_{j=1}^{r}\gamma_j \big/ \prod_{i=1}^{r}(\gamma_i + 1)$, $\gamma_r = k + (n-r)(m+1)$, $n \in \mathbb{N}$, $k \ge 1$, and $m_1 = \cdots = m_{n-1} = m \in \mathbb{R}$. The pdf of the concomitant of case-II of gos $Y_{[r,n,\widetilde{m},k]}$, $1 \le r \le n$, is given by Mohie El-Din et al. as follows:

(6) $g_{[r,n,\widetilde{m},k]}(y) = f_Y(y)\left[1 + \alpha\, D(r,n,\widetilde{m},k)\left(2F_Y(y) - 1\right)\right]$,

and the pdf of the concomitant of case-II of dgos $Y_{d[r,n,\widetilde{m},k]}$, $1 \le r \le n$, is given by

(7) $g_{d[r,n,\widetilde{m},k]}(y) = f_Y(y)\left[1 - \alpha\, D(r,n,\widetilde{m},k)\left(2F_Y(y) - 1\right)\right]$,

where $D(r,n,\widetilde{m},k) = 1 - 2 c_{r-1} \sum_{i=1}^{r} a_i(r)/(\gamma_i + 1)$, $a_i(r) = \prod_{j=1,\, j \ne i}^{r} 1/(\gamma_j - \gamma_i)$, $\gamma_j \ne \gamma_i$, $1 \le i \le r \le n$, and $c_{r-1} = \prod_{j=1}^{r} \gamma_j$, $1 \le r \le n$.

Let $X$ be an absolutely continuous nonnegative random variable having df $F(t) = P(X \le t)$ and survival function $\bar{F}(t) = P(X > t)$. Suppose $X$ denotes the lifetime of a component/system or of a living organism, and $f(t) = F'(t)$ denotes the lifetime density function. In information theory, Shannon entropy plays a vital role in measuring the index of dispersion, volatility, or uncertainty associated with a random variable $X$. Shannon entropy for a nonnegative continuous random variable $X$ is given by

(8) $H(X) = -E\left[\ln f_X(X)\right] = -\int_0^{\infty} f_X(x) \ln f_X(x)\, dx$.

Ebrahimi defined the uncertainty of residual lifetime distributions $H(X;t)$ by truncating the distribution below some point $t$ of a component as follows:

(9) $H(X;t) = -\int_t^{\infty} \frac{f_X(x)}{\bar{F}_X(t)} \ln \frac{f_X(x)}{\bar{F}_X(t)}\, dx = \ln \bar{F}_X(t) - \frac{1}{\bar{F}_X(t)} \int_t^{\infty} f_X(x) \ln f_X(x)\, dx = 1 - \frac{1}{\bar{F}_X(t)} \int_t^{\infty} f_X(x) \ln \lambda_X(x)\, dx$,

where $\lambda_X(x) = f_X(x)/\bar{F}_X(x)$ is the failure rate or hazard rate. Clearly, for $t = 0$, $H(X;0)$ represents the Shannon uncertainty contained in $X$.
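A convenient check of (9): for the exponential distribution, memorylessness forces the residual entropy to be constant in $t$, equal to $1 - \ln\lambda$. The sketch below (our names; truncating the upper integration limit is an assumption of the sketch) verifies this numerically:

```python
import math

def residual_entropy(f, F_bar, t, upper, steps=20000):
    # eq. (9): H(X;t) = ln F_bar(t) - (1/F_bar(t)) * integral_t^inf f ln f dx
    # (midpoint rule, with the upper limit truncated at `upper`)
    h = (upper - t) / steps
    s = 0.0
    for j in range(steps):
        v = f(t + (j + 0.5)*h)
        s += v * math.log(v) * h
    return math.log(F_bar(t)) - s / F_bar(t)

lam = 2.0
f = lambda x: lam * math.exp(-lam*x)
F_bar = lambda x: math.exp(-lam*x)
# memorylessness: H(X;t) = 1 - ln(lam) for every t >= 0
```

The same routine applies to any absolutely continuous lifetime law once $f$ and $\bar{F}$ are supplied.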

Di Crescenzo and Longobardi introduced past entropy over $(0,t)$, since it is reasonable to presume that in many realistic situations uncertainty is not necessarily related to the future but can also refer to the past. They also showed the necessity of past entropy and its relation with the residual entropy. If $X$ denotes the lifetime of an item or of a living organism, then the past entropy (or uncertainty of lifetime distribution) of an item is defined as

(10) $\bar{H}(X;t) = -\int_0^{t} \frac{f_X(x)}{F_X(t)} \ln \frac{f_X(x)}{F_X(t)}\, dx = \ln F_X(t) - \frac{1}{F_X(t)} \int_0^{t} f_X(x) \ln f_X(x)\, dx = 1 - \frac{1}{F_X(t)} \int_0^{t} f_X(x) \ln \tau_X(x)\, dx$,

where $\tau_X(x) = f_X(x)/F_X(x)$ is the reversed hazard rate of $X$. In this paper, we obtain and study the residual and past entropy of the Morgenstern family for concomitants of ordered random variables. We also consider characterization results based on the entropy function for concomitants of ordered random variables, for both the residual lifetime distribution and the past lifetime distribution. The organization of the paper is as follows. In Section 2, we obtain the distribution function for concomitants of ordered random variables of the Morgenstern family. In Section 3, we obtain the residual and past entropy of our model and study the linear transformation and the upper bound for concomitants of ordered random variables. Some characterization results are presented in Section 4. Finally, some conclusions and comments are given in Section 5.
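A minimal example of (10): for the uniform$(0,1)$ distribution, $f \ln f$ vanishes identically, so the past entropy reduces to $\bar{H}(X;t) = \ln F_X(t) = \ln t$. A sketch (our names):

```python
import math

def past_entropy(f, F, t, steps=20000):
    # eq. (10): H_bar(X;t) = ln F(t) - (1/F(t)) * integral_0^t f ln f dx (midpoint rule)
    h = t / steps
    s = 0.0
    for j in range(steps):
        v = f((j + 0.5)*h)
        s += v * math.log(v) * h
    return math.log(F(t)) - s / F(t)
```

For the uniform law, `past_entropy(lambda x: 1.0, lambda x: x, t)` returns $\ln t$: the past entropy is negative for $t < 1$ and rises to the full Shannon entropy $0$ as $t \to 1$.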

2. Distribution Function for Concomitants of Ordered Random Variables

Let $(X_i, Y_i)$, $i = 1, 2, \ldots, n$, be $n$ pairs of independent continuous nonnegative random variables from some bivariate population with df $F(x,y)$. From (1), the conditional distribution function of $Y$ given $X = x$ is

(11) $F_{Y|X}(y \mid x) = f_X^{-1}(x)\, \frac{\partial}{\partial x} F_{X,Y}(x,y) = F_Y(y)\left[1 + \alpha\left(1 - 2F_X(x)\right)\left(1 - F_Y(y)\right)\right]$.

For the Morgenstern family with conditional distribution function given by (11), the df of the concomitant of case-I of gos $Y_{[r,n,m,k]}$, $1 \le r \le n$, is given by

(12) $G_{[r,n,m,k]}(y) = \int_0^{\infty} F_{Y|X}(y \mid x)\, f_{(r,n,m,k)}(x)\, dx = F_Y(y)\left[1 - \alpha\, C(r,n,m,k)\left(1 - F_Y(y)\right)\right]$,

where $f_{(r,n,m,k)}(x)$ is the pdf of case-I of gos $X_{[r,n,m,k]}$ (see Kamps); the df of the concomitant of case-I of dgos $Y_{d[r,n,m,k]}$, $1 \le r \le n$, is given by

(13) $G_{d[r,n,m,k]}(y) = \int_0^{\infty} F_{Y|X}(y \mid x)\, f_{d(r,n,m,k)}(x)\, dx = F_Y(y)\left[1 + \alpha\, C(r,n,m,k)\left(1 - F_Y(y)\right)\right]$,

where $f_{d(r,n,m,k)}(x)$ is the pdf of case-I of dgos $X_{d[r,n,m,k]}$ (see Burkschat et al.); the df of the concomitant of case-II of gos $Y_{[r,n,\widetilde{m},k]}$, $1 \le r \le n$, is given by

(14) $G_{[r,n,\widetilde{m},k]}(y) = \int_0^{\infty} F_{Y|X}(y \mid x)\, f_{(r,n,\widetilde{m},k)}(x)\, dx = F_Y(y)\left[1 - \alpha\, D(r,n,\widetilde{m},k)\left(1 - F_Y(y)\right)\right]$,

where $f_{(r,n,\widetilde{m},k)}(x)$ is the pdf of case-II of gos $X_{[r,n,\widetilde{m},k]}$ (see Kamps and Cramer); and the df of the concomitant of case-II of dgos $Y_{d[r,n,\widetilde{m},k]}$, $1 \le r \le n$, is given by

(15) $G_{d[r,n,\widetilde{m},k]}(y) = \int_0^{\infty} F_{Y|X}(y \mid x)\, f_{d(r,n,\widetilde{m},k)}(x)\, dx = F_Y(y)\left[1 + \alpha\, D(r,n,\widetilde{m},k)\left(1 - F_Y(y)\right)\right]$,

where $f_{d(r,n,\widetilde{m},k)}(x)$ is the pdf of case-II of dgos $X_{d[r,n,\widetilde{m},k]}$ (see Kamps and Cramer and Burkschat et al.).

Equations (4) to (7) and (12) to (15) can be combined as follows:

(16) $g_{Y_r}(y) = f_Y(y)\left[1 + \alpha M_r\left(2F_Y(y) - 1\right)\right]$,

(17) $G_{Y_r}(y) = F_Y(y)\left[1 - \alpha M_r\left(1 - F_Y(y)\right)\right]$,

where

(18) $Y_r = \begin{cases} Y_{[r,n,m,k]}, & \text{for case-I of gos} \\ Y_{d[r,n,m,k]}, & \text{for case-I of dgos} \\ Y_{[r,n,\widetilde{m},k]}, & \text{for case-II of gos} \\ Y_{d[r,n,\widetilde{m},k]}, & \text{for case-II of dgos}, \end{cases}$

(19) $M_r = \begin{cases} C(r,n,m,k), & \text{for case-I of gos} \\ -C(r,n,m,k), & \text{for case-I of dgos} \\ D(r,n,\widetilde{m},k), & \text{for case-II of gos} \\ -D(r,n,\widetilde{m},k), & \text{for case-II of dgos}. \end{cases}$

In the following section we use these equations to obtain and study the residual and past entropy of concomitants for the Morgenstern family based on the types of gos.
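The pdf and df of $Y_r$ are mutually consistent: differentiating the df $G_{Y_r}(y) = F_Y(y)[1 - \alpha M_r(1 - F_Y(y))]$ implied by (12)-(15) reproduces the pdf in (16). This can be checked numerically; the sketch below assumes a uniform$(0,1)$ marginal $F_Y(y) = y$ and an arbitrary admissible value of $\alpha M_r$ (names are ours):

```python
def g_conc(y, aM):
    # eq. (16) with uniform(0,1) marginal: g(y) = 1 + (alpha*M_r)*(2y - 1)
    return 1.0 + aM * (2.0*y - 1.0)

def G_conc(y, aM):
    # df of Y_r with uniform(0,1) marginal: G(y) = y * (1 - (alpha*M_r)*(1 - y))
    return y * (1.0 - aM * (1.0 - y))

def G_by_integration(y, aM, steps=10000):
    # midpoint-rule integral of g over (0, y); should reproduce G_conc(y, aM)
    h = y / steps
    return sum(g_conc((j + 0.5)*h, aM) for j in range(steps)) * h
```

The agreement holds for positive and negative $\alpha M_r$ alike, covering both the gos and dgos cases.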

3. Residual and Past Entropy

Noting that $\bar{G}_{Y_r}(t) = 1 - G_{Y_r}(t)$ and $\partial G_{Y_r}(t)/\partial t = g_{Y_r}(t)$, the residual entropy for concomitants of ordered random variables of the Morgenstern family, obtained from (16), (17), and (9), is given by the following theorem.

Theorem 1.

Let $Y_r$ be any absolutely continuous random variable which is the concomitant of the $r$th ordered random variable of the Morgenstern family defined in (18). Then, from (9), the residual entropy of $Y_r$ is

(20) $H(Y_r;t) = \ln \bar{G}_{Y_r}(t) - \frac{1}{\bar{G}_{Y_r}(t)} \left\{ (1 - \alpha M_r)\, \bar{F}_Y(t) \left[\ln \bar{F}_Y(t) - H(Y;t)\right] + 2\alpha M_r\, \phi_f(y;t) + K_1(r,t,\alpha,n,m,k) \right\}$,

where

(21) $K_1(r,t,\alpha,n,m,k) = \frac{1}{2\alpha M_r} \left\{ -\frac{1}{4}\left[\left(1 + \alpha M_r\right)^2 - \left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right)^2\right] + \frac{1}{2}\left[\left(1 + \alpha M_r\right)^2 \ln\left(1 + \alpha M_r\right) - \left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right)^2 \ln\left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right)\right] \right\}$,

$\phi_f(y;t) = \int_t^{\infty} F_Y(y)\, f_Y(y) \ln f_Y(y)\, dy$, $1 \le r \le n$, $\alpha \ne 0$, $-1 \le \alpha \le 1$.
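The constant $K_1$ arises from the substitution $u = 1 + \alpha M_r\,(2F_Y(y) - 1)$, $du = 2\alpha M_r f_Y(y)\,dy$, applied to $\int_t^{\infty} g_{Y_r}(y)\ln\left[1 + \alpha M_r(2F_Y(y)-1)\right]dy$ and $\int u\ln u\,du = \frac{u^2}{2}\ln u - \frac{u^2}{4}$. With a uniform marginal ($\ln f_Y \equiv 0$), $K_1$ is exactly $\int_t^1 g_{Y_r}(y)\ln g_{Y_r}(y)\,dy$, so the closed form can be checked against direct quadrature. A sketch (our names; uniform marginal assumed, with $aM$ standing for $\alpha M_r$):

```python
import math

def K1(t, aM):
    # K_1 of eq. (21) specialized to a uniform(0,1) marginal, u = 1 + aM*(2y - 1)
    u_t, u_1 = 1.0 + aM*(2.0*t - 1.0), 1.0 + aM
    return (1.0/(2.0*aM)) * (-(u_1**2 - u_t**2)/4.0
                             + (u_1**2*math.log(u_1) - u_t**2*math.log(u_t))/2.0)

def integral_tail(t, aM, steps=50000):
    # direct midpoint-rule evaluation of integral_t^1 g ln g dy, g(y) = 1 + aM*(2y - 1)
    h = (1.0 - t) / steps
    s = 0.0
    for j in range(steps):
        g = 1.0 + aM*(2.0*(t + (j + 0.5)*h) - 1.0)
        s += g * math.log(g) * h
    return s
```

The same substitution works for positive and negative $\alpha M_r$, since the sign of $du$ is absorbed by the prefactor $1/(2\alpha M_r)$.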

From (16), (17), and (10), the past entropy for concomitants of ordered random variables of the Morgenstern family is given by the following theorem.

Theorem 2.

Let $Y_r$ be any absolutely continuous random variable which is the concomitant of the $r$th ordered random variable of the Morgenstern family defined in (18). Then, from (10), the past entropy of $Y_r$ is

(22) $\bar{H}(Y_r;t) = \ln G_{Y_r}(t) - \frac{1}{G_{Y_r}(t)} \left\{ (1 - \alpha M_r)\, F_Y(t) \left[\ln F_Y(t) - \bar{H}(Y;t)\right] + 2\alpha M_r\, \bar{\phi}_f(y;t) + K_2(r,t,\alpha,n,m,k) \right\}$,

where

(23) $K_2(r,t,\alpha,n,m,k) = \frac{1}{2\alpha M_r} \left\{ -\frac{1}{4}\left[\left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right)^2 - \left(1 - \alpha M_r\right)^2\right] + \frac{1}{2}\left[\left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right)^2 \ln\left(1 + \alpha M_r\left(2F_Y(t) - 1\right)\right) - \left(1 - \alpha M_r\right)^2 \ln\left(1 - \alpha M_r\right)\right] \right\}$,

$\bar{\phi}_f(y;t) = \int_0^{t} F_Y(y)\, f_Y(y) \ln f_Y(y)\, dy$, $1 \le r \le n$, $\alpha \ne 0$, $-1 \le \alpha \le 1$.
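$K_2$ comes from the same substitution $u = 1 + \alpha M_r(2F_Y(y) - 1)$ applied over $(0,t)$, where the lower limit gives $u = 1 - \alpha M_r$. With a uniform marginal, $K_2$ equals $\int_0^t g_{Y_r}(y)\ln g_{Y_r}(y)\,dy$, which we can check directly (our names; $aM$ stands for $\alpha M_r$):

```python
import math

def K2(t, aM):
    # K_2 of eq. (23) specialized to a uniform(0,1) marginal, u = 1 + aM*(2y - 1)
    u_0, u_t = 1.0 - aM, 1.0 + aM*(2.0*t - 1.0)
    return (1.0/(2.0*aM)) * (-(u_t**2 - u_0**2)/4.0
                             + (u_t**2*math.log(u_t) - u_0**2*math.log(u_0))/2.0)

def integral_head(t, aM, steps=50000):
    # direct midpoint-rule evaluation of integral_0^t g ln g dy, g(y) = 1 + aM*(2y - 1)
    h = t / steps
    s = 0.0
    for j in range(steps):
        g = 1.0 + aM*(2.0*((j + 0.5)*h) - 1.0)
        s += g * math.log(g) * h
    return s
```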

The following proposition gives the values of the functions H ( Y r ; t ) and H ¯ ( Y r ; t ) under linear transformation.

Proposition 3.

Let $Y_r$ be any absolutely continuous random variable which is the concomitant of the $r$th ordered random variable of the Morgenstern family defined in (18), and define $Z_r = aY_r + b$, where $a > 0$ and $b \ge 0$ are constants. Then, for $t > b$:

(1)

(24) $H(Z_r;t) = \ln \bar{G}_{Y_r}\!\left(\frac{t-b}{a}\right) - \frac{1}{\bar{G}_{Y_r}\!\left(\frac{t-b}{a}\right)} \left\{ (1 - \alpha M_r) \left[ \bar{F}_Y\!\left(\frac{t-b}{a}\right) \left( \ln \bar{F}_Y\!\left(\frac{t-b}{a}\right) - H\!\left(Y; \frac{t-b}{a}\right) \right) - \bar{F}_Y\!\left(\frac{t-b}{a}\right) \ln a \right] + 2\alpha M_r \left[ \phi_f\!\left(y; \frac{t-b}{a}\right) - \left(1 - F_Y^2\!\left(\frac{t-b}{a}\right)\right) \frac{\ln a}{2} \right] + K_1\!\left(r, \frac{t-b}{a}, \alpha, n, m, k\right) \right\}$,

(2)

(25) $\bar{H}(Z_r;t) = \ln G_{Y_r}\!\left(\frac{t-b}{a}\right) - \frac{1}{G_{Y_r}\!\left(\frac{t-b}{a}\right)} \left\{ (1 - \alpha M_r) \left[ F_Y\!\left(\frac{t-b}{a}\right) \left( \ln F_Y\!\left(\frac{t-b}{a}\right) - \bar{H}\!\left(Y; \frac{t-b}{a}\right) \right) - F_Y\!\left(\frac{t-b}{a}\right) \ln a \right] + 2\alpha M_r \left[ \bar{\phi}_f\!\left(y; \frac{t-b}{a}\right) - F_Y^2\!\left(\frac{t-b}{a}\right) \frac{\ln a}{2} \right] + K_2\!\left(r, \frac{t-b}{a}, \alpha, n, m, k\right) \right\}$.
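Proposition 3 rests on the standard identity $H(aY_r + b;\, t) = H\!\left(Y_r; \frac{t-b}{a}\right) + \ln a$, which follows because the density of $Z_r$ is $\frac{1}{a}\, g_{Y_r}\!\left(\frac{z-b}{a}\right)$. This can be checked numerically with a uniform marginal and the df $G_{Y_r}(y) = F_Y(y)[1 - \alpha M_r(1 - F_Y(y))]$ implied by (12)-(15); the names and parameter values below are ours:

```python
import math

def residual_entropy_num(g, G_bar, t, upper, steps=50000):
    # H(W;t) = ln G_bar(t) - (1/G_bar(t)) * integral_t^upper g ln g dw (midpoint rule)
    h = (upper - t) / steps
    s = 0.0
    for j in range(steps):
        v = g(t + (j + 0.5)*h)
        s += v * math.log(v) * h
    return math.log(G_bar(t)) - s / G_bar(t)

aM, a, b = 0.2, 2.0, 1.0
g_Y = lambda y: 1.0 + aM*(2.0*y - 1.0)            # eq. (16), uniform marginal
Gbar_Y = lambda y: 1.0 - y*(1.0 - aM*(1.0 - y))   # survival function of Y_r
g_Z = lambda z: g_Y((z - b)/a) / a                # density of Z_r = a*Y_r + b
Gbar_Z = lambda z: Gbar_Y((z - b)/a)              # survival function of Z_r
```

With $t = 1.8$ (so that $(t-b)/a = 0.4$), the two sides of the identity agree to quadrature accuracy, which is what drives the extra $\ln a$ terms in (24) and (25).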

3.1. Upper Bound for Residual and Past Entropy

We derive an upper bound for the residual and past entropy under the condition that the pdf of $Y_r$, defined in (18), is less than 1. Note that

(26) $H(Y_r;t) = -\int_t^{\infty} \frac{g_{Y_r}(y)}{\bar{G}_{Y_r}(t)} \ln \frac{g_{Y_r}(y)}{\bar{G}_{Y_r}(t)}\, dy = \ln \bar{G}_{Y_r}(t) - \frac{1}{\bar{G}_{Y_r}(t)} \int_t^{\infty} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy$.

Since $\ln \bar{G}_{Y_r}(t) \le 0$ for $t > 0$, and since $g_{Y_r} \le 1$ implies $g_{Y_r}(y) \ln g_{Y_r}(y) \le 0$ on $(0,t)$, we get

(27) $H(Y_r;t) \le -\frac{1}{\bar{G}_{Y_r}(t)} \int_t^{\infty} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy \le -\frac{1}{\bar{G}_{Y_r}(t)} \int_0^{\infty} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy$.

Hence,

(28) $H(Y_r;t) \le \frac{H(Y_r)}{\bar{G}_{Y_r}(t)}$,

with equality as $t \to 0$, where $H(Y_r)$ is the Shannon entropy based on the concomitant of ordered random variables.

Next, we calculate an upper bound for the past entropy. We have

(29) $\bar{H}(Y_r;t) = -\int_0^{t} \frac{g_{Y_r}(y)}{G_{Y_r}(t)} \ln \frac{g_{Y_r}(y)}{G_{Y_r}(t)}\, dy = \ln G_{Y_r}(t) - \frac{1}{G_{Y_r}(t)} \int_0^{t} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy$.

Since $\ln G_{Y_r}(t) \le 0$ for $t > 0$, and since $g_{Y_r} \le 1$ implies $g_{Y_r}(y) \ln g_{Y_r}(y) \le 0$ on $(t,\infty)$, we get

(30) $\bar{H}(Y_r;t) \le -\frac{1}{G_{Y_r}(t)} \int_0^{t} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy \le -\frac{1}{G_{Y_r}(t)} \int_0^{\infty} g_{Y_r}(y) \ln g_{Y_r}(y)\, dy$.

Hence,

(31) $\bar{H}(Y_r;t) \le \frac{H(Y_r)}{G_{Y_r}(t)}$,

with equality as $t \to \infty$, where $H(Y_r)$ is the Shannon entropy based on the concomitant of ordered random variables.
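The residual-entropy bound (28) can be illustrated numerically. Note that the second inequality in (27) only needs $g_{Y_r} \le 1$ on $(0,t)$; with a uniform marginal and $\alpha M_r = 0.2$ this holds for $t \le 1/2$. A sketch (names and parameter values are ours, with $aM$ for $\alpha M_r$):

```python
import math

def shannon_entropy_unif(aM, steps=20000):
    # H(Y_r) = -integral_0^1 g ln g dy, g(y) = 1 + aM*(2y - 1) (uniform marginal)
    h = 1.0 / steps
    s = 0.0
    for j in range(steps):
        g = 1.0 + aM*(2.0*((j + 0.5)*h) - 1.0)
        s += g * math.log(g) * h
    return -s

def residual_entropy_unif(t, aM, steps=20000):
    # H(Y_r;t) = ln Gbar(t) - (1/Gbar(t)) * integral_t^1 g ln g dy
    Gbar = 1.0 - t*(1.0 - aM*(1.0 - t))
    h = (1.0 - t) / steps
    s = 0.0
    for j in range(steps):
        g = 1.0 + aM*(2.0*(t + (j + 0.5)*h) - 1.0)
        s += g * math.log(g) * h
    return math.log(Gbar) - s / Gbar
```

At $t = 0.3$, $\alpha M_r = 0.2$, the bound holds with considerable slack, as expected from discarding the negative $\ln \bar{G}_{Y_r}(t)$ term.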

4. Some Characterization Results

Gupta et al. have studied characterizations of residual and past entropy of order statistics. In this section, we present some characterization results based on the residual and past entropy of the concomitant of ordered random variables.

Consider the problem of finding a sufficient condition for the uniqueness of the solution of the initial value problem (IVP):

(32) $\frac{dy}{dx} = f(x,y), \quad y(x_0) = y_0$,

where $f$ is a given function of two variables whose domain is a region $D \subset \mathbb{R}^2$, $(x_0, y_0)$ is a specified point in $D$, and $y$ is the unknown function. By a solution of the IVP on an interval $I \subset \mathbb{R}$, we mean a function $\phi(x)$ such that (i) $\phi$ is differentiable on $I$, (ii) the graph of $\phi$ lies in $D$, (iii) $\phi(x_0) = y_0$, and (iv) $\phi'(x) = f(x, \phi(x))$ for all $x \in I$. The following theorem, together with other results, will help in proving our characterization result.

Theorem 4.

Let the function $f$ be defined and continuous in a domain $D \subset \mathbb{R}^2$, and let $f$ satisfy a Lipschitz condition (with respect to $y$) in $D$; namely,

(33) $\left| f(x, y_1) - f(x, y_2) \right| \le k \left| y_1 - y_2 \right|, \quad k > 0$,

for every pair of points $(x, y_1)$ and $(x, y_2)$ in $D$. Then the function $y = \phi(x)$ satisfying the initial value problem $y' = f(x,y)$, $\phi(x_0) = y_0$, $x \in I$, is unique.

Proof. See Gupta and Kirmani [14].

For any function $f(x,y)$ of two variables defined in $D \subset \mathbb{R}^2$, we now present a sufficient condition which guarantees that the Lipschitz condition is satisfied in $D$.

Lemma 5.

Suppose that the function $f$ is continuous in a convex region $D \subset \mathbb{R}^2$, and suppose further that $\partial f/\partial y$ exists and is continuous in $D$. Then $f$ satisfies a Lipschitz condition in $D$.

Proof. See Gupta and Kirmani [14].

We now present our two characterization results.

Theorem 6.

Let $Y_r$ be a nonnegative continuous random variable, defined in (18), with distribution function $G_{Y_r}(\cdot)$ defined in (17). Let the residual entropy of the concomitant of the $r$th ordered random variable be denoted by $H(Y_r;t) < \infty$, $t \ge 0$. Then $H(Y_r;t)$ characterizes the distribution.

Proof.

Suppose there exist two distribution functions $G_{Y_{r_1}}$ and $G_{Y_{r_2}}$ such that

(34) $H(Y_{r_1};t) = H(Y_{r_2};t)$, for all $t > 0$.

Differentiating (9) with respect to $t$ gives

(35) $H'(Y_{r_i};t) = \lambda_{Y_{r_i}}(t)\left[H(Y_{r_i};t) - 1 + \ln \lambda_{Y_{r_i}}(t)\right], \quad i = 1, 2$,

where $\lambda_{Y_{r_i}}(t) = g_{Y_{r_i}}(t)/\bar{G}_{Y_{r_i}}(t)$ is the hazard rate of the concomitant of the $r$th ordered random variable, for $i = 1, 2$. Differentiating the above equation with respect to $t$ and simplifying, we get

(36) $\lambda'_{Y_{r_i}}(t) = \frac{\lambda_{Y_{r_i}}(t)\left[H''(Y_{r_i};t) - \lambda_{Y_{r_i}}(t)\, H'(Y_{r_i};t)\right]}{\lambda_{Y_{r_i}}(t) + H'(Y_{r_i};t)}, \quad i = 1, 2$.

Suppose now that

(37) $H'(Y_{r_1};t) = H'(Y_{r_2};t) = g(t)$.

Then, for all $t \ge 0$,

(38) $\lambda'_{Y_{r_i}}(t) = \psi\left(t, \lambda_{Y_{r_i}}(t)\right)$, for $i = 1, 2$,

where

(39) $\psi(t, y) = \frac{y\left[g'(t) - y\, g(t)\right]}{y + g(t)}$.

It follows from Theorem 4 and Lemma 5 that $\lambda_{Y_{r_1}}(t) = \lambda_{Y_{r_2}}(t)$, which proves the characterization result.

Theorem 7.

Let $Y_r$ be a nonnegative continuous random variable, defined in (18), with distribution function $G_{Y_r}(\cdot)$ defined in (17). Let the past entropy of the concomitant of the $r$th ordered random variable be denoted by $\bar{H}(Y_r;t) < \infty$, $t \ge 0$. Then $\bar{H}(Y_r;t)$ characterizes the distribution.

Proof.

The proof proceeds along the same lines as that of Theorem 6, with the reversed hazard rate $\tau_{Y_{r_i}}(t) = g_{Y_{r_i}}(t)/G_{Y_{r_i}}(t)$ of the concomitant of the $r$th ordered random variable in place of the hazard rate, for $i = 1, 2$.

Next we present a characterization result for the linear mean residual family of the concomitant of the $r$th ordered random variable based on Theorem 6. The linear mean residual life of the concomitant of the $r$th ordered random variable is given by $\mu_{Y_r}(t) = a + bt$, $a > 0$, $b > -1$. It can be verified that the corresponding failure rate is $\lambda_{Y_r}(t) = (1+b)/(a+bt)$. The linear mean residual family of distributions has been studied by, among others, Hall and Wellner, Oakes and Dasu, and Gupta and Kirmani. It includes the exponential distribution for $b = 0$ and the power distribution for $-1 < b < 0$. Gupta et al. have studied the linear mean residual life of a distribution based on order statistics.

Theorem 8.

Let $Y_r$ be a nonnegative continuous random variable, defined in (18). Then $H(Y_r;t) = 1 + \frac{b}{1+b} - \ln \lambda_{Y_r}(t)$ if and only if $\lambda_{Y_r}(t) = \frac{1+b}{a+bt}$.

Proof.

From (9), we have

(40) $H(Y_r;t) = 1 - \frac{1}{\bar{G}_{Y_r}(t)} \int_t^{\infty} g_{Y_r}(y) \ln \lambda_{Y_r}(y)\, dy = 1 - \ln(1+b) + \frac{1}{\bar{G}_{Y_r}(t)} \int_t^{\infty} g_{Y_r}(y) \ln(a+by)\, dy$.

To evaluate $\int_t^{\infty} g_{Y_r}(y) \ln(a+by)\, dy$, first consider

(41) $T(l,t) = \int_t^{\infty} g_{Y_r}(y)\,(a+by)^{l}\, dy = \int_t^{\infty} \bar{G}_{Y_r}(y)\, \frac{1+b}{a+by}\, (a+by)^{l}\, dy = \frac{1+b}{b+1-bl}\, (a+bt)^{l}\, \bar{G}_{Y_r}(t)$.

Taking the derivative of the above equation with respect to $l$ and evaluating at $l = 0$, we get $\int_t^{\infty} g_{Y_r}(y) \ln(a+by)\, dy = \left[\frac{b}{1+b} + \ln(a+bt)\right] \bar{G}_{Y_r}(t)$. Substituting the previous results into (40), we get

(42) $H(Y_r;t) = 1 + \frac{b}{1+b} - \ln \frac{1+b}{a+bt} = 1 + \frac{b}{1+b} - \ln \lambda_{Y_r}(t)$.

To prove the converse, (35) gives

(43) $H'(Y_r;t) = \lambda_{Y_r}(t)\left[H(Y_r;t) - 1 + \ln \lambda_{Y_r}(t)\right] = \lambda_{Y_r}(t)\left[1 + \frac{b}{1+b} - \ln \lambda_{Y_r}(t) - 1 + \ln \lambda_{Y_r}(t)\right] = \lambda_{Y_r}(t)\, \frac{b}{1+b}$.

Since $H'(Y_r;t) = -\lambda'_{Y_r}(t)/\lambda_{Y_r}(t)$, we get

(44) $\lambda'_{Y_r}(t) + \frac{b}{1+b}\, \lambda_{Y_r}^{2}(t) = 0$,

whose solution is

(45) $\lambda_{Y_r}(t) = \frac{1+b}{a+bt}$.
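Theorem 8 can be checked numerically: the survival function $\bar{G}_{Y_r}(y) = \left(\frac{a}{a+by}\right)^{(1+b)/b}$ has failure rate exactly $\frac{1+b}{a+by}$, so its residual entropy computed directly from (9) should equal $1 + \frac{b}{1+b} - \ln \lambda_{Y_r}(t)$. A sketch with $a = b = 1$, a Pareto-type law (our names; truncating the integral at a large upper limit is an assumption of the sketch):

```python
import math

def H_resid_num(t, a, b, upper, steps=300000):
    # numeric H(Y_r;t) from eq. (9) for Gbar(y) = (a/(a + b*y))**((1+b)/b)
    p = (1.0 + b) / b
    def Gbar(y): return (a / (a + b*y))**p
    def g(y): return (1.0 + b)/(a + b*y) * Gbar(y)  # density = failure rate * survival
    h = (upper - t) / steps
    s = 0.0
    for j in range(steps):
        v = g(t + (j + 0.5)*h)
        s += v * math.log(v) * h
    return math.log(Gbar(t)) - s / Gbar(t)

a, b, t = 1.0, 1.0, 0.5
closed_form = 1.0 + b/(1.0 + b) - math.log((1.0 + b)/(a + b*t))  # Theorem 8
```

The two values agree to the accuracy of the quadrature and tail truncation, consistent with the characterization.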