Journal of Probability and Statistics
Hindawi Publishing Corporation
ISSN 1687-9538, 1687-952X
Volume 2009, Article ID 895742, doi:10.1155/2009/895742

Research Article

On Concordance Measures for Discrete Data and Dependence Properties of Poisson Model

Taoufik Bouezmarni (Department of Economics, McGill University, Leacock Building, 855 Sherbrooke Street West, Montreal, QC, Canada H3A 2T7), Mhamed Mesfioui (Département de Mathématiques et d'Informatique, Université du Québec à Trois-Rivières, Pavillon Ringuet, C.P. 500, Trois-Rivières, QC, Canada G9A 5H7), and Abdelouahid Tajar (ARC Epidemiology Unit, The University of Manchester, Oxford Road, Manchester M13 9PT, UK)

Academic Editor: Chunsheng Ma

Copyright © 2009. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

We study Kendall's tau and Spearman's rho concordance measures for discrete variables. We mainly provide their best bounds using positive dependence properties. These bounds are difficult to write down explicitly in general. Here, we give explicit formulas for the best bounds in a particular Fréchet space in order to understand the behavior of the ranges of these measures. Also, based on the empirical copula, viewed as a discrete distribution, we propose a new estimator of the copula function. Finally, we give useful dependence properties of the bivariate Poisson distribution and show the relationship between the parameters of the Poisson distribution and both tau and rho.

1. Introduction

The best known dependence property is "lack of dependence," or what is known as stochastic independence. In many applications, independence between two random variables is assumed; this can be a strong assumption in the analysis undertaken. Taking into account the dependence structure between the variables leads to appropriate modeling approaches and correct conclusions. To study stochastic dependence, the concepts of concordance and positive dependence are widely used tools. This is because many dependence properties can be described by means of the joint distribution of the variables, and these measures and properties are often margin-free. In this paper we study two concordance measures, Kendall's tau (Kruskal) and Spearman's rho (Lehmann). These measures have several properties known as Rényi's axioms; for more details see Rényi. Among these axioms, we focus on the range of the association measure.

Much research has been concerned with the study of tau and rho in the case of continuous variables. Schweizer and Wolff, in a seminal paper, showed that the study of concordance measures for continuous random variables can be characterized as the study of copulas. However, for noncontinuous variables, this interrelationship generally does not hold. There are few papers concerning the discrete versions of Kendall's tau and Spearman's rho. Conti gives two definitions of indifference and links them to concordance and discordance properties of the data. Tajar et al. propose a copula-type representation for random couples with binary margins. They show that appropriate measures of association for binary random variables do not depend on the marginal distributions of the variables under study. Mesfioui and Tajar and Denuit and Lambert have shown independently that the range of tau and rho in the discrete case is not the unit interval, unlike in the continuous case. Nešlehová considers an alternative transformation of an arbitrary random variable to a uniformly distributed variable in order to study rank measures for noncontinuous random variables.

In this paper, we focus on the range of the concordance measures. Aside from identifying the best bounds of tau and rho in the case of discrete random variables, we present some dependence properties of the bivariate Poisson model and discuss their relationship with the concordance measures tau and rho. The paper is organized as follows. The next section provides a method of constructing the ranges of tau and rho for discrete data. Section 3 develops explicit expressions for the best bounds of tau and rho in the discrete Fréchet space with the same marginal. Section 4 provides a new estimator of the copulas based on the so-called empirical copulas. Section 5 discusses some dependence properties of the bivariate Poisson model.

2. Definitions and Properties

Following Hoeffding, Kruskal, and Lehmann, Schweizer and Wolff express Kendall's tau and Spearman's rho for a continuous random vector (X,Y) in terms of the joint distribution H(x,y) of (X,Y) and the margins F(x) for X and G(y) for Y. A general representation for each of τ and ρ was first proposed by Kowalczyk and Niewiadomska-Bugaj; namely,

$$\tau = E_H[H(X,Y)] + E_H[H(X^-,Y^-)] + E_H[H(X^-,Y)] + E_H[H(X,Y^-)] - 1, \qquad (2.1)$$
$$\rho = 3\big\{E_\Pi[H(X^-,Y)] + E_\Pi[H(X,Y^-)] + E_\Pi[H(X^-,Y^-)] + E_\Pi[H(X,Y)] - 1\big\}, \qquad (2.2)$$
where H(x⁻,y⁻) = P[X<x, Y<y], H(x,y⁻) = P[X≤x, Y<y], H(x⁻,y) = P[X<x, Y≤y], and Π(x,y) = F(x)G(y).

Several results in this paper are based on the monotonicity property of Kendall's τ and Spearman's ρ. This property was first established for continuous variables by Yanagimoto and Okamoto. Tchen obtained a similar monotonicity property for τ and ρ when the supports of the joint distributions consist of a finite number of atoms. Mesfioui and Tajar extend various dependence relationships between Kendall's τ and Spearman's ρ in Capéraà and Genest and Nelsen to the discrete case. A key result of their paper is the generalization of these relationships to arbitrary random variables, continuous and/or discrete.

For the remainder of the paper, we recall the concordance ordering, defined as follows.

Let (X1,Y1) and (X2,Y2) be random vectors with identical marginals and respective cdf's H1 and H2. The random couple (X2,Y2) is said to be more concordant than (X1,Y1), denoted by (X1,Y1) ⪯c (X2,Y2), if H1(x1,x2) ≤ H2(x1,x2) holds for all x1, x2.

In the following proposition, we propose a flexible method to establish the monotonicity property given in Mesfioui and Tajar for purely discrete random vectors. The proof is direct and easy to understand, and it extends the result to general random vectors.

Proposition 2.1.

Let (X1,Y1) and (X2,Y2) be two random couples with respective distribution functions H1 and H2 in Γ(F,G), the Fréchet space of all distribution functions with fixed marginals F and G. Then,
$$(X_1,Y_1) \preceq_c (X_2,Y_2) \;\Longrightarrow\; \tau_{H_1} \le \tau_{H_2}, \qquad (2.3)$$
$$(X_1,Y_1) \preceq_c (X_2,Y_2) \;\Longrightarrow\; \rho_{H_1} \le \rho_{H_2}. \qquad (2.4)$$

Proof.

Using Fubini's theorem, we note that
$$E_{H_1}[H_2(X,Y)] = E_{H_2}[\bar H_1(X^-,Y^-)], \qquad E_{H_1}[H_2(X^-,Y)] = E_{H_2}[\bar H_1(X,Y^-)],$$
$$E_{H_1}[H_2(X,Y^-)] = E_{H_2}[\bar H_1(X^-,Y)], \qquad E_{H_1}[H_2(X^-,Y^-)] = E_{H_2}[\bar H_1(X,Y)],$$
where H̄i denotes the survival function associated with Hi, i = 1, 2.

Now, without loss of generality, assume that H1 ≤ H2, which is equivalent to H̄1 ≤ H̄2. We then get
$$E_{H_1}[H_1(X,Y)] \le E_{H_1}[H_2(X,Y)] = E_{H_2}[\bar H_1(X^-,Y^-)] \le E_{H_2}[\bar H_2(X^-,Y^-)] = E_{H_2}[H_2(X,Y)].$$
Similarly, we obtain
$$E_{H_1}[H_1(X^-,Y)] \le E_{H_2}[H_2(X^-,Y)], \qquad E_{H_1}[H_1(X,Y^-)] \le E_{H_2}[H_2(X,Y^-)], \qquad E_{H_1}[H_1(X^-,Y^-)] \le E_{H_2}[H_2(X^-,Y^-)].$$
Combining the latter inequalities with (2.1), we obtain (2.3). It is easily seen that (2.4) is immediate from (2.2).

For any bivariate distribution function H with univariate marginals F and G, one has

$$\max[0,\,F(x)+G(y)-1] \;\le\; H(x,y) \;\le\; \min[F(x),\,G(y)].$$
The extreme distributions Hmin(x,y) = max[0, F(x)+G(y)−1] and Hmax(x,y) = min[F(x), G(y)] are often referred to as the Fréchet bounds. These bounds play a central role in constructing optimal ranges of τ and ρ, as stated in the following corollary.

Corollary 2.2.

Let (X,Y) be a random couple with distribution function H in Γ(F,G). Then,
$$\tau_{\min} \le \tau_H \le \tau_{\max}, \qquad \rho_{\min} \le \rho_H \le \rho_{\max},$$
where τmin, ρmin and τmax, ρmax denote the values of Kendall's τ and Spearman's ρ corresponding to the Fréchet lower and upper bounds in Γ(F,G), respectively.

As stated earlier, the main objective of this paper is to examine the bounds of τ and ρ in the Fréchet space Γ(F,G) when F and G are discrete. To do that, let (X,Y) be a discrete random couple with cdf H ∈ Γ(F,G). Since Kendall's τ and Spearman's ρ are scale invariant, they remain unchanged under strictly increasing transformations of the marginal distributions. We can then suppose, without any loss of generality, that X and Y take values in ℤ, the set of all integers. Therefore, we can see from (2.1) and (2.2) that τ and ρ can be written as

$$\tau = \sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} T_{ij}S_{ij} - 1, \qquad (2.10)$$
$$\rho = 3\sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty} T_{ij}\,[F(i)-F(i-1)][G(j)-G(j-1)] - 3, \qquad (2.11)$$
where

$$T_{ij} = H(i,j) + H(i-1,j-1) + H(i,j-1) + H(i-1,j), \qquad (2.12)$$
$$S_{ij} = H(i,j) - H(i,j-1) - H(i-1,j) + H(i-1,j-1). \qquad (2.13)$$
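For a finite support, the double sums (2.10) and (2.11) can be evaluated directly. The following minimal Python sketch (our illustration, not part of the paper) builds H, F, and G from a joint pmf and accumulates the terms Tij·Sij and Tij·pi·qj; the check uses the comonotone pmf with uniform marginals on {0, 1}, for which the sums return the upper Fréchet values 1/2 and 3/4 derived in Section 3.

```python
# Direct evaluation of the discrete tau and rho in (2.10)-(2.11) (our sketch):
# given a joint pmf on integer points, build H, F, G and accumulate the sums.

def discrete_tau_rho(p):
    """p: dict {(i, j): prob} with finite integer support; returns (tau, rho)."""
    support_i = sorted({i for i, _ in p})
    support_j = sorted({j for _, j in p})

    def H(x, y):   # joint cdf P[X <= x, Y <= y]
        return sum(q for (i, j), q in p.items() if i <= x and j <= y)

    def F(x):      # marginal cdf of X
        return sum(q for (i, _), q in p.items() if i <= x)

    def G(y):      # marginal cdf of Y
        return sum(q for (_, j), q in p.items() if j <= y)

    tau, rho = -1.0, -3.0
    for i in support_i:
        for j in support_j:
            T = H(i, j) + H(i - 1, j - 1) + H(i, j - 1) + H(i - 1, j)
            S = H(i, j) - H(i, j - 1) - H(i - 1, j) + H(i - 1, j - 1)
            tau += T * S
            rho += 3 * T * (F(i) - F(i - 1)) * (G(j) - G(j - 1))
    return tau, rho

# Comonotone pmf with uniform marginals on {0, 1}: H equals the upper
# Frechet bound min[F, G], so tau and rho attain their extreme values.
print(discrete_tau_rho({(0, 0): 0.5, (1, 1): 0.5}))  # -> (0.5, 0.75)
```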

In order to obtain the best bounds τmin, ρmin and τmax, ρmax, we replace H in (2.10) and (2.11) by the Fréchet bounds Hmin(i,j) = max[F(i)+G(j)−1, 0] and Hmax(i,j) = min[F(i), G(j)], respectively.

For discrete data, the ranges of τ and ρ are different from the usual unit interval [-1,1]. This is a violation of the monotone dependence properties of concordance measures, as stated in Nelsen . To correct this problem, we propose the following corrections:

$$\tau_c = \begin{cases} \dfrac{\tau}{\tau_{\max}} & \text{if } \tau \ge 0, \\[6pt] -\dfrac{\tau}{\tau_{\min}} & \text{if } \tau < 0, \end{cases} \qquad \rho_c = \begin{cases} \dfrac{\rho}{\rho_{\max}} & \text{if } \rho \ge 0, \\[6pt] -\dfrac{\rho}{\rho_{\min}} & \text{if } \rho < 0. \end{cases} \qquad (2.14)$$
The main importance of these corrections is that they allow one to interpret the levels of the new measures, τc and ρc, as percentages. Illustrations of these transformations are given in Section 5 with the bivariate Poisson distribution.
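As a small numerical illustration of the correction (2.14) (our sketch; the choice F = G uniform on {0, 1, 2} is arbitrary and not taken from the paper), the attainable range of τ is [−2/3, 2/3], strictly inside [−1, 1], and the corrected measure rescales its endpoints to ±1:

```python
# Correction (2.14) in code: tau is computed from (2.10) for a given cdf H,
# and tau_c rescales it by the extreme values attained at the Frechet bounds.
# Illustration only: F = G is taken uniform on {0, 1, 2}.

def tau_from_cdf(H, pts):
    t = -1.0
    for i in pts:
        for j in pts:
            T = H(i, j) + H(i - 1, j - 1) + H(i, j - 1) + H(i - 1, j)
            S = H(i, j) - H(i, j - 1) - H(i - 1, j) + H(i - 1, j - 1)
            t += T * S
    return t

F = lambda x: min(max((x + 1) / 3.0, 0.0), 1.0)    # uniform cdf on {0, 1, 2}
H_max = lambda i, j: min(F(i), F(j))               # upper Frechet bound
H_min = lambda i, j: max(F(i) + F(j) - 1.0, 0.0)   # lower Frechet bound
H_ind = lambda i, j: F(i) * F(j)                   # independence

pts = range(3)
tau_max = tau_from_cdf(H_max, pts)   # 2/3, strictly less than 1
tau_min = tau_from_cdf(H_min, pts)   # -2/3, strictly greater than -1

def tau_c(tau):
    return tau / tau_max if tau >= 0 else -tau / tau_min

print(tau_c(tau_max), tau_c(tau_min))          # -> 1.0 -1.0
print(abs(tau_from_cdf(H_ind, pts)) < 1e-12)   # tau vanishes at independence
```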

3. Explicit Bounds of Discrete τ and ρ in Γ(F,F)

The aim of this section is to study the effect of the marginal distributions on the ranges of τ and ρ for discrete data. Note that it is difficult to obtain explicit expressions for the extreme values of τ and ρ in Γ(F,G) for noncontinuous distributions F and G. This problem is very complicated and requires several assumptions on F and G. In order to analyze the behavior of these bounds, we consider the particular space Γ(F,F), where F is a discrete distribution function. To this end, consider the integer-valued function defined by

$$\phi(i) = \min\{\,j \in \mathbb{Z} : F(i) + F(j) > 1\,\}, \qquad i \in \mathbb{Z}.$$
This function plays an important role in making explicit the lower bounds of τ and ρ in the space Γ(F,F). The next proposition presents explicit optimal bounds for Spearman's ρ.

Proposition 3.1.

The best bounds for ρ in the space Γ(F,F) are given by
$$\rho_{\max} = 3E\big[1 - F^2(X) - F^2(X-1)\big], \qquad \rho_{\min} = 3E\big[\psi(X) + \psi(X-1) - 1\big],$$
where $\psi(i) = \sum_{j=\phi(i)}^{\infty} [F(i)+F(j)-1][F(j+1)-F(j-1)]$.

Proof.

Let H(i,j) = min[F(i), F(j)]. From (2.12), we observe that
$$T_{ij} = [F(i)+3F(i-1)]\,\mathbb{1}[i=j] + 2[F(i)+F(i-1)]\,\mathbb{1}[i<j] + 2[F(j)+F(j-1)]\,\mathbb{1}[i>j],$$
and writing pi = F(i) − F(i−1), we get from (2.11) that
$$\rho_{\max} = 3\sum_{i=-\infty}^{\infty}[F(i)+3F(i-1)][F(i)-F(i-1)]p_i + 6\sum_{i=-\infty}^{\infty}[F(i)+F(i-1)]p_i\sum_{j=i+1}^{\infty}[F(j)-F(j-1)] + 6\sum_{i=-\infty}^{\infty}p_i\sum_{j=-\infty}^{i-1}[F(j)-F(j-1)][F(j)+F(j-1)] - 3,$$
which may be simplified as
$$\rho_{\max} = 3E\{[F(X)+3F(X-1)][F(X)-F(X-1)]\} + 6E\{[F(X)+F(X-1)][1-F(X)]\} + 6E[F^2(X-1)] - 3.$$
The result then follows from the fact that E[F(X)+F(X−1)] = 1. Now, choose H(i,j) = sup[F(i)+F(j)−1, 0] and put H⁺(i,j) = F(i)+F(j)−1. From (2.11), we see that
$$\rho_{\min} = 3\sum_{i=-\infty}^{\infty}\sum_{j=\phi(i)}^{\infty}H^+(i,j)p_ip_j + 3\sum_{i=-\infty}^{\infty}\sum_{j=\phi(i)}^{\infty}H^+(i,j)p_{i+1}p_{j+1} + 3\sum_{i=-\infty}^{\infty}\sum_{j=\phi(i)}^{\infty}H^+(i,j)p_ip_{j+1} + 3\sum_{i=-\infty}^{\infty}\sum_{j=\phi(i)}^{\infty}H^+(i,j)p_{i+1}p_j - 3.$$
It follows that
$$\rho_{\min} = 3\sum_{i=-\infty}^{\infty}(p_i+p_{i+1})\sum_{j=\phi(i)}^{\infty}[F(i)+F(j)-1](p_j+p_{j+1}) - 3,$$
which may be rewritten as
$$\rho_{\min} = 3\sum_{i=-\infty}^{\infty}[\psi(i)+\psi(i-1)-1]\,p_i,$$
where $\psi(i) = \sum_{j=\phi(i)}^{\infty}[F(i)+F(j)-1][F(j+1)-F(j-1)]$. The result is therefore obtained from (3.11) and (3.10).
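The quantities φ and ψ of Proposition 3.1 are easy to evaluate numerically. In this sketch (our illustration, with F uniform on {0, 1}; the cutoff hi stands in for +∞ in φ), the explicit formulas give ρmax = 0.75 and ρmin = −0.75, a shrunken but, for this marginal, symmetric range:

```python
# Evaluating the explicit bounds of Proposition 3.1 (our sketch; F is the
# uniform cdf on {0, 1}, and hi = 10 stands in for +infinity in phi).
support = [0, 1]
pmf = {0: 0.5, 1: 0.5}
F = lambda x: sum(q for k, q in pmf.items() if k <= x)

def phi(i, hi=10):
    js = [j for j in range(min(support) - 1, hi) if F(i) + F(j) > 1]
    return js[0] if js else hi

def psi(i):
    # terms with j > max(support) vanish since F(j+1) - F(j-1) = 0 there
    return sum((F(i) + F(j) - 1) * (F(j + 1) - F(j - 1))
               for j in range(phi(i), max(support) + 1))

rho_max = 3 * sum(p * (1 - F(k) ** 2 - F(k - 1) ** 2) for k, p in pmf.items())
rho_min = 3 * sum(p * (psi(k) + psi(k - 1) - 1) for k, p in pmf.items())
print(rho_max, rho_min)  # -> 0.75 -0.75
```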

Using (2.10) with H(i,j)=min[F(i),F(j)], we notice that the upper bound of Kendall's τ in the space Γ(F,F) can be expressed as

$$\tau_{\max} = 2E[F(X-1)]. \qquad (3.12)$$
Note that the sharp upper bound given in Denuit and Lambert coincides with (3.12) in Γ(F,F). However, the behavior of the lower bound of Kendall's τ in terms of the distribution F is not evident. The following proposition gives an explicit form of this bound in Γ(F,F).

Proposition 3.2.

The best lower bound of τ in Γ(F,F) is
$$\tau_{\min} = 2E\big[F[\phi(X-1)]\big] - 2\sum_{k=-\infty}^{\infty}\xi(k) - 2,$$
where $\xi(k) = [F(k-1)+F[\phi(k-1)]-1]\,[F(k)+F[\phi(k-1)-1]-1]\,\mathbb{1}[\phi(k)<\phi(k-1)]$.

Proof.

From (2.12) and (2.13), we observe that
$$S_{ij}T_{ij} = H^2(i,j) + H^2(i-1,j-1) - H^2(i-1,j) - H^2(i,j-1) + 2H(i,j)H(i-1,j-1) - 2H(i-1,j)H(i,j-1),$$
$$\sum_{i=-\infty}^{\infty}\sum_{j=-\infty}^{\infty}\big[H^2(i,j) + H^2(i-1,j-1) - H^2(i-1,j) - H^2(i,j-1)\big] = 1.$$
Consider now H(i,j) = sup[F(i)+F(j)−1, 0] and write H⁺(i,j) = F(i)+F(j)−1. From (2.10), we get
$$\tau_{\min} = 2\sum_{i=-\infty}^{\infty}\sum_{j=\phi(i-1)+1}^{\infty}H^+(i,j)H^+(i-1,j-1) - 2\sum_{i=-\infty}^{\infty}\sum_{j=\max[\phi(i-1),\,\phi(i)+1]}^{\infty}H^+(i-1,j)H^+(i,j-1).$$
Using the fact that H⁺(i,j)H⁺(i−1,j−1) − H⁺(i−1,j)H⁺(i,j−1) = −pipj, we have
$$\tau_{\min} = -2\sum_{i=-\infty}^{\infty}p_i\Big[\sum_{j=\phi(i-1)+1}^{\infty}p_j\Big]\mathbb{1}[\phi(i)=\phi(i-1)] - 2\sum_{i=-\infty}^{\infty}p_i\Big[\sum_{j=\phi(i-1)+1}^{\infty}p_j\Big]\mathbb{1}[\phi(i)<\phi(i-1)] - 2\sum_{i=-\infty}^{\infty}[F(i-1)+F(\phi(i-1))-1][F(i)+F(\phi(i-1)-1)-1]\,\mathbb{1}[\phi(i)<\phi(i-1)],$$
which is equivalent to
$$\tau_{\min} = -2\sum_{i=-\infty}^{\infty}p_i\big[1-F[\phi(i-1)]\big] - 2\sum_{i=-\infty}^{\infty}r(i)$$
with r(i) = [F(i−1)+F(φ(i−1))−1][F(i)+F(φ(i−1)−1)−1] 𝟙[φ(i)<φ(i−1)], which completes the proof.

Remark 3.3.

Let Fn,p be a binomial distribution with parameters n and p, and denote the extreme values of τ and ρ in Γ(Fn,p, Fn,p) by τmax(n,p) and ρmax(n,p). One can show the following symmetry properties, namely,
$$\tau_{\max}(n,p) = \tau_{\max}(n,1-p), \qquad \rho_{\max}(n,p) = \rho_{\max}(n,1-p).$$
Indeed, since Fn,p(k) = 1 − Fn,1−p(n−k−1), the change of index k ↦ n−k, together with the fact that E[F(X)+F(X−1)] = 1, gives from (3.12)
$$\tau_{\max}(n,p) = 2\sum_{k=0}^{n}F_{n,p}(k-1)\big[F_{n,p}(k)-F_{n,p}(k-1)\big] = 2\sum_{k=0}^{n}F_{n,1-p}(k-1)\big[F_{n,1-p}(k)-F_{n,1-p}(k-1)\big] = \tau_{\max}(n,1-p).$$
Similar arguments provide ρmax(n,p) = ρmax(n,1−p).
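The symmetry of Remark 3.3 can be checked numerically (our sketch, using τmax = 2E[F(X−1)] with binomial marginals):

```python
# Checking the symmetry tau_max(n, p) = tau_max(n, 1 - p) of Remark 3.3
# (our sketch), with tau_max = 2 E[F(X - 1)] for a binomial marginal.
from math import comb

def binom(n, p):
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    cdf, acc = [], 0.0
    for q in pmf:
        acc += q
        cdf.append(acc)
    return pmf, cdf

def tau_max(n, p):
    pmf, cdf = binom(n, p)
    return 2 * sum(pmf[k] * (cdf[k - 1] if k > 0 else 0.0) for k in range(n + 1))

for n, p in [(5, 0.3), (10, 0.2), (7, 0.45)]:
    assert abs(tau_max(n, p) - tau_max(n, 1 - p)) < 1e-12
print("symmetry holds")
```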

We now examine the symmetry of the ranges of τ and ρ associated with discrete data. In the continuous case, it is well known that the ranges of these parameters are symmetric, that is, τmax = −τmin and ρmax = −ρmin. This conclusion is in general invalid for noncontinuous data. In order to clarify this question, we consider again the space Γ(F,F) with a discrete distribution F. We present below situations which ensure that ρmax = −ρmin and τmax = −τmin. As a consequence of Propositions 3.1 and 3.2 and (3.12), one can establish the following results.

Corollary 3.4.

In the space Γ(F,F), if E[F²(X)] = E[ψ(X)] and E[F²(X−1)] = E[ψ(X−1)], then ρmax = −ρmin.

Corollary 3.5.

In the space Γ(F,F), if φ(i) = φ(i−1) for all i ∈ ℤ and E[F(X)] = E[F(φ(X))], then τmax = −τmin.

4. Empirical Copulas Viewed as a Discrete Distribution

It is well recognized that copulas provide a flexible approach to modeling the joint behavior of random variables. In fact, this approach represents a bivariate distribution as a function of its univariate marginals through a linking function called a copula. Specifically, if H is the distribution function of a bivariate random vector (X,Y) with continuous marginals, then Sklar's theorem ensures that there exists a unique copula C : [0,1]² → [0,1] such that, for all (x,y) ∈ ℝ²,

$$H(x,y) = C[F(x), G(y)].$$
Hence, C is a bivariate distribution function with uniform marginals on [0,1] that captures all the information about the dependence among the components of (X,Y). For a comprehensive introduction to copulas, the reader is referred to the monograph by Nelsen.

Suppose that the random sample (X1,Y1), …, (Xn,Yn) is given from some pair (X,Y) of continuous variables with copula C(u,v). To estimate the copula C, Deheuvels proposes the so-called empirical copula defined by

$$C_n(u,v) = \frac{1}{n}\sum_{i=1}^{n}\mathbb{1}\big(F_n(X_i)\le u,\; G_n(Y_i)\le v\big),$$
where Fn and Gn are the empirical distribution functions of X and Y based on the samples X1, …, Xn and Y1, …, Yn, given by

$$F_n(t) = \frac{1}{n}\sum_{i=1}^{n}\mathbb{1}(X_i\le t), \qquad G_n(t) = \frac{1}{n}\sum_{i=1}^{n}\mathbb{1}(Y_i\le t).$$
Let Ri be the rank of Xi among the sample X1, …, Xn, and let Ti stand for the rank of Yi among the sample Y1, …, Yn. Observe that Cn is a function of the ranks (R1,T1), …, (Rn,Tn), because Fn(Xi) = Ri/n and Gn(Yi) = Ti/n, i = 1, …, n; namely,

$$C_n(u,v) = \frac{1}{n}\sum_{i=1}^{n}\mathbb{1}\Big(\frac{R_i}{n}\le u,\; \frac{T_i}{n}\le v\Big).$$
From this representation, one can consider Cn(u,v) as a discrete bivariate distribution with uniform marginals taking values in the set {1/n, 2/n, …, 1}. Observe that

$$C_n(u,v) = C_n\Big(\frac{i}{n},\frac{j}{n}\Big) \quad\text{for } (u,v) \in \Big[\frac{i}{n},\frac{i+1}{n}\Big)\times\Big[\frac{j}{n},\frac{j+1}{n}\Big).$$
Now, one can observe that Cn is not a copula. Indeed, Cn(u,1) = ⌊nu⌋/n ≠ u in general, where ⌊nu⌋ denotes the integer part of nu.

Our goal in this section is to transform the empirical copula in order to obtain a new estimator Cn* which is a copula. To this end, let (Zn,Wn) be a discrete random vector with distribution function Cn defined in (4.2). The idea is to transform the uniform discrete random variables Zn and Wn into continuous variables Zn* and Wn* by defining

$$Z_n^* = Z_n - U_n, \qquad W_n^* = W_n - V_n,$$
where Un and Vn are independent and uniformly distributed on [0, 1/n]. We also suppose that Zn and Un (resp., Wn and Vn) are independent. The next result shows that the distribution function of the continuous version (Zn*, Wn*) is a copula.

Proposition 4.1.

The distribution function Cn* of the random vector (Zn*, Wn*) is a copula which may be expressed in terms of the empirical copula as follows: for u, v ∈ [0,1],
$$C_n^*(u,v) = [1-nu+\lfloor nu\rfloor][1-nv+\lfloor nv\rfloor]\,C_n\Big(\frac{\lfloor nu\rfloor}{n},\frac{\lfloor nv\rfloor}{n}\Big) + [nu-\lfloor nu\rfloor][1-nv+\lfloor nv\rfloor]\,C_n\Big(\frac{\lfloor nu\rfloor+1}{n},\frac{\lfloor nv\rfloor}{n}\Big) + [1-nu+\lfloor nu\rfloor][nv-\lfloor nv\rfloor]\,C_n\Big(\frac{\lfloor nu\rfloor}{n},\frac{\lfloor nv\rfloor+1}{n}\Big) + [nu-\lfloor nu\rfloor][nv-\lfloor nv\rfloor]\,C_n\Big(\frac{\lfloor nu\rfloor+1}{n},\frac{\lfloor nv\rfloor+1}{n}\Big),$$
where ⌊x⌋ is the integer part of x.

Proof.

For any u ∈ [i/n, (i+1)/n], i = 0, …, n−1, one sees from the definition of Zn* that
$$P(Z_n^*\le u) = \frac{1}{n}\sum_{k=1}^{n}P\Big(U_n \ge \frac{k}{n}-u\Big),$$
and by using the fact that
$$P\Big(U_n \ge \frac{k}{n}-u\Big) = \mathbb{1}(k\le i) + (nu-i)\,\mathbb{1}(k=i+1),$$
it follows that P(Zn* ≤ u) = u, which ensures that Zn* is uniformly distributed on [0,1]. Similar arguments imply that Wn* is also uniformly distributed on [0,1], so that Cn* is a copula.

Now, we show the expression of Cn* given in (4.7). Let (u,v) be in the set [i/n, (i+1)/n] × [j/n, (j+1)/n], i, j = 0, …, n−1. In view of relations (4.6), one has
$$C_n^*(u,v) = P(Z_n^*\le u,\, W_n^*\le v) = \sum_{k=1}^{n}\sum_{p=1}^{n}P\Big(U_n\ge\frac{k}{n}-u\Big)P\Big(V_n\ge\frac{p}{n}-v\Big)P\Big(Z_n=\frac{k}{n},\, W_n=\frac{p}{n}\Big)$$
$$= \sum_{k=1}^{n}\sum_{p=1}^{n}\big[\mathbb{1}(k\le i)+(nu-i)\mathbb{1}(k=i+1)\big]\big[\mathbb{1}(p\le j)+(nv-j)\mathbb{1}(p=j+1)\big]P\Big(Z_n=\frac{k}{n},\, W_n=\frac{p}{n}\Big).$$
After simplifications, one obtains
$$C_n^*(u,v) = C_n\Big(\frac{i}{n},\frac{j}{n}\Big) + (nu-i)\Big[C_n\Big(\frac{i+1}{n},\frac{j}{n}\Big)-C_n\Big(\frac{i}{n},\frac{j}{n}\Big)\Big] + (nv-j)\Big[C_n\Big(\frac{i}{n},\frac{j+1}{n}\Big)-C_n\Big(\frac{i}{n},\frac{j}{n}\Big)\Big] + (nu-i)(nv-j)\Big[C_n\Big(\frac{i+1}{n},\frac{j+1}{n}\Big)-C_n\Big(\frac{i+1}{n},\frac{j}{n}\Big)-C_n\Big(\frac{i}{n},\frac{j+1}{n}\Big)+C_n\Big(\frac{i}{n},\frac{j}{n}\Big)\Big],$$
which can be rewritten as
$$C_n^*(u,v) = [1-nu+i][1-nv+j]\,C_n\Big(\frac{i}{n},\frac{j}{n}\Big) + [nu-i][1-nv+j]\,C_n\Big(\frac{i+1}{n},\frac{j}{n}\Big) + [1-nu+i][nv-j]\,C_n\Big(\frac{i}{n},\frac{j+1}{n}\Big) + [nu-i][nv-j]\,C_n\Big(\frac{i+1}{n},\frac{j+1}{n}\Big),$$
and hence the result is obtained, since i = ⌊nu⌋ and j = ⌊nv⌋.

Finally, one concludes that it is convenient to estimate the theoretical copula C by using the proposed estimator Cn* instead of the empirical copula. The reason is that Cn* is a copula which uses all the points (i/n, j/n), (i/n, (j+1)/n), ((i+1)/n, j/n), and ((i+1)/n, (j+1)/n) in order to estimate C on [i/n, (i+1)/n] × [j/n, (j+1)/n].
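The construction of this section can be sketched in a few lines (our illustrative code, assuming a sample with no ties): Cn is evaluated from the ranks, and Cn* bilinearly interpolates Cn between grid points as in Proposition 4.1. The check below is precisely the property that fails for Cn and holds for Cn*: uniform margins.

```python
# A sketch of the estimator of Section 4 (illustrative, not the authors' code,
# and assuming no ties): C_n is the empirical copula from the ranks, and
# C_n^* interpolates it bilinearly as in Proposition 4.1.
from math import floor

def make_Cn(xs, ys):
    n = len(xs)
    rx = {v: r + 1 for r, v in enumerate(sorted(xs))}  # ranks 1..n (no ties)
    ry = {v: r + 1 for r, v in enumerate(sorted(ys))}
    def Cn(u, v):
        return sum(1 for x, y in zip(xs, ys)
                   if rx[x] / n <= u and ry[y] / n <= v) / n
    return Cn, n

def make_Cn_star(Cn, n):
    def Cn_star(u, v):
        i, j = min(floor(n * u), n - 1), min(floor(n * v), n - 1)
        a, b = n * u - i, n * v - j        # local bilinear coordinates
        return ((1 - a) * (1 - b) * Cn(i / n, j / n)
                + a * (1 - b) * Cn((i + 1) / n, j / n)
                + (1 - a) * b * Cn(i / n, (j + 1) / n)
                + a * b * Cn((i + 1) / n, (j + 1) / n))
    return Cn_star

xs = [0.3, 1.2, 0.7, 2.5, 1.9]
ys = [0.8, 1.1, 0.2, 2.2, 1.4]
Cn, n = make_Cn(xs, ys)
Cn_star = make_Cn_star(Cn, n)
for u in [0.0, 0.13, 0.5, 0.77, 1.0]:
    assert abs(Cn_star(u, 1.0) - u) < 1e-12   # uniform first margin
    assert abs(Cn_star(1.0, u) - u) < 1e-12   # uniform second margin
print("Cn* has uniform margins")
```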

5. Understanding Dependence Structure of the Bivariate Poisson Distribution

Our purpose in this section is to study dependence properties of the bivariate Poisson distribution H of a random couple (X,Y) and the relationship between τ and ρ and the parameters of H. Several bivariate Poisson distributions have been proposed in the statistical literature; see, for example, S. Kocherlakota and K. Kocherlakota. In applied statistics, however, the focus is on the trivariate reduction method described by Johnson et al., which constructs the bivariate Poisson distribution from three independent random variables X1, X2, and Z, distributed as Poisson with parameters λ1, λ2, and α, respectively:

$$X = X_1 + Z, \qquad Y = X_2 + Z. \qquad (5.1)$$
The cumulative distribution of (X,Y) is given by

$$H_{\alpha,\lambda_1,\lambda_2}(i,j) = \sum_{k=0}^{i\wedge j} F_{\lambda_1}(i-k)\,F_{\lambda_2}(j-k)\,\frac{\alpha^k e^{-\alpha}}{k!}, \qquad (5.2)$$
where Fλi denotes the cdf of Xi, i = 1, 2. We notice that X and Y are Poisson with means λ1+α and λ2+α, respectively. Note that the covariance and the correlation between X and Y are given by

$$\operatorname{cov}(X,Y) = \alpha, \qquad \operatorname{corr}(X,Y) = \frac{\alpha}{\sqrt{(\lambda_1+\alpha)(\lambda_2+\alpha)}},$$
which are positive and nondecreasing functions of α.
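The trivariate reduction (5.1) is straightforward to simulate; the sketch below (our code, with arbitrary parameter values and a simple Poisson sampler) recovers cov(X,Y) ≈ α empirically:

```python
# Simulating the trivariate reduction (5.1) (our code; parameter values are
# arbitrary): X = X1 + Z and Y = X2 + Z share the common Poisson(alpha)
# component Z, so the sample covariance should be close to alpha.
import random
from math import exp

def poisson(lam, rng):
    # Knuth's multiplication method; adequate for small lam
    L, k, p = exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
lam1, lam2, alpha, n = 1.0, 1.5, 1.0, 200_000
xs, ys = [], []
for _ in range(n):
    z = poisson(alpha, rng)
    xs.append(poisson(lam1, rng) + z)
    ys.append(poisson(lam2, rng) + z)

mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(round(cov, 2))  # close to alpha = 1.0
```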

To study further the relationships between α and each of τ and ρ for the bivariate Poisson model, we propose an alternative parametrization, which consists of fixing the marginal parameters α+λ1 = m1 and α+λ2 = m2. In this context, the cdf (5.2) becomes

$$H_{\alpha}(i,j) = \sum_{k=0}^{i\wedge j} F_{m_1-\alpha}(i-k)\,F_{m_2-\alpha}(j-k)\,\frac{\alpha^k e^{-\alpha}}{k!}. \qquad (5.4)$$
As a consequence of the above representation, we can see {Hα} as a family of bivariate Poisson models with fixed marginals, which are univariate Poisson with means m1 and m2, respectively. This means that the set {Hα}, 0 ≤ α ≤ m1∧m2, is included in the particular Fréchet space Γ(Fm1, Fm2), where Fmi denotes the cdf of a Poisson model with mean mi, i = 1, 2. The advantage of the parametrization (5.4) over (5.2) is that the coefficient α may be interpreted as a dependence parameter within the family {Hα}.
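The cdf (5.4) can be evaluated directly by truncating the Poisson cdf (our sketch, with m1 = m2 = 2 as in the illustration later in this section). Two sanity checks: at α = 0 only the k = 0 term survives, giving the independence cdf Fm1(i)Fm2(j), and Hα(i,j) is nondecreasing in α, in line with Proposition 5.1 below.

```python
# Direct evaluation of (5.4) (our sketch, m1 = m2 = 2): the Poisson cdf is
# computed by summing the pmf, and H_alpha by summing over the common index k.
from math import exp, factorial

def pois_pmf(m, k):
    return exp(-m) * m**k / factorial(k)

def pois_cdf(m, x):
    return 0.0 if x < 0 else sum(pois_pmf(m, k) for k in range(int(x) + 1))

def H(alpha, i, j, m1=2.0, m2=2.0):
    return sum(pois_cdf(m1 - alpha, i - k) * pois_cdf(m2 - alpha, j - k)
               * pois_pmf(alpha, k) for k in range(min(i, j) + 1))

# alpha = 0: only k = 0 contributes (0.0**0 == 1.0), giving independence
assert abs(H(0.0, 2, 3) - pois_cdf(2.0, 2) * pois_cdf(2.0, 3)) < 1e-12
# H_alpha(i, j) is nondecreasing in alpha
for i, j in [(0, 0), (1, 2), (3, 3)]:
    vals = [H(a, i, j) for a in (0.0, 0.5, 1.0, 1.5, 2.0)]
    assert all(v1 <= v2 + 1e-12 for v1, v2 in zip(vals, vals[1:]))
print("checks passed")
```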

Now, let τα and ρα be Kendall's τ and Spearman's ρ associated with the distribution Hα. The result below establishes the monotonicity of τα and ρα as functions of α.

Proposition 5.1.

Let Hα1 and Hα2 be two cdf's of the set {Hα}. Then,
$$\alpha_1 \le \alpha_2 \;\Longrightarrow\; H_{\alpha_1} \le H_{\alpha_2}, \qquad (5.5)$$
and consequently,
$$\alpha_1 \le \alpha_2 \;\Longrightarrow\; \tau_{\alpha_1} \le \tau_{\alpha_2}, \quad \rho_{\alpha_1} \le \rho_{\alpha_2}. \qquad (5.6)$$

Proof.

From (5.4),
$$\frac{\partial H_\alpha(i,j)}{\partial\alpha} = \sum_{k=0}^{i\wedge j}\frac{\partial F_{m_1-\alpha}(i-k)}{\partial\alpha}F_{m_2-\alpha}(j-k)\frac{\alpha^k e^{-\alpha}}{k!} + \sum_{k=0}^{i\wedge j}F_{m_1-\alpha}(i-k)\frac{\partial F_{m_2-\alpha}(j-k)}{\partial\alpha}\frac{\alpha^k e^{-\alpha}}{k!} + \sum_{k=0}^{i\wedge j}F_{m_1-\alpha}(i-k)F_{m_2-\alpha}(j-k)\Big[\frac{\alpha^{k-1}e^{-\alpha}}{(k-1)!}-\frac{\alpha^k e^{-\alpha}}{k!}\Big], \qquad (5.7)$$
and using the fact that
$$\frac{\partial F_{m_1-\alpha}(i-k)}{\partial\alpha} = F_{m_1-\alpha}(i-k)-F_{m_1-\alpha}(i-k-1), \qquad \frac{\partial F_{m_2-\alpha}(j-k)}{\partial\alpha} = F_{m_2-\alpha}(j-k)-F_{m_2-\alpha}(j-k-1), \qquad (5.8)$$
(5.7) becomes, upon simplifications,
$$\frac{\partial H_\alpha(i,j)}{\partial\alpha} = \sum_{k=0}^{i\wedge j}\frac{(m_1-\alpha)^{i-k}e^{-(m_1-\alpha)}}{(i-k)!}\,\frac{(m_2-\alpha)^{j-k}e^{-(m_2-\alpha)}}{(j-k)!}\,\frac{\alpha^k e^{-\alpha}}{k!} \;\ge\; 0. \qquad (5.9)$$
Therefore, (5.9) together with Proposition 2.1 provides (5.5) and (5.6).

Much statistical research has focused on concepts of positive dependence for bivariate distributions, for example, the right-tail increasing and positive quadrant dependence properties, which are widely used in the actuarial literature. There are natural relationships between dependence properties and measures of concordance. An interesting property of positive dependence is the concept of positive quadrant dependence (PQD), defined as follows: let (X,Y) be a random couple valued in ℤ×ℤ with joint cdf H and marginals F and G. These random variables are said to be positively quadrant dependent if, and only if, for all (x,y),

$$H(x,y) \;\ge\; F(x)G(y).$$
The following corollary is a direct consequence of the previous result.

Corollary 5.2.

The family {Hα} is positively quadrant dependent.

Proof.

Since Hα is a nondecreasing function of α, we have H0 ≤ Hα for all 0 ≤ α ≤ m1∧m2. Now, from (5.4), H0(i,j) = Fm1(i)Fm2(j) for all i, j. Therefore the family {Hα} is PQD. Consequently, τα ≥ 0, ρα ≥ 0, and 3τα ≥ ρα for all 0 ≤ α ≤ m1∧m2.

Remark 5.3.

When m1 = m2 = m, the upper bound of the family {Hα} is given by the cdf Hm, and using (5.4), we then obtain that Hm(i,j) = Fm(i∧j) = min[Fm(i), Fm(j)] for all i, j, which is the upper Fréchet bound.

In order to appreciate the corrections of τ and ρ given by (2.14), we consider the family of Poisson model {Hα} with marginal parameters m1=m2=2. Using (3.2) and (3.12) with Fm instead of F, we obtain that ρmax=0.951 and τmax=0.792. Table 1 provides τα and ρα with their corrections τα,c and ρα,c for chosen values of α.

From Table 1, we note that the differences Dτ,α = τα,c − τα and Dρ,α = ρα,c − ρα are increasing as functions of the dependence parameter α. This observation is true in general, because Dτ,α and Dρ,α can be expressed as

Table 1: τα, ρα, τα,c, and ρα,c for the Poisson model.

α	τα	τα,c	ρα	ρα,c
0.2 0.059 0.075 0.089 0.094
0.4 0.120 0.152 0.180 0.189
0.6 0.183 0.231 0.272 0.286
0.8 0.248 0.313 0.365 0.383
1.0 0.316 0.398 0.459 0.482
1.2 0.388 0.490 0.554 0.582
1.4 0.467 0.589 0.651 0.684
1.6 0.556 0.701 0.749 0.787
1.8 0.660 0.832 0.849 0.892

$$D_{\tau,\alpha} = \frac{(1-\tau_{\max})\,\tau_\alpha}{\tau_{\max}}, \qquad D_{\rho,\alpha} = \frac{(1-\rho_{\max})\,\rho_\alpha}{\rho_{\max}},$$
which shows that these differences are indeed increasing with α.
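The bounds quoted in this section can be reproduced numerically (our sketch; sums are truncated at N = 30, which is ample for a Poisson marginal with mean 2): τmax = 2E[F(X−1)] and ρmax = 3E[1 − F²(X) − F²(X−1)] evaluate to approximately 0.79 and 0.951, matching the values used for Table 1.

```python
# Evaluating the Poisson(2) bounds used in this section (our sketch):
# tau_max = 2 E[F(X-1)] and rho_max = 3 E[1 - F^2(X) - F^2(X-1)],
# with all sums truncated at N = 30 (negligible tail mass for mean 2).
from math import exp, factorial

N, m = 30, 2.0
pmf = [exp(-m) * m**k / factorial(k) for k in range(N + 1)]
cdf, acc = [], 0.0
for q in pmf:
    acc += q
    cdf.append(acc)
F = lambda x: 0.0 if x < 0 else cdf[min(x, N)]

tau_max = 2 * sum(pmf[k] * F(k - 1) for k in range(N + 1))
rho_max = 3 * sum(pmf[k] * (1 - F(k) ** 2 - F(k - 1) ** 2) for k in range(N + 1))
print(tau_max, rho_max)  # approximately 0.79 and 0.951, as quoted above
```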

Acknowledgment

The second author acknowledges the financial support of the Natural Sciences and Engineering Research Council of Canada.

References

W. H. Kruskal, "Ordinal measures of association," Journal of the American Statistical Association, vol. 53, pp. 814–861, 1958.
E. L. Lehmann, "Some concepts of dependence," Annals of Mathematical Statistics, vol. 37, pp. 1137–1153, 1966.
A. Rényi, "On measures of dependence," Acta Mathematica Academiae Scientiarum Hungaricae, vol. 10, pp. 441–451, 1959.
B. Schweizer and E. F. Wolff, "On nonparametric measures of dependence for random variables," The Annals of Statistics, vol. 9, no. 4, pp. 879–885, 1981.
R. B. Nelsen, An Introduction to Copulas, Springer Series in Statistics, Springer, New York, NY, USA, 2nd edition, 2006.
P. L. Conti, "On some descriptive aspects of measures of monotone dependence," Metron, vol. 51, no. 3-4, pp. 43–60, 1993.
A. Tajar, M. Denuit, and Ph. Lambert, "Copula-type representation for random couples with Bernoulli margins," Discussion Paper 0118, Institute of Statistics, U.C.L., Leuven, Belgium, 2001.
M. Mesfioui and A. Tajar, "On the properties of some nonparametric concordance measures in the discrete case," Journal of Nonparametric Statistics, vol. 17, no. 5, pp. 541–554, 2005.
M. Denuit and P. Lambert, "Constraints on concordance measures in bivariate discrete data," Journal of Multivariate Analysis, vol. 93, no. 1, pp. 40–57, 2005.
J. Nešlehová, "On rank correlation measures for non-continuous random variables," Journal of Multivariate Analysis, vol. 98, no. 3, pp. 544–567, 2007.
W. Hoeffding, "Masstabinvariante Korrelationstheorie," Schriften des Mathematischen Instituts und des Instituts für Angewandte Mathematik der Universität Berlin, vol. 5, pp. 179–233, 1940.
T. Kowalczyk and M. Niewiadomska-Bugaj, "Grade correspondence analysis based on Kendall's tau," in Proceedings of the Conference of the International Federation of Classification Societies (IFCS '98), pp. 182–185, Rome, Italy, July 1998.
T. Yanagimoto and M. Okamoto, "Partial orderings of permutations and monotonicity of a rank correlation statistic," Annals of the Institute of Statistical Mathematics, vol. 21, pp. 489–506, 1969.
H. Joe, Multivariate Models and Dependence Concepts, vol. 73 of Monographs on Statistics and Applied Probability, Chapman & Hall, London, UK, 1997.
A. H. Tchen, "Inequalities for distributions with given marginals," The Annals of Probability, vol. 8, no. 4, pp. 814–827, 1980.
P. Capéraà and C. Genest, "Spearman's ρ is larger than Kendall's τ for positively dependent random variables," Journal of Nonparametric Statistics, vol. 2, no. 2, pp. 183–194, 1993.
M. Fréchet, "Sur les tableaux de corrélation dont les marges sont données," Annales de l'Université de Lyon A, vol. 14, pp. 53–77, 1951.
M. Sklar, "Fonctions de répartition à n dimensions et leurs marges," Publications de l'Institut de Statistique de l'Université de Paris, vol. 8, pp. 229–231, 1959.
P. Deheuvels, "La fonction de dépendance empirique et ses propriétés. Un test non paramétrique d'indépendance," Bulletin de la Classe des Sciences, Académie Royale de Belgique, vol. 65, no. 6, pp. 274–292, 1979.
S. Kocherlakota and K. Kocherlakota, Bivariate Discrete Distributions, vol. 132 of Statistics: Textbooks and Monographs, Marcel Dekker, New York, NY, USA, 1992.
N. L. Johnson, S. Kotz, and N. Balakrishnan, Discrete Multivariate Distributions, Wiley Series in Probability and Statistics, John Wiley & Sons, New York, NY, USA, 1997.
J. Dhaene and M. J. Goovaerts, "Dependency of risks and loss orders," ASTIN Bulletin, vol. 26, pp. 201–212, 1996.