We study characterization problems in probability. Using the characteristic function of an additive convolution, we generalize some known characterizations of the normal distribution to stable distributions. More precisely, if the distribution of a linear form depends only on a sum of powers of its coefficients, then we obtain symmetric stable distributions.

1. Introduction

The original motivation for this paper comes from a desire to understand the characterizations of the normal distribution shown in [1]. In that paper, the author characterizes the normal distribution through a certain invariance of the noncentral chi-square distribution. More precisely, let the statistic $\sum_{i=1}^n a_iX_i+Y+Z$ have a distribution which depends only on $\sum_{i=1}^n a_i^2$, for all $a_i\in\mathbb{R}$ and some $1\le m<n$, where $(X_1,\dots,X_m,Y)$ and $(X_{m+1},\dots,X_n,Z)$ are independent random vectors with all moments finite and the $X_i$ are nondegenerate; then the $X_i$ are independent and share the same normal distribution with mean zero, and $\mathrm{cov}(X_i,Y)=\mathrm{cov}(X_i,Z)=0$ for $i\in\{1,\dots,n\}$. The proof of that theorem proceeds in two steps: first the result is proved for two random variables, and then the general case is deduced from properties of the multidimensional normal distribution. The additional moment assumption stems from the author's use of the method of cumulants. An alternative, more direct method of proof allows us to weaken some of the technical assumptions used in the reference above and to generalize the result to symmetric stable distributions. The paper is organized as follows. In Section 2 we review basic facts about characteristic functions. In Section 3 we state and prove the main results.

2. A Characteristic Function

In this paper we denote by $\mu_X(dx)$ the probability distribution of a random variable $X$. If $X$ is a random variable defined on a probability space $(\Omega,\Sigma,P)$, then the expected value of $X$, denoted by $E(X)$, is defined as the Lebesgue integral
$$E(X)=\int_\Omega X(\omega)\,P(d\omega). \tag{1}$$
A characteristic function is simply the Fourier transform, in probabilistic language. The characteristic function of a probability measure $\mu$ on $\mathbb{R}$ is the function $\varphi_\mu:\mathbb{R}\to\mathbb{C}$ given by
$$\varphi_\mu(t)=\int_{\mathbb{R}}\exp(itx)\,\mu(dx). \tag{2}$$
When we speak of the characteristic function $\varphi_X$ of a random variable $X$, we mean the characteristic function $\varphi_{\mu_X}$ of its distribution $\mu_X$. Note, moreover, that
$$\varphi_X(t)=E\exp(itX). \tag{3}$$
It is not accidental that the characteristic function encodes the most important information about the associated random variable. The underlying reason resides in the following three important properties:
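As a quick numerical illustration of definition (3) (not part of the argument; a sketch assuming NumPy is available), one can estimate $E\exp(itX)$ by a sample mean and compare it with the exact characteristic function of the standard normal law:

```python
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(200_000)  # X ~ N(0, 1)

def empirical_cf(x, t):
    # Monte Carlo estimate of phi_X(t) = E exp(itX), cf. (3)
    return np.mean(np.exp(1j * t * x))

# The exact characteristic function of N(0,1) is exp(-t^2 / 2)
max_err = max(abs(empirical_cf(samples, t) - np.exp(-t**2 / 2))
              for t in (0.5, 1.0, 2.0))
```

With $2\times 10^5$ samples the Monte Carlo error is of order $10^{-3}$, so `max_err` is small.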

The Gaussian distribution $N(\mu,\sigma)$ has the characteristic function $\varphi(t)=\exp(i\mu t-\sigma^2t^2/2)$.

The symmetric $\alpha$-stable distribution has the characteristic function $\varphi(t)=\exp(-c|t|^\alpha)$, where $\alpha\in(0,2]$ and $c>0$. For special values of the parameter $\alpha$ we get:

the upper bound $\alpha=2$, corresponding to the normal distribution,

$\alpha=1$, corresponding to the Cauchy distribution,

$\alpha=1/2$, usually associated with the Lévy distribution (strictly speaking, the standard Lévy law is the totally skewed $\alpha=1/2$ stable distribution rather than the symmetric one).

Random variables $X_1,\dots,X_n$ are independent if and only if, for all $a_1,\dots,a_n\in\mathbb{R}$, the characteristic function of the linear combination $a_1X_1+\dots+a_nX_n$ factorizes:
$$E\exp(ia_1X_1+\dots+ia_nX_n)=E\exp(ia_1X_1)\cdots E\exp(ia_nX_n). \tag{4}$$
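The second and third properties above can both be checked empirically. The following sketch (illustrative only, assuming NumPy) verifies that the standard Cauchy law has characteristic function $\exp(-|t|)$, i.e., it is symmetric $1$-stable with $c=1$, and that the factorization (4) holds for independent variables but fails for dependent ones:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400_000

def cf(x, t):
    # empirical characteristic function of the sample x at the point t
    return np.mean(np.exp(1j * t * x))

# Property 2: standard Cauchy is symmetric 1-stable, phi(t) = exp(-|t|)
cauchy = rng.standard_cauchy(n)
err_stable = max(abs(cf(cauchy, t) - np.exp(-abs(t))) for t in (-2.0, 0.5, 1.0))

# Property 3: the joint cf factorizes iff the variables are independent
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)              # independent of x1
y = x1 + 0.5 * rng.standard_normal(n)    # dependent on x1
a, b = 1.0, -0.7
gap_indep = abs(np.mean(np.exp(1j * (a * x1 + b * x2))) - cf(x1, a) * cf(x2, b))
gap_dep = abs(np.mean(np.exp(1j * (a * x1 + b * y))) - cf(x1, a) * cf(y, b))
```

Here `gap_indep` is of Monte Carlo order, while `gap_dep` stays bounded away from zero, reflecting the "only if" direction of (4).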

3. The Characterization Theorem

The main result of this paper is the following characterization of the symmetric α-stable distribution in terms of independent random vectors.

Theorem 1.

Let $(X_1,\dots,X_m)$ and $(X_{m+1},\dots,X_n)$ be independent random vectors, where the distribution of each $X_i$ is not a Dirac measure, and let the statistic $\sum_{i=1}^n a_iX_i+Z$ have a distribution which depends only on $\sum_{i=1}^n|a_i|^\alpha$, for all $a_1,\dots,a_n\in\mathbb{R}$, with $1\le m<n$, $Z$ an arbitrary random variable, and $\alpha\in(0,2]$. Then the $X_i$ are independent and share the same symmetric $\alpha$-stable distribution. Additionally, if $Z\in L^2(\Omega,\Sigma,P)$ and $\alpha=2$, then $\mathrm{cov}(X_i,Z)=0$ for $i\in\{1,\dots,n\}$.
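The invariance in the hypothesis can be seen numerically for $\alpha=1$ (an illustration, assuming NumPy; not part of the proof): for i.i.d. standard Cauchy variables, any linear form $\sum a_iX_i$ is Cauchy with scale $\sum|a_i|$, so two coefficient vectors with the same $\ell^1$-norm give the same distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_cauchy((3, 400_000))  # X1, X2, X3 iid standard Cauchy, alpha = 1

a = np.array([1.0, 0.5, 0.5])   # sum |a_i|^alpha = 2
b = np.array([-2.0, 0.0, 0.0])  # also sum |b_i|^alpha = 2
Sa, Sb = a @ X, b @ X

t = np.linspace(-2, 2, 41)
cf_a = np.array([np.mean(np.exp(1j * s * Sa)) for s in t])
cf_b = np.array([np.mean(np.exp(1j * s * Sb)) for s in t])

# Both linear forms should have characteristic function exp(-2|t|)
gap = (np.max(np.abs(cf_a - np.exp(-2 * np.abs(t))))
       + np.max(np.abs(cf_b - np.exp(-2 * np.abs(t)))))
```

Both empirical characteristic functions agree with $\exp(-2|t|)$ up to Monte Carlo error, even though the coefficient vectors differ.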

Proof.

We write $\Theta=r(\theta_1,\theta_2,\dots,\theta_n)$, where $|\theta_1|^\alpha+|\theta_2|^\alpha+\dots+|\theta_n|^\alpha=1$, and focus on the characteristic function $\varphi_{\sum_{i=1}^n r\theta_iX_i+Z}(t)$ of $\sum_{i=1}^n r\theta_iX_i+Z$. For $r>0$ we have
$$\varphi_{\sum_{i=1}^n r\theta_iX_i+Z}\Big(\frac{t}{r}\Big)=E\,e^{i(\sum_{i=1}^n r\theta_iX_i+Z)t/r}=E\,e^{i(\theta_1X_1+\theta_2X_2+\dots+\theta_nX_n)t+iZt/r}. \tag{5}$$
Since the hypothesis implies that the left-hand side of (5) does not depend on $\Theta/r=\theta=(\theta_1,\theta_2,\dots,\theta_n)$, the limit (which exists by dominated convergence)
$$\lim_{r\to+\infty}\varphi_{\sum_{i=1}^n r\theta_iX_i+Z}\Big(\frac{t}{r}\Big)=E\,e^{i(\theta_1X_1+\theta_2X_2+\dots+\theta_nX_n)t} \tag{6}$$
does not depend on $\theta$ either. In particular, the distribution of the statistic
$$\sum_{i=1}^n a_iX_i=\Big(\sum_{i=1}^n|a_i|^\alpha\Big)^{1/\alpha}\sum_{i=1}^n\frac{a_i}{(\sum_{i=1}^n|a_i|^\alpha)^{1/\alpha}}\,X_i \tag{7}$$
depends only on $\sum_{i=1}^n|a_i|^\alpha$. Let $h(\sum_{i=1}^n|a_i|^\alpha)=E\,e^{i\sum_{i=1}^n a_iX_i}$. By the independence of $(X_1,\dots,X_m)$ and $(X_{m+1},\dots,X_n)$ we may write
$$h\Big(\sum_{i=1}^n|a_i|^\alpha\Big)=E\,e^{i\sum_{i=1}^m a_iX_i}\,E\,e^{i\sum_{i=m+1}^n a_iX_i}. \tag{8}$$
Evaluating (8) first with $a_{m+1}=\dots=a_n=0$ and then with $a_1=\dots=a_m=0$, we get $h(\sum_{i=1}^m|a_i|^\alpha)=E\,e^{i\sum_{i=1}^m a_iX_i}$ and $h(\sum_{i=m+1}^n|a_i|^\alpha)=E\,e^{i\sum_{i=m+1}^n a_iX_i}$, respectively. Substituting this into (8), we see that
$$h\Big(\sum_{i=1}^n|a_i|^\alpha\Big)=h\Big(\sum_{i=1}^m|a_i|^\alpha\Big)\,h\Big(\sum_{i=m+1}^n|a_i|^\alpha\Big). \tag{9}$$

Note that $h(u)$ is continuous on $[0,\infty)$ and, by (9), satisfies the Cauchy exponential equation $h(u+v)=h(u)h(v)$; hence $h(u)=e^{cu}$ (see Aczel [2], page 31), and so $h(\sum_{i=1}^n|a_i|^\alpha)=e^{c\sum_{i=1}^n|a_i|^\alpha}$. Thus we have actually proved that $X_1,\dots,X_n$ have the same symmetric $\alpha$-stable distribution. Since the distributions of the $X_i$ are symmetric stable, the independence of the random variables $X_1,\dots,X_n$ follows from the observation that
$$h\Big(\sum_{i=1}^n|a_i|^\alpha\Big)=E\,e^{i\sum_{i=1}^n a_iX_i}=e^{c\sum_{i=1}^n|a_i|^\alpha}=E\exp(ia_1X_1)\cdots E\exp(ia_nX_n). \tag{10}$$
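The functional-equation step can be spelled out as follows; this is a standard sketch (cf. Aczel [2]) using only (9), continuity, and the definition of $h$.

```latex
% Replacing a by -a leaves \sum_i |a_i|^\alpha unchanged, so
% h(u) = \overline{h(u)}: h is real-valued, which is exactly the
% symmetry of the limiting distribution. Moreover h(0) = 1 and
%   h(u + v) = h(u)\, h(v), \qquad u, v \ge 0,
% so h(u) = h(u/2)^2 \ge 0. A zero h(u_0) = 0 would force
% h(u_0/2^k) = 0 for all k, contradicting h(0) = 1 by continuity;
% hence h > 0 and g(u) = \log h(u) is continuous and additive,
%   g(u + v) = g(u) + g(v),
% which by Cauchy's functional equation gives g(u) = cu. Therefore
\[
  h(u) = e^{cu}, \qquad
  E\, e^{i\sum_{i=1}^{n} a_i X_i}
    = \exp\Big( c \sum_{i=1}^{n} |a_i|^{\alpha} \Big),
\]
% with c \le 0 since |h| \le 1.
```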

If $\alpha=2$ and $E(Z^2)<\infty$, then $\mathrm{cov}(X_i,Z)$ is finite and, by the independence of $X_1,\dots,X_n$, we may write
$$V\Big(\sum_{i=1}^n a_i^2\Big)=\mathrm{Var}\Big(\sum_{i=1}^n a_iX_i+Z\Big)=2\sum_{i=1}^n a_i\,\mathrm{cov}(X_i,Z)+\sum_{i=1}^n a_i^2\,\mathrm{Var}(X_1)+\mathrm{Var}(Z), \tag{11}$$
because $\mathrm{cov}(X_i,X_j)=0$ for $i\neq j$ and $\mathrm{Var}(X_1)=\dots=\mathrm{Var}(X_n)$. This implies that the linear form $\sum_{i=1}^n a_i\,\mathrm{cov}(X_i,Z)$ is constant on the sphere $\{(a_1,\dots,a_n):\sum_{i=1}^n a_i^2=1\}$; comparing a coefficient vector $a$ with $-a$ shows this constant is zero, which gives $\mathrm{cov}(X_i,Z)=0$.
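The variance expansion in (11) is easy to confirm numerically (an illustrative sketch assuming NumPy; the variables and the dependence of $Z$ on $X_1$ below are chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1_000_000
X = rng.standard_normal((3, n))           # independent X1, X2, X3 with Var = 1
Z = 2.0 * X[0] + rng.standard_normal(n)   # Z correlated with X1 only

a = np.array([0.6, 0.8, 0.0])             # sum a_i^2 = 1
lhs = np.var(a @ X + Z)
cov_terms = sum(a[i] * np.cov(X[i], Z)[0, 1] for i in range(3))
rhs = 2 * cov_terms + np.sum(a**2) * 1.0 + np.var(Z)  # Var(X_1) = 1
gap = abs(lhs - rhs)
```

Both sides agree up to sampling error; in the theorem the left-hand side is additionally constrained to depend only on $\sum a_i^2$, which is what forces the covariance terms to vanish.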

The above considerations yield the following results of Ejsmont [1] and Cook [3], respectively.

Corollary 2 (the main result of Ejsmont [1]).

Let $(X_1,\dots,X_m,Y)$ and $(X_{m+1},\dots,X_n,Z)$ be independent random vectors with all moments finite, where the $X_i$ are nondegenerate, and let the statistic $\sum_{i=1}^n a_iX_i+Y+Z$ have a distribution which depends only on $\sum_{i=1}^n a_i^2$, for all $a_i\in\mathbb{R}$ and $1\le m<n$. Then the $X_i$ are independent and share the same normal distribution with mean zero, and $\mathrm{cov}(X_i,Y)=\mathrm{cov}(X_i,Z)=0$ for $i\in\{1,\dots,n\}$.

Corollary 3 (the main result of Cook [3]).

Let $(X_1,\dots,X_m)$ and $(X_{m+1},\dots,X_n)$ be independent random vectors, where the $X_i$ are nondegenerate, and let the statistic $\sum_{i=1}^n(X_i+a_i)^2$ have a distribution which depends only on $\sum_{i=1}^n a_i^2$, for $a_i\in\mathbb{R}$ and $1\le m<n$. Then the $X_i$ are independent and share the same normal distribution with mean zero.

Proof.

If we put $Z=\sum_{i=1}^n X_i^2$ and $\alpha=2$ in Theorem 1, then
$$\sum_{i=1}^m a_iX_i+\sum_{i=m+1}^n a_iX_i+Z=\sum_{i=1}^n\Big(X_i+\frac{a_i}{2}\Big)^2-\frac{1}{4}\sum_{i=1}^n a_i^2. \tag{12}$$
By the hypothesis of Corollary 3 (applied with coefficients $a_i/2$), the right-hand side, and hence the distribution of $\sum_{i=1}^m a_iX_i+\sum_{i=m+1}^n a_iX_i+Z$, depends only on $\sum_{i=1}^n a_i^2$, which by Theorem 1 implies the statement.
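The algebraic identity (12) is exact and can be verified directly (a sketch assuming NumPy; the sample values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.standard_normal(6)   # arbitrary realizations of X_1, ..., X_n
a = rng.standard_normal(6)   # arbitrary coefficients a_1, ..., a_n

# Left side of (12): sum a_i X_i + Z with Z = sum X_i^2
lhs = a @ X + np.sum(X**2)
# Right side of (12)
rhs = np.sum((X + a / 2)**2) - 0.25 * np.sum(a**2)
gap = abs(lhs - rhs)
```

Since (12) holds pointwise, `gap` is zero up to floating-point rounding.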

Here we state the Herschel-Maxwell theorem in modern notation (see, e.g., [4] or [5]). This theorem can also be obtained from Theorem 1 by taking $m=1$, $n=2$, $Z=0$, and $\alpha=2$ (the proof is left to the reader).

Theorem 4.

Let $X$, $Y$ be independent random variables. Then $X$, $Y$ are normally distributed with mean zero if and only if $a_1X+a_2Y$ is distributed identically to $X$ for every $(a_1,a_2)\in\mathbb{R}^2$ with $a_1^2+a_2^2=1$.
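The "if" direction of Theorem 4 is visible numerically (an illustration assuming NumPy): for i.i.d. standard normal $X$, $Y$ and a unit coefficient vector, the empirical characteristic function of $a_1X+a_2Y$ matches the exact characteristic function of $X$.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 400_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
a1, a2 = 0.6, 0.8                 # a1^2 + a2^2 = 1
S = a1 * X + a2 * Y

t = np.linspace(-3, 3, 31)
cf_S = np.array([np.mean(np.exp(1j * s * S)) for s in t])
gap = np.max(np.abs(cf_S - np.exp(-t**2 / 2)))  # exact cf of X ~ N(0, 1)
```

The gap is of Monte Carlo order, consistent with $a_1X+a_2Y\sim N(0,1)$.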

Open Problem. Kagan and Letac [6] formulate the following theorem: let $n\ge3$ be a fixed integer and let $X_1,X_2,\dots,X_n$ be independent identically distributed random variables. In the Euclidean space $\mathbb{R}^n$ consider the linear subspace $E=\mathbf{1}^\perp$, that is, the set $\{(a_1,a_2,\dots,a_n):a_1+a_2+\dots+a_n=0\}$. If, for all $a\in E$, the distribution of the random variable
$$\sum_{i=1}^n(X_i-\bar{X}+a_i)^2 \tag{13}$$
depends only on $\|a\|^2=a_1^2+a_2^2+\dots+a_n^2$, then the $X_i$ are normally distributed.

A key role in the proof of these results is played by Marcinkiewicz's theorem: if $Q(x)$ is a polynomial and $\exp(Q(x))$ is the characteristic function of some probability distribution, then the degree of $Q$ is at most two. Finally, we present the following conjecture (Theorem 1 cannot be applied here).

Conjecture 5.

Let $(X_1,\dots,X_m)$ and $(X_{m+1},\dots,X_n)$ be independent random vectors, where the $X_i$ are nondegenerate, and suppose the distribution of the random variable
$$\sum_{i=1}^n(X_i-\bar{X}+a_i)^2 \tag{14}$$
depends only on $\|a\|^2=a_1^2+a_2^2+\dots+a_n^2$, where $a=(a_1,\dots,a_n)\in E$. Then the $X_i$ are normally distributed independent random variables.

Competing Interests

The author declares having no competing interests.

Acknowledgments

The author would like to thank M. Bożejko for several discussions and helpful comments during the preparation of this paper. The work was partially supported by the Narodowe Centrum Nauki, Grant no. 2014/15/B/ST1/00064, and by the Austrian Science Fund (FWF) Project no. P 25510-N26.

References

[1] W. Ejsmont, "A characterization of the normal distribution by the independence of a pair of random vectors."
[2] J. Aczel, Lectures on Functional Equations and Their Applications.
[3] L. Cook, "A characterization of the normal distribution by the independence of a pair of random vectors and a property of the noncentral chi-square statistic."
[4] W. Bryc, The Normal Distribution: Characterizations with Applications.
[5] A. M. Kagan, Y. V. Linnik, and C. R. Rao, Characterization Problems in Mathematical Statistics.
[6] A. M. Kagan and G. Letac, "Characterization of the normal distribution through the power of a one-way ANOVA."