A Characterization of Symmetric Stable Distributions

The original motivation for this paper comes from a desire to understand the results about the characterization of the normal distribution shown in [1]. In that paper, the author provides characterizations of the normal distribution using a certain invariance of the noncentral chi-square distribution. More precisely, let the statistic $\sum_{i=1}^{n} a_i X_i + Y + Z$ have a distribution which depends only on $\sum_{i=1}^{n} a_i^2$, with $a_i \in \mathbb{R}$ and $1 \le m < n$, where $(X_1, \ldots, X_m, Y)$ and $(X_{m+1}, \ldots, X_n, Z)$ are independent random vectors with all moments and the $X_i$ are nondegenerate; then the $X_i$ are independent and have the same normal distribution with zero means, and $\operatorname{cov}(X_i, Y) = \operatorname{cov}(X_i, Z) = 0$ for $i \in \{1, \ldots, n\}$. In this paper we weaken some of the technical assumptions of this result and generalize it to symmetric stable distributions.


Introduction
The original motivation for this paper comes from a desire to understand the results about the characterization of the normal distribution shown in [1]. In that paper, the author provides characterizations of the normal distribution using a certain invariance of the noncentral chi-square distribution. More precisely, let the statistic $\sum_{i=1}^{n} a_i X_i + Y + Z$ have a distribution which depends only on $\sum_{i=1}^{n} a_i^2$, with $a_i \in \mathbb{R}$ and $1 \le m < n$, where $(X_1, \ldots, X_m, Y)$ and $(X_{m+1}, \ldots, X_n, Z)$ are independent random vectors with all moments and the $X_i$ are nondegenerate; then the $X_i$ are independent and have the same normal distribution with zero means, and $\operatorname{cov}(X_i, Y) = \operatorname{cov}(X_i, Z) = 0$ for $i \in \{1, \ldots, n\}$.

The proof of the above theorem is divided into two parts: first, the result is proved for two random variables; second, the general case is deduced using properties of the multidimensional normal distribution. The additional moment assumption is due to the fact that the author uses a method of cumulants. An alternative, more direct and straightforward method of proof allows us to weaken some of the technical assumptions used in the above reference and to generalize the result to symmetric stable distributions.

The paper is organized as follows. In Section 2 we review basic facts about characteristic functions. In Section 3 we state and prove the main results.
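The Gaussian invariance underlying [1] is easy to see through characteristic functions: for i.i.d. standard normal $X_i$, the characteristic function of $\sum a_i X_i$ depends on the coefficients only through $\sum a_i^2$. A minimal Python sketch of this closed form (the function name is ours):

```python
import math

def gaussian_linear_form_cf(coeffs, t):
    """Characteristic function at t of sum(a_i * X_i) for i.i.d. standard
    normal X_i, namely phi(t) = exp(-t**2 * sum(a_i**2) / 2).  The value
    depends on the coefficients only through sum(a_i**2), which is exactly
    the invariance that the characterization in [1] turns into a defining
    property of the normal law."""
    return math.exp(-0.5 * t * t * sum(a * a for a in coeffs))
```

For example, the coefficient vectors $(1, 2, 2)$ and $(3)$ both have $\sum a_i^2 = 9$, so the corresponding linear forms have identical characteristic functions and hence identical distributions.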

A Characteristic Function
In this paper we denote by $\mu_X$ the probability distribution of a random variable $X$. If $X$ is a random variable defined on a probability space $(\Omega, \Sigma, P)$, then the expected value of $X$, denoted by $E(X)$, is defined as the Lebesgue integral
$$E(X) = \int_{\Omega} X \, dP.$$
A characteristic function is simply the Fourier transform, in probabilistic language. The characteristic function of a probability measure $\mu$ on $\mathbb{R}$ is the function $\varphi_{\mu} : \mathbb{R} \to \mathbb{C}$ given by
$$\varphi_{\mu}(t) = \int_{\mathbb{R}} e^{itx} \, \mu(dx).$$
When we speak of the characteristic function $\varphi_X$ of a random variable $X$, we have the characteristic function $\varphi_{\mu_X}$ of its distribution $\mu_X$ in mind. Note, moreover, that
$$\varphi_X(t) = E(e^{itX}).$$
It is not accidental that the characteristic function encodes the most important information about the associated random variable. The underlying reason may well reside in the following three important properties: (i) the Gaussian distribution $N(\mu, \sigma^2)$ has the characteristic function $\varphi(t) = \exp(i\mu t - \sigma^2 t^2/2)$.
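These definitions are straightforward to sanity-check numerically. The following Python sketch (names and discretization parameters are ours) approximates $\varphi$ of $N(\mu, \sigma^2)$ by a Riemann sum of $e^{itx}$ against the Gaussian density and compares it with the closed form $\exp(i\mu t - \sigma^2 t^2/2)$:

```python
import cmath
import math

def gaussian_cf_numeric(t, mu=0.0, sigma=1.0, n=8000, span=10.0):
    """Approximate phi(t) = E[exp(i*t*X)] for X ~ N(mu, sigma^2) by a
    midpoint Riemann sum of exp(i*t*x) times the Gaussian density over
    the interval [mu - span*sigma, mu + span*sigma]."""
    lo = mu - span * sigma
    dx = 2.0 * span * sigma / n
    total = 0.0 + 0.0j
    for k in range(n):
        x = lo + (k + 0.5) * dx
        density = math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) \
            / (sigma * math.sqrt(2.0 * math.pi))
        total += cmath.exp(1j * t * x) * density * dx
    return total

def gaussian_cf_exact(t, mu=0.0, sigma=1.0):
    """Closed form for N(mu, sigma^2): exp(i*mu*t - sigma^2 * t^2 / 2)."""
    return cmath.exp(1j * mu * t - 0.5 * (sigma * t) ** 2)
```

The two functions agree to within the discretization error of the sum, illustrating that the characteristic function is nothing but the Fourier transform of the distribution.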

The Characterization Theorem
The main result of this paper is the following characterization of the symmetric $\alpha$-stable distribution in terms of independent random vectors.
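Symmetric $\alpha$-stable laws are handled here entirely through their characteristic functions. The following Python sketch (the function names and the scale parametrization are ours; $\varphi(t) = \exp(-(\sigma|t|)^{\alpha})$ is the standard symmetric $\alpha$-stable characteristic function with scale $\sigma$) illustrates the stability property on which such characterizations rest:

```python
import math

def sas_cf(t, alpha, scale=1.0):
    """Characteristic function of a symmetric alpha-stable law with scale
    parameter `scale`: phi(t) = exp(-(scale*|t|)**alpha), 0 < alpha <= 2.
    The case alpha = 2 recovers the centered Gaussian distribution."""
    return math.exp(-((scale * abs(t)) ** alpha))

def linear_form_cf(t, a, b, alpha):
    """For independent symmetric alpha-stable X, Y:
    phi_{aX + bY}(t) = phi_X(a*t) * phi_Y(b*t)."""
    return sas_cf(a * t, alpha) * sas_cf(b * t, alpha)
```

The product equals `sas_cf(t, alpha, scale=(abs(a)**alpha + abs(b)**alpha)**(1/alpha))`: a linear combination of independent symmetric $\alpha$-stable variables is again symmetric $\alpha$-stable, which is precisely the closure property that replaces the Gaussian computation when $\alpha < 2$.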
The above consideration gives us the following results of Ejsmont [1] and Cook [3], respectively.
Corollary 3 (the main result of Cook [3]). Let (

Here we state the Herschel-Maxwell theorem in modern notation (see, e.g., [4] or [5]). This theorem can also be obtained from Theorem 1 by considering $m = 1$ and $n = 2$ as well as $Y = Z = 0$ and $\alpha = 2$ (the proof is left to the reader).

Theorem 4. Let $X$, $Y$ be independent random variables and $a_1$, $a_2$ real numbers such that $a_1^2 + a_2^2 = 1$. If $X$, $Y$, and $a_1 X + a_2 Y$ are identically distributed, then $X$ and $Y$ are normally distributed with zero mean.

A key role in the proof of these results is played by Marcinkiewicz's theorem: if $Q(t)$ is a polynomial and $\exp(Q(t))$ is the characteristic function of some probability distribution, then the degree of $Q$ is at most two. Finally, we present a conjecture to which Theorem 1 cannot be applied.
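In the Gaussian case ($\alpha = 2$) the mechanism can be displayed in a few lines. The following is our summary sketch of the standard argument, showing how the distributional invariance combines with Marcinkiewicz's theorem (with $\varphi$ the common characteristic function):

```latex
% Independence plus the assumed invariance give, for every t \in \mathbb{R},
\varphi(a_1 t)\,\varphi(a_2 t) = \varphi(t), \qquad a_1^2 + a_2^2 = 1.
% If \varphi(t) = \exp(Q(t)) with Q a polynomial, Marcinkiewicz's theorem
% forces \deg Q \le 2; symmetry together with \varphi(0) = 1 then leaves
\varphi(t) = \exp(-c\,t^2), \qquad c \ge 0,
% which is the characteristic function of a centered normal distribution.
```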