Convergence Rates in the Strong Law of Large Numbers for Martingale Difference Sequences

Abstract and Applied Analysis


Introduction
The concept of complete convergence was introduced by Hsu and Robbins [1] as follows. A sequence of random variables {U_n, n ≥ 1} is said to converge completely to a constant C if ∑_{n=1}^∞ P(|U_n − C| > ε) < ∞ for all ε > 0. In view of the Borel-Cantelli lemma, this implies that U_n → C almost surely (a.s.). The converse is true if the {U_n, n ≥ 1} are independent. Hsu and Robbins [1] proved that the sequence of arithmetic means of independent and identically distributed (i.i.d.) random variables converges completely to the expected value if the variance of the summands is finite. Erdős [2] proved the converse. The result of Hsu-Robbins-Erdős is a fundamental theorem in probability theory and has been generalized and extended in several directions by many authors. One of the most important generalizations is Baum and Katz [3] for the strong law of large numbers, as follows.
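For completeness, the Borel-Cantelli step behind this implication can be written out; the display below is an editorial illustration, not part of the original text. For every fixed ε > 0,

```latex
% Complete convergence implies a.s. convergence:
\sum_{n=1}^{\infty} P\bigl(|U_n - C| > \varepsilon\bigr) < \infty
\quad\Longrightarrow\quad
P\bigl(|U_n - C| > \varepsilon \ \text{i.o.}\bigr) = 0,
```

and letting ε decrease to 0 along a countable sequence gives U_n → C almost surely.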
Theorem A (see Baum and Katz [3]). Let α > 1/2 and let αp > 1. Let {X_n, n ≥ 1} be a sequence of independent and identically distributed random variables. Assume further that EX_1 = 0 if α ≤ 1. Then the following statements are equivalent:

(i) E|X_1|^p < ∞;

(ii) ∑_{n=1}^∞ n^{αp−2} P(|S_n| ≥ εn^α) < ∞ for all ε > 0.

Motivated by Baum and Katz [3] for independent and identically distributed random variables, many authors have studied the Baum-Katz-type theorem for dependent random variables; see, for example, the results for ϕ-mixing random variables, ρ-mixing random variables, negatively associated random variables, martingale difference sequences, and so forth.
Our emphasis in the paper is focused on the Baum-Katz-type theorem for martingale difference sequences. Recently, Stoica [4, 5] considered the following series, which describes the rate of convergence in the strong law of large numbers:

∑_{n=1}^∞ n^{αp−2} P(|S_n| ≥ εn^α), ε > 0. (1.1)

The results obtained there are referred to as Theorems B and C below.
The main purpose of the paper is to further study the Baum-Katz-type theorem for martingale difference sequences. We have the following generalizations.

(i) Our results include the Baum-Katz-type theorem and the Hsu-Robbins-type theorem (see Hsu and Robbins [1]) as special cases.

(ii) Our results generalize Theorems B and C from the partial sum to the maximal partial sum.
Throughout the paper, let {X_n, n ≥ 1} be a sequence of random variables defined on a fixed probability space (Ω, F, P). Denote S_n = ∑_{i=1}^n X_i, S_0 = 0, ln x = ln(max(x, e)), x⁺ = xI(x ≥ 0), and F_0 = {Ω, ∅}. a_n ≪ b_n stands for a_n = O(b_n). C and C_1-C_4 denote positive constants, which may be different in various places. ⌊x⌋ denotes the integer part of x. Let I(A) be the indicator function of the set A.
Let {F_n, n ≥ 1} be an increasing sequence of σ-fields with σ(X_1, ..., X_n) ⊆ F_n for each n ≥ 1; then the σ-fields {F_n, n ≥ 1} are said to be adapted to the sequence {X_n, n ≥ 1}, and {X_n, F_n, n ≥ 1} is said to be an adapted stochastic sequence.

Definition 1.2. A real-valued function l(x), positive and measurable on (0, ∞), is said to be slowly varying if

lim_{x→∞} l(xλ)/l(x) = 1 (1.3)

for each λ > 0.

Definition 1.3. A sequence {X_n, n ≥ 1} of random variables is said to be stochastically dominated by a random variable X if there exists a positive constant C such that

P(|X_n| > x) ≤ CP(|X| > x) (1.4)

for all x ≥ 0 and n ≥ 1.
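As a quick sanity check of Definition 1.2 (an editorial example, not from the original text), l(x) = ln x is slowly varying, while any power x^δ with δ > 0 is not:

```latex
\lim_{x\to\infty}\frac{\ln(\lambda x)}{\ln x}
  = \lim_{x\to\infty}\frac{\ln\lambda + \ln x}{\ln x} = 1,
\qquad
\lim_{x\to\infty}\frac{(\lambda x)^{\delta}}{x^{\delta}}
  = \lambda^{\delta} \neq 1 \quad (\lambda \neq 1).
```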
Our main results are as follows.
Theorem 1.4. Let α > 1/2, p > 1, and αp ≥ 1. Let {X_n, F_n, n ≥ 1} be a martingale difference sequence which is stochastically dominated by a random variable X, and let l(x) > 0 be a slowly varying function as x → ∞. Suppose that sup_{i≥1} E(X_i^2 | F_{i−1}) ≤ C a.s. if p ≥ 2 and that

E|X|^p l(|X|^{1/α}) < ∞. (1.5)

Then for any ε > 0,

∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |S_j| ≥ εn^α) < ∞. (1.6)

Theorem 1.5. Let α > 1/2, p > 1, and αp > 1. Let {X_n, F_n, n ≥ 1} be a martingale difference sequence which is stochastically dominated by a random variable X, and let l(x) > 0 be a slowly varying function as x → ∞. Suppose that sup_{i≥1} E(X_i^2 | F_{i−1}) ≤ C a.s. if p ≥ 2 and that (1.5) holds. Then for any ε > 0,

∑_{n=1}^∞ n^{αp−2} l(n) P(sup_{j≥n} |S_j/j^α| ≥ ε) < ∞. (1.7)

For p = 1 and l(x) ≡ 1, we have the following theorem.
Theorem 1.6. Let α ≥ 1, and let {X_n, F_n, n ≥ 1} be a martingale difference sequence which is stochastically dominated by a random variable X. Suppose that

E|X| ln|X| < ∞. (1.8)

Then for any ε > 0,

∑_{n=1}^∞ n^{α−2} P(max_{1≤j≤n} |S_j| ≥ εn^α) < ∞. (1.9)

The following theorem presents the complete moment convergence for martingale difference sequences.

Theorem 1.7. Let the conditions of Theorem 1.4 hold. Then for any ε > 0,

∑_{n=1}^∞ n^{αp−2−α} l(n) E(max_{1≤j≤n} |S_j| − εn^α)⁺ < ∞. (1.10)

Remark 1.8. If we take l(x) ≡ 1 in Theorem 1.4, then we not only obtain the Baum-Katz-type theorem for martingale difference sequences but can also treat the case αp = 1. Furthermore, if we take l(x) ≡ 1, α = 1, and p = 2 in Theorem 1.4, then we obtain the Hsu-Robbins-type theorem (see Hsu and Robbins [1]) for martingale difference sequences.
Remark 1.9. As stated above, our Theorems 1.4 and 1.5 not only generalize the corresponding results of Theorems B and C from the partial sum to the maximal partial sum but also expand the range of α and p.
Remark 1.10. If we take l(x) ≡ 1 in Theorem 1.4, then we obtain the Marcinkiewicz-Zygmund strong law of large numbers for martingale difference sequences as follows:

(1/n^α) ∑_{i=1}^n X_i → 0 a.s. (1.11)
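Remark 1.10 can be illustrated numerically. The sketch below is an editorial illustration, not part of the paper: it uses an i.i.d. Rademacher sequence, the simplest martingale difference sequence with respect to its natural filtration, and the choices α = 0.75, the seed, and the sample sizes are arbitrary.

```python
import random

# Illustration of Remark 1.10: for a martingale difference sequence
# satisfying the conditions of Theorem 1.4 with l(x) = 1,
# (1/n^alpha) * S_n -> 0 a.s.  Here the X_i are i.i.d. Rademacher
# variables (+1 or -1 with probability 1/2 each), which form a
# martingale difference sequence w.r.t. their natural filtration.
random.seed(2024)
alpha = 0.75
s = 0.0
normalized = {}
for n in range(1, 100_001):
    s += random.choice((-1.0, 1.0))
    if n in (100, 10_000, 100_000):
        normalized[n] = abs(s) / n ** alpha

# |S_n| is typically of order sqrt(n), so |S_n| / n^alpha should be
# small for large n (of order n^(1/2 - alpha) = n^(-1/4) here).
assert normalized[100_000] < 0.5
```

Since |S_n| is typically of order √n, the normalized quantity decays roughly like n^{1/2−α}, which is n^{−1/4} for this choice of α.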

Preparations
To prove the main results of the paper, we need the following lemmas.
Lemma 2.1 (see [6], Theorem 2.11). If {X_i, F_i, 1 ≤ i ≤ n} is a martingale difference sequence and q ≥ 2, then there exists a constant C depending only on q such that

E(max_{1≤k≤n} |∑_{i=1}^k X_i|^q) ≤ C { E(∑_{i=1}^n E(X_i^2 | F_{i−1}))^{q/2} + E(max_{1≤i≤n} |X_i|^q) }. (2.1)

Lemma 2.2. Let {X_n, n ≥ 1} be a sequence of random variables which is stochastically dominated by a random variable X. Then for any a > 0 and b > 0, the following two statements hold:

E[|X_n|^a I(|X_n| ≤ b)] ≤ C_1 { E[|X|^a I(|X| ≤ b)] + b^a P(|X| > b) },
E[|X_n|^a I(|X_n| > b)] ≤ C_2 E[|X|^a I(|X| > b)], (2.2)

where C_1 and C_2 are positive constants.

Lemma 2.3 (cf. [7]). If l(x) > 0 is a slowly varying function as x → ∞, then

(i) lim_{x→∞} l(tx)/l(x) = 1 for each t > 0, and lim_{x→∞} l(x + u)/l(x) = 1 for each u ≥ 0;

(ii) lim_{k→∞} sup_{2^k ≤ x < 2^{k+1}} l(x)/l(2^k) = 1;

(iii) lim_{x→∞} x^δ l(x) = ∞ and lim_{x→∞} x^{−δ} l(x) = 0 for each δ > 0;

(iv) C_1 2^{kr} l(ε2^k) ≤ ∑_{j=1}^k 2^{jr} l(ε2^j) ≤ C_2 2^{kr} l(ε2^k) for every r > 0, ε > 0, positive integer k, and some C_1 > 0, C_2 > 0;

(v) C_3 2^{kr} l(ε2^k) ≤ ∑_{j=k}^∞ 2^{jr} l(ε2^j) ≤ C_4 2^{kr} l(ε2^k) for every r < 0, ε > 0, positive integer k, and some C_3 > 0, C_4 > 0.
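Parts (iv) and (v) of Lemma 2.3 say that a dyadically weighted sum of a slowly varying function is comparable to its extreme term. The sketch below is an editorial numerical illustration of part (iv), with the concrete choices l(x) = ln(max(x, e)), r = 1/2, ε = 1, and ad hoc comparison constants:

```python
import math

def l(x):
    # a concrete slowly varying function: l(x) = ln(max(x, e))
    return math.log(max(x, math.e))

def dyadic_sum(r, eps, k):
    # the sum from Lemma 2.3(iv): sum_{j=1}^{k} 2^{jr} l(eps * 2^j)
    return sum(2 ** (j * r) * l(eps * 2 ** j) for j in range(1, k + 1))

# Lemma 2.3(iv): for r > 0 the sum is within constant multiples of its
# last term 2^{kr} l(eps * 2^k); the bounds 1 and 5 below are ad hoc.
r, eps = 0.5, 1.0
for k in (5, 15, 25, 35):
    ratio = dyadic_sum(r, eps, k) / (2 ** (k * r) * l(eps * 2 ** k))
    assert 1.0 <= ratio <= 5.0
```

The lower bound holds trivially because the last summand equals the normalizing term; the upper bound reflects that, for r > 0, the sum is dominated by a geometric series anchored at j = k.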

Proofs of the Main Results
Proof of Theorem 1.4. For fixed n ≥ 1, denote

Y_{ni} = X_i I(|X_i| ≤ n^α) − E(X_i I(|X_i| ≤ n^α) | F_{i−1}), i = 1, 2, .... (3.1)

Since X_i = X_i I(|X_i| > n^α) + Y_{ni} + E(X_i I(|X_i| ≤ n^α) | F_{i−1}), we can see that

∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |S_j| ≥ εn^α)
≤ ∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |∑_{i=1}^j X_i I(|X_i| > n^α)| ≥ εn^α/3)
+ ∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |∑_{i=1}^j E(X_i I(|X_i| ≤ n^α) | F_{i−1})| ≥ εn^α/3)
+ ∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |∑_{i=1}^j Y_{ni}| ≥ εn^α/3)
=: H + I + J. (3.2)
For H, we have by Markov's inequality, Lemma 2.2, and (1.5) that

H < ∞. (3.3)
For I, we have by Markov's inequality and the computation in (3.3) that

I < ∞. (3.4)
To prove (1.6), it therefore suffices to show that

J < ∞. (3.5)
For fixed n ≥ 1, it is easily seen that {Y_{ni}, F_i, i ≥ 1} is still a martingale difference sequence. By Markov's inequality and Lemma 2.1, we have that for any q ≥ 2,

J ≪ ∑_{n=1}^∞ n^{αp−2−αq} l(n) E(max_{1≤j≤n} |∑_{i=1}^j Y_{ni}|^q)
≪ ∑_{n=1}^∞ n^{αp−2−αq} l(n) { E(∑_{i=1}^n E(Y_{ni}^2 | F_{i−1}))^{q/2} + E(max_{1≤i≤n} |Y_{ni}|^q) }
=: J_1 + J_2. (3.6)
Case 2 (αp > 1 and p < 2). Take q = 2. Similar to the proofs of (3.6) and (3.7), we can get that J_1 < ∞ and J_2 < ∞. From the statements mentioned previously, we have proved (3.5). This completes the proof of the theorem.
Proof of Theorem 1.5. We have by Lemma 2.3 that, after grouping n into dyadic blocks, the series in (1.7) is bounded by a constant multiple of the series in (1.6). (3.10) The desired result (1.7) follows from this inequality and (1.6) immediately.
Proof of Theorem 1.6. We use the same notation as in Theorem 1.4. According to the proof of Theorem 1.4, we can see that J < ∞ for p = 1 and l(x) ≡ 1 under the conditions of Theorem 1.6. So it suffices to show that H < ∞ and I < ∞ for p = 1 and l(x) ≡ 1.
Similar to the proof of (3.3), we have

H < ∞. (3.11)

Similar to the proofs of (3.4) and (3.11), we can get that

I < ∞. (3.12)

This completes the proof of the theorem.
Proof of Theorem 1.7. For any ε > 0, we have

∑_{n=1}^∞ n^{αp−2−α} l(n) E(max_{1≤j≤n} |S_j| − εn^α)⁺
= ∑_{n=1}^∞ n^{αp−2−α} l(n) ∫_0^∞ P(max_{1≤j≤n} |S_j| − εn^α > t) dt
≤ ∑_{n=1}^∞ n^{αp−2} l(n) P(max_{1≤j≤n} |S_j| > εn^α) + ∑_{n=1}^∞ n^{αp−2−α} l(n) ∫_{n^α}^∞ P(max_{1≤j≤n} |S_j| > t) dt, (3.13)

where the integral was split at t = n^α. By Theorem 1.4, the first series on the right-hand side is finite.
Hence, it suffices to show that

∑_{n=1}^∞ n^{αp−2−α} l(n) ∫_{n^α}^∞ P(max_{1≤j≤n} |S_j| > t) dt < ∞. (3.14)

For fixed t > 0, denote Z_{ti} = X_i I(|X_i| ≤ t) − E(X_i I(|X_i| ≤ t) | F_{i−1}), i = 1, 2, ..., and split the probability under the integral, as in (3.2), into a truncated part, a conditional-mean part, and a martingale part. (3.16)
Similar to the proof of (3.3), we have by Markov's inequality and Lemma 2.2 that the series corresponding to the truncated part is finite. (3.17)
According to the proof of (3.17), we have by Markov's inequality and Lemma 2.2 that the series corresponding to the conditional-mean part is finite. (3.18)
For any t > 0, it is easily seen that {Z_{ti}, F_i, i ≥ 1} is still a martingale difference sequence. By Markov's inequality and Lemma 2.1, we have that for any q ≥ 2, the series corresponding to the martingale part admits a bound of the same form as (3.6). (3.20)
Hence, similar to the proof of (3.7), we can see that the remaining series is finite, using E|X|^p l(|X|^{1/α}) < ∞. From the statements mentioned previously, we have proved (3.14). This completes the proof of the theorem.