WHAT RANDOM VARIABLE GENERATES A BOUNDED POTENTIAL?

It is known that if a predictable nondecreasing process generates a bounded potential, then its final value satisfies the Garsia inequality. We prove the converse, that is, a random variable satisfying the Garsia inequality generates a bounded potential. We also establish some useful relations between the Garsia inequality and the Cramér conditions, and describe different ways to construct a potential.


Introduction
Let (Ω, Ᏺ, P) be a complete probability space and let {Ᏺ_t, t ≥ 0} be a filtration satisfying the standard conditions: Ᏺ_0 contains the P-null sets of the σ-field Ᏺ; Ᏺ_s ⊂ Ᏺ_t ⊂ Ᏺ for 0 ≤ s ≤ t; and Ᏺ_s = ∩_{t>s} Ᏺ_t. Also, let {X_t, Ᏺ_t, t ≥ 0} be a potential with the Doob-Meyer decomposition

X_t = M_t − A_t, (1.1)

where M_t is a martingale and A_t is a nondecreasing integrable right-continuous predictable process with A_0 = 0. Let A_∞ := lim_{t→∞} A_t. It follows from (1.1) that X_t also admits the decomposition

X_t = E[A_∞ | Ᏺ_t] − A_t, (1.2)

and we say that the process A_t generates the potential X_t. Garsia [2] established that in the case when X_t is bounded, that is, 0 ≤ X_t ≤ c_0, the random variable (r.v.) ξ := A_∞ satisfies the Garsia inequality

E G(ξ) ≤ c_0 E g(ξ), (1.3)

where g = g(t) : R_+ → R_+ is any nondecreasing nonnegative function and G(t) = ∫_0^t g(s) ds.
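As a numerical illustration (ours, not from the paper): if ξ is exponential with mean 1 and G_t = σ{ξ ∧ t}, the compensator A_t = ξ ∧ t generates the potential X_t = I{ξ > t}, which is bounded by c_0 = 1, so (1.3) should hold for any admissible g. A minimal Monte Carlo sketch with the hypothetical choice g(t) = e^{rt}:

```python
import math
import random

random.seed(0)

# Illustrative assumption (not from the paper): xi ~ Exp(1), for which
# A_t = xi ∧ t generates the potential X_t = 1_{xi > t}, bounded by c0 = 1.
# Garsia's inequality then reads  E G(xi) <= c0 * E g(xi)  for every
# nondecreasing nonnegative g, with G(t) = int_0^t g(s) ds.

c0 = 1.0
r = 0.3                                    # any 0 < r < 1 keeps E g(xi) finite
g = lambda t: math.exp(r * t)              # nondecreasing, nonnegative
G = lambda t: (math.exp(r * t) - 1.0) / r  # G(t) = int_0^t g(s) ds

n = 500_000
xs = [random.expovariate(1.0) for _ in range(n)]
EG = sum(G(x) for x in xs) / n
Eg = sum(g(x) for x in xs) / n

print(EG, Eg)                  # in closed form both equal 1/(1-r)
assert EG <= c0 * Eg * 1.01    # Garsia inequality, with Monte Carlo slack
```

For this particular ξ and g the two sides coincide in closed form (both equal 1/(1 − r)), so the check exercises the inequality at its boundary.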
Remark 2.4. If we consider F_ξ(x) = P{ξ ≤ x}, that is, a right-continuous F_ξ(x), and ξ has no atom at zero, the proof of Theorem 2.2 remains the same. But if ξ has an atom at zero and F_ξ(x) is right-continuous, then we must consider λ(x) with some additional term, namely (2.9). To avoid these technical difficulties, we therefore consider left-continuous functions.
Theorem 2.5. Let ξ be the same as in Theorem 2.2. The following assertions are equivalent: (iv) the r.v. ξ satisfies the Cramér conditions; where λ is defined by (2.1).

Proof. It is well known (see [3]) that the Cramér conditions hold if and only if there exists an r > 0 such that E exp{rξ} < ∞. In turn, the last inequality is equivalent to

By the same arguments, we obtain that (2.10) is equivalent to

for some 0 < r < ∞ and 0 < c < ∞. From relation (2.3), we deduce that the Cramér conditions are equivalent to the inequality

for some r > 0 and c_1 ∈ R, that is, equivalent to (v).
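The equivalence with E exp{rξ} < ∞ has a concrete consequence worth keeping in mind: Markov's inequality applied to exp{rξ} yields the exponential tail bound P{ξ > x} ≤ e^{−rx} E e^{rξ}. A short sketch under the illustrative assumption ξ ~ Exp(1), r = 0.4:

```python
import math
import random

random.seed(1)

# If E exp(r*xi) < infinity for some r > 0 (the Cramer condition), Markov's
# inequality applied to exp(r*xi) gives the exponential tail bound
#     P{xi > x} <= exp(-r*x) * E exp(r*xi).
# Illustration: xi ~ Exp(1) and r = 0.4, so E exp(r*xi) = 1/(1-r) = 5/3.

r = 0.4
n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]
mgf = sum(math.exp(r * x) for x in xs) / n   # estimates E exp(r*xi)

for x in (1.0, 2.0, 4.0):
    tail = sum(1 for v in xs if v > x) / n   # estimates P{xi > x} = e^{-x}
    assert tail <= math.exp(-r * x) * mgf    # Chernoff-type bound holds
```

Conversely, a tail that decays exponentially at some rate forces E exp{rξ} < ∞ for every smaller r, which is the direction used repeatedly in the proof above.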
As mentioned above, the Cramér conditions follow from the Garsia inequality. The next example demonstrates that the converse is not true.
Example 2.6. There exists an integrable nonnegative r.v. ξ satisfying the Cramér conditions but not the Garsia inequality. The idea is the following. If we choose a nonnegative left-continuous function λ(x) such that ln λ(x) − ∫_0^x λ(u) du decreases to −∞, then the right-hand side of (2.2) will be the tail of some d.f. F_ξ(x). This d.f. will satisfy the Cramér conditions by Theorem 2.5, but the Garsia inequality will not hold by condition (ii) of Theorem 2.2.
The three conditions mentioned above are satisfied if the function λ is a solution of the boundary value problem

where ϕ(x) = Σ_{n≥0} I_{[2n,2n+1)}(x).
The next example demonstrates that the inequality liminf_{x→∞} λ(x) > 0 is not equivalent to the Cramér conditions.

What random variable generates a bounded potential?
Definition 3.1. An r.v. ξ is said to generate a bounded potential if there exists a filtration {G_t, t ≥ 0} on (Ω, Ᏺ, P) that satisfies the standard conditions, and a bounded right-continuous G_t-adapted potential X_t that admits the expansion (compare with (1.2))

Note that for a given X_t, the predictable process in (3.1) is unique. Denote by G_0 the family of all P-null sets of Ᏺ.
Proof. Choose any t_0 > 0 and set

Theorem 3.5. Let an r.v. ξ satisfy the Cramér conditions. Then ξ generates a nontrivial bounded potential.
Proof. According to Theorem 2.7, we can choose some c > 0 and a sequence

where n ≥ 1. Then G_t satisfies the standard conditions, A_t is a G_t-adapted nonnegative nondecreasing right-continuous process, and

Since for any k < m and α < t_k, the event

We conclude that the potential X_t is bounded and

Theorem 3.6. Let an r.v. ξ satisfy the Garsia inequality. Then ξ predictably generates a nontrivial potential.
Proof. Let G_t = σ{ξ ∧ s, s ≤ t}. Analogously to the proof of Theorem 3.5, we can establish that G_t = σ{ξ ∧ t}. Also, let A_t := ξ ∧ t, so that A_∞ = ξ. According to Theorem 2.2 and representation (2.1),

Evidently, P{X_t > 0} > 0 for every t > 0, and A_t is continuous. Therefore it is a predictable process, and the proof follows.
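The construction A_t = ξ ∧ t can be checked numerically in a special case (an illustration of ours, not part of the proof): for ξ exponential with mean 1, memorylessness gives E[ξ | G_t] = ξ on {ξ ≤ t} and t + 1 on {ξ > t}, so X_t = E[A_∞ | G_t] − A_t = I{ξ > t} and E X_t = e^{−t} > 0:

```python
import math
import random

random.seed(2)

# Theorem 3.6 construction with A_t = xi ∧ t, illustrated for xi ~ Exp(1).
# By memorylessness, E[xi | G_t] = xi on {xi <= t} and t + 1 on {xi > t},
# so X_t = E[A_inf | G_t] - A_t = 1_{xi > t}, a nontrivial potential with
# E X_t = e^{-t} > 0 for every t.

n = 200_000
xs = [random.expovariate(1.0) for _ in range(n)]

def X(t, xi):
    cond_exp = xi if xi <= t else t + 1.0   # E[xi | G_t], computed pathwise
    return cond_exp - min(xi, t)            # X_t = E[A_inf | G_t] - A_t

for t in (0.5, 1.0, 2.0):
    mean_X = sum(X(t, xi) for xi in xs) / n
    assert abs(mean_X - math.exp(-t)) < 0.01   # E X_t = e^{-t} > 0
```

Note that this particular potential is even bounded (by 1), consistent with the fact that an exponential ξ satisfies the Garsia inequality.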
Remark 3.7. The σ-fields G_t used in the proof of Theorem 3.6 were considered by Dellacherie [1], where the dual predictable projection of the process A = I_{[ξ,∞[} with respect to {G_t, t ≥ 0} was constructed.
The next example demonstrates that the same r.v. ξ can generate rather different potentials with respect to the same filtration {G_t, t ≥ 0}.
Example 3.8. Let {P_t, t ≥ 0} be a homogeneous Poisson process with intensity λ_0, and let G_t = σ{P_s, 0 ≤ s ≤ t}. If we put A_t^0 := ∫_0^t θ(u) P_u du and A_∞ := ξ, then the process A_t is predictable, and the potential is nontrivial and bounded.
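A sketch of how such an integral functional of a Poisson path can be computed. The weight θ is not specified in this excerpt, so we take the purely hypothetical choice θ(u) = e^{−u}; Fubini's theorem then gives E ∫_0^∞ e^{−u} P_u du = λ_0 ∫_0^∞ u e^{−u} du = λ_0, which the simulation checks:

```python
import math
import random

random.seed(3)

# Sketch of the Example 3.8 object A_t^0 = int_0^t theta(u) P_u du for a
# Poisson process P with intensity lam0.  theta is NOT given in the excerpt;
# we use the hypothetical weight theta(u) = exp(-u), for which Fubini gives
# E A_inf = lam0 * int_0^inf u e^{-u} du = lam0.

lam0, T = 0.5, 40.0   # intensity; horizon (the tail beyond T is negligible)

def A_inf():
    """One sample of int_0^T e^{-u} P_u du, integrated piece by piece."""
    total, t, count = 0.0, 0.0, 0
    while True:
        gap = random.expovariate(lam0)        # next inter-arrival time
        t_next = min(t + gap, T)
        # P_u equals `count` on the whole interval [t, t_next):
        total += count * (math.exp(-t) - math.exp(-t_next))
        if t + gap >= T:
            return total
        t, count = t + gap, count + 1

n = 40_000
mean_A = sum(A_inf() for _ in range(n)) / n
assert abs(mean_A - lam0) < 0.02              # E A_inf = lam0
```

Because P_u is piecewise constant, the integral is a finite sum over inter-arrival intervals, which is why no numerical quadrature is needed.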

Some additional results
The next theorem states that in the case when liminf_{x→∞} λ(x) > 0, an integrable r.v. ξ can be transformed into an r.v. ψ(ξ) that predictably generates a bounded potential. Note that in this case we can choose the sequence π

We obtain from (4.2) and (4.3) that there exists

Then A_t is G_t-predictable, A_∞ = ψ(ξ), and for any t > 0,

The next result demonstrates that an integrable r.v. ξ with liminf_{x→∞} λ(x) > 0 generates an "almost bounded" potential.

Theorem 4.2. Let liminf_{x→∞} λ(x) > 0. Then for any ε > 0, the r.v. ξ predictably generates a nontrivial potential that is bounded on the set R_+ \ B, where m(B) < ε and m(·) is the Lebesgue measure on R.

Proof. Consider G_t := σ{ξ ∧ t_n} for t_n − ε/2^n ≤ t < t_{n+1} − ε/2^{n+1}, with A_t the same as in the proof of Theorem 3.5. Then A_t is G_t-predictable. Furthermore,

Remark 4.3. Let the r.v. ξ be such that 0 = liminf_{x→∞} λ(x) < limsup_{x→∞} λ(x). Consider G_t and A_t from the proof of Theorem 3.5 and construct the compensator (dual predictable projection) A_t^π of A_t. Evidently,
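Functions with a lower limit of 0 and a positive upper limit at infinity are easy to exhibit; the indicator comb ϕ(x) = Σ_{n≥0} I_{[2n,2n+1)}(x) from Example 2.6 is of exactly this oscillating type. A tiny deterministic check (ours, purely illustrative):

```python
import math

# The indicator comb phi(x) = sum_{n>=0} 1_{[2n,2n+1)}(x) takes the value 1
# on [2n, 2n+1) and 0 on [2n+1, 2n+2), so along x_k = 2k + 0.5 it equals 1
# and along y_k = 2k + 1.5 it equals 0: liminf phi = 0 < limsup phi = 1.

def phi(x):
    return 1.0 if (x >= 0 and math.floor(x) % 2 == 0) else 0.0

# Sample both subsequences far out on the axis:
lows  = [phi(2 * k + 1.5) for k in range(1000, 1050)]
highs = [phi(2 * k + 0.5) for k in range(1000, 1050)]
assert max(lows) == 0.0 and min(highs) == 1.0
```

This is the kind of λ to which the remark above applies: the ordinary limit at infinity does not exist, while the lower and upper limits differ.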