next up previous
Next: norms Up: Measuring the magnitude of Previous: The Lévy Property


Tail distributions

In this section, we will state and prove the formula for the tail distribution of the sum of independent, real valued, random variables that satisfy the Lévy Property.

If one restricts the formula to the case of sums of independent, identically distributed random variables, one obtains a formula very similar to the main result of Hahn and Klass (1997). The main differences are that their result involves one sided inequalities, and also that their inequality is more precise.

This formula also has a strong resemblance to the result of Latała. As we shall show in Section 6, computing the $ L_p$ norm of $ U$ is effectively equivalent to computing $ U^*(e^{-p})$. If one then notices that $ (1+x)^p$ is very close to $ e^{xp}$ for small positive $ x$, one can see that this result and the result of Latała are very closely related. Presumably one could derive Latała's result by combining Theorem 5.1 with Theorem 6.1. However the technical difficulties are quite tricky, and since Latała's proof is elegant, we will not carry out this program here.

Theorem 5.1   Let $ (X_n)$ be a sequence of real valued independent random variables satisfying the strong Lévy property. Define the functions $ F_1(t)$ and $ F_2(t)$ to be 0 if $ t > 1$, and if $ 0 \le t \le 1$,

$\displaystyle F_1(t) = \inf\left\{ \lambda>0 : \prod_n {\mathbb{E}}(t^{X_n^{(\le \ell(t))}/\lambda}) \le t^{-1} \text{ and } \prod_n {\mathbb{E}}(t^{-X_n^{(\le \ell(t))}/\lambda}) \le t^{-1} \right\} ,$

$\displaystyle F_2(t) = \inf\left\{ \lambda>0 : \prod_n {\mathbb{E}}(t^{X_n^{(\le M^*(t))}/\lambda}) \le t^{-1} \text{ and } \prod_n {\mathbb{E}}(t^{-X_n^{(\le M^*(t))}/\lambda}) \le t^{-1} \right\} .$

Then

$\displaystyle S^*(t)
\mathrel{\mathop{\approx}\limits_{t}} F_1(t) \mathrel{\mathop{\approx}\limits_{t}} F_2(t) ,$

where the constants of approximation depend only upon the strong Lévy constants of $ (X_n)$.

Let us start by gaining some understanding of Orlicz spaces. There is a huge literature on Orlicz spaces; see for example Lindenstrauss and Tzafriri (1977). Suppose that $ \Phi:[0,\infty) \to [0,\infty]$ is an increasing function (usually convex with $ \Phi(0) = 0$). Then the Orlicz norm of a random variable $ X$ is defined according to the formula

$\displaystyle {\mathopen\Vert X\mathclose\Vert}_\Phi = \inf\{ \lambda>0 : {\mathbb{E}}\Phi({\mathopen\vert X\mathclose\vert}/\lambda) \le 1
\}.$

We will be concerned with the special functions

$\displaystyle \Phi_t(x) =
{\frac{t^{-x} - 1 }{ t^{-1} - 1}} .$
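As a concrete illustration (not part of the original text), the infimum defining the Orlicz norm can be approximated numerically for a finitely supported random variable. The sketch below is our own: it uses $ \Phi_t$ with the sample value $ t = 1/10$ and a random variable uniform on $ \{0,1,2,3\}$, for which the norm also has a closed form.

```python
import math

def orlicz_norm(values, probs, Phi, lo=1e-3, hi=1e3, iters=200):
    """Approximate ||X||_Phi = inf{ lam > 0 : E Phi(|X|/lam) <= 1 }
    for a finitely supported random variable.  E Phi(|X|/lam) is
    nonincreasing in lam, so bisection on a logarithmic scale works."""
    def expectation(lam):
        total = 0.0
        for v, p in zip(values, probs):
            try:
                total += p * Phi(abs(v) / lam)
            except OverflowError:  # t**(-x) blows up for tiny lam
                return math.inf
        return total
    for _ in range(iters):
        mid = math.sqrt(lo * hi)  # geometric bisection
        if expectation(mid) <= 1.0:
            hi = mid
        else:
            lo = mid
    return hi

t = 0.1
def Phi_t(x):
    # Phi_t(x) = (t^{-x} - 1) / (t^{-1} - 1)
    return (t ** (-x) - 1.0) / (t ** (-1.0) - 1.0)

# X uniform on {0, 1, 2, 3}: here E Phi_t(|X|/lam) = 1 reduces to
# 1 + u + u^2 + u^3 = 40 with u = 10^(1/lam), whose root is u = 3,
# so ||X||_{Phi_t} = log 10 / log 3.
values, probs = [0, 1, 2, 3], [0.25] * 4
print(orlicz_norm(values, probs, Phi_t))  # approx 2.0959
```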

The following is a special case of results that appear in Montgomery-Smith (1992).

Lemma 5.2   For any random variable $ X$, and for $ t \le 1/4$, we have that

$\displaystyle {\mathopen\Vert X\mathclose\Vert}_{\Phi_t} \approx \sup_{0\le x \le 1} {\frac{\log(t) }{
\log(xt)}}
X^*(x) ,$

with constants of approximation bounded by $ 2$.


Proof: Suppose first that $ {\mathopen\Vert X\mathclose\Vert}_{\Phi_t} \le 1$. Then $ {\mathbb{E}}\Phi_t({\mathopen\vert X\mathclose\vert}) \le 1$, that is, $ {\mathbb{E}}(t^{-{\mathopen\vert X\mathclose\vert}}) \le t^{-1}$. Since $ t^{-X^*(y)}$ is a decreasing function of $ y$, this implies that

$\displaystyle x t^{-X^*(x)}
\le
\int_0^1 t^{-X^*(y)} \, dy
\le
{\mathbb{E}}(t^{-{\mathopen\vert X\mathclose\vert}})
\le t^{-1} ,$

that is, taking logarithms and dividing by $ \log(t) < 0$, $ X^*(x) \le \log(xt)/\log(t)$.

Conversely, suppose that $ X^*(x) \le \log(xt)/\log(t)$ for $ 0\le x \le 1$. Then, since $ t^{-\log(xt)/(2\log(t))} = (xt)^{-1/2}$,

$\displaystyle {\mathbb{E}}\Phi_t({\mathopen\vert X\mathclose\vert}/2)\le \int_0^1 \Phi_t\left(\frac{\log(xt)}{2\log(t)}\right) \, dx = {\frac{2t^{-1/2}-1 }{ t^{-1}-1}} \le 1 ,$

where the last inequality uses $ t \le 1/4$. Hence $ {\mathopen\Vert X\mathclose\Vert}_{\Phi_t} \le 2$.
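The factor-2 equivalence in Lemma 5.2 can be sanity-checked numerically. The following sketch is our own construction, not from the text: it takes $ t = 1/10$ and $ X$ uniform on $ \{0,1,2,3\}$, with the closed form for the norm worked out by hand in the comments and the supremum approximated on a fine grid.

```python
import math

t = 0.1  # the lemma requires t <= 1/4

# For X uniform on {0,1,2,3}, the condition E Phi_t(|X|/lam) = 1
# reads E(t^{-|X|/lam}) = t^{-1}, i.e. 1 + u + u^2 + u^3 = 40 with
# u = 10^(1/lam); the root is u = 3, so ||X||_{Phi_t} = log 10 / log 3.
norm = math.log(10) / math.log(3)

def X_star(x):
    # Decreasing rearrangement of |X|: 3, 2, 1, 0 on successive
    # quarters of (0, 1].
    for cutoff, level in ((0.25, 3.0), (0.5, 2.0), (0.75, 1.0)):
        if x < cutoff:
            return level
    return 0.0

# Approximate sup_{0 < x <= 1} (log(t)/log(xt)) X^*(x) on a grid.
sup_side = max(
    (math.log(t) / math.log(x * t)) * X_star(x)
    for x in (k / 10000.0 for k in range(1, 10001))
)

ratio = norm / sup_side
print(norm, sup_side, ratio)
```

The ratio comes out close to 1.12, comfortably within the factor 2 that the lemma promises.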


Proof of Theorem 5.1: Let us start with the proof that $ S^*(t
) \mathrel{\mathop{\approx}\limits_{t}} F_1(t)$. Since the random variables $ X_n^{(\le \ell(t))}$ are independent, we have that

$\displaystyle F_1(t) =\inf\left\{ \lambda>0 : {\mathbb{E}}(t^{S^{(\le \ell(t))}/\lambda}) \le t^{-1} \text{ and } {\mathbb{E}}(t^{-S^{(\le \ell(t))}/\lambda}) \le t^{-1} \right\} .$

Now we notice that for any random variable $ Y$, and $ 0 \le t \le
1$, we have that

$\displaystyle {\textstyle {\frac{1}{ 2}}} {\mathbb{E}}(t^{-{\mathopen\vert Y\mathclose\vert}}) \le \max\{{\mathbb{E}}(t^{Y}),\ {\mathbb{E}}(t^{-Y})\} \le {\mathbb{E}}(t^{-{\mathopen\vert Y\mathclose\vert}}) .$

Hence

$\displaystyle F_1(t) \le\inf\left\{ \lambda>0 : {\mathbb{E}}(t^{-{\mathopen\vert S^{(\le \ell(t))}\mathclose\vert}/\lambda}) \le t^{-1} \right\} = {\mathopen\Vert S^{(\le \ell(t))}\mathclose\Vert}_{\Phi_t} ,$

and

$\displaystyle F_1(t)\ge\inf\left\{ \lambda>0 : {\mathbb{E}}(t^{-{\mathopen\vert S^{(\le \ell(t))}\mathclose\vert}/\lambda}) \le 2t^{-1} \right\} = {\mathopen\Vert S^{(\le \ell(t))}\mathclose\Vert}_{\Psi_t} ,$

where $ \Psi_t(x) = {\displaystyle{\frac{t^{-x} - 1}{ 2t^{-1} - 1}}}$. However, we quickly see that if $ t\le 1/2$, then for all $ x \ge 0$ we have $ \Psi_t(x) \ge {\frac{1}{ 3}} \Phi_t(x) \ge \Phi_t(x/3)$, where the second inequality holds because $ \Phi_t$ is a convex function with $ \Phi_t(0) = 0$. Hence

$\displaystyle F_1(t)
\approx {\mathopen\Vert S^{(\le \ell(t))}\mathclose\Vert}_{\Phi_t} $

with constants of approximation bounded by $ 3$.
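The two inequalities relating $ \Psi_t$ and $ \Phi_t$ used above can be verified numerically; the following is a minimal sketch of our own, on a grid of arbitrarily chosen sample values $ t \le 1/2$ and $ x \ge 0$.

```python
import math

def Phi(t, x):
    # Phi_t(x) = (t^{-x} - 1) / (t^{-1} - 1)
    return (t ** (-x) - 1.0) / (t ** (-1.0) - 1.0)

def Psi(t, x):
    # Psi_t(x) = (t^{-x} - 1) / (2 t^{-1} - 1)
    return (t ** (-x) - 1.0) / (2.0 * t ** (-1.0) - 1.0)

# Psi_t(x) >= Phi_t(x)/3 amounts to (t^{-1}-1)/(2t^{-1}-1) >= 1/3,
# which holds exactly when t <= 1/2; Phi_t(x)/3 >= Phi_t(x/3) follows
# from convexity of Phi_t and Phi_t(0) = 0.  A small tolerance absorbs
# floating point rounding in the equality case t = 1/2.
ok = all(
    Psi(t, x) >= Phi(t, x) / 3.0 - 1e-12
    and Phi(t, x) / 3.0 >= Phi(t, x / 3.0) - 1e-12
    for t in (0.5, 0.25, 0.1, 0.01)
    for x in (k / 10.0 for k in range(0, 101))
)
print(ok)
```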

Next, we apply Lemma 5.2, and we see that

$\displaystyle F_1(t)
\approx
\sup_{0\le x \le 1}{\frac{\log(t) }{ \log(xt)}} (S^{(\le \ell(t))})^*(x) .$

Taking $ x = t$, we see that the right hand side is bounded below by $ {\frac{1}{ 2}} (S^{(\le\ell(t))})^*(t)$. Also, if $ t \le x \le 1$, then

$\displaystyle {\frac{\log(t) }{ \log(xt)}} (S^{(\le \ell(t))})^*(x)
\le (S^{(\le \ell(t))})^*(t) .$
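This last bound rests on the elementary fact that $ \log(t)/\log(xt) \le 1$ for $ t \le x \le 1$, together with the decreasing rearrangement being nonincreasing; a quick numerical check of the ratio (our own, with a sample value of $ t$):

```python
import math

t = 0.01  # a sample value of t in (0, 1)

# For t <= x <= 1 we have log(xt) <= log(t) < 0, so the ratio
# log(t)/log(xt) lies in (0, 1], increasing from 1/2 at x = t
# to 1 at x = 1.
xs = [t + k * (1.0 - t) / 1000.0 for k in range(1001)]
ratios = [math.log(t) / math.log(x * t) for x in xs]
print(min(ratios), max(ratios))
```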

Further, by Corollary 3.2 combined with Proposition 4.1, there exist constants $ c_1$ and $ c_2$, depending only on the Lévy constants of $ (X_n)$, such that if $ 0 \le x \le t \le c_1^{-1}$, then
$\displaystyle {\frac{\log(t) }{ \log(xt)}} (S^{(\le \ell(t))})^*(x) \le c_2 {\frac{\log(x) }{ \log(xt)}} \bigl((S^{(\le \ell(t))})^*(t)+ (M^{(\ell(t))})^*(x/2)\bigr) \le c_2 \bigl((S^{(\le \ell(t))})^*(t) + \ell(t)\bigr) .$

Now, applying Proposition 2.1, Proposition 4.1, and Lemma 4.2 part (v), we finally obtain the desired result.

The proof that $ S^*(t) \mathrel{\mathop{\approx}\limits_{t}} F_2(t)$ is almost identical.


Stephen Montgomery-Smith 2002-10-30