

The Lévy Property

Let $ (X_n)$ be a sequence of independent random variables. We will say that $ (X_n)$ satisfies the Lévy property with constants $ c_1$ and $ c_2$ if, whenever $ A \subseteq B \subseteq {\mathbb{N}}$ with $ A$ and $ B$ finite, then for all $ t>0$

$\displaystyle \Pr({\mathopen\vert S_A\mathclose\vert} > c_1 t)\le c_2 \Pr({\mathopen\vert S_B\mathclose\vert} > t) .$

The casual reader should beware that this property has nothing to do with Lévy processes.

The sequence $ (X_n)$ has the strong Lévy property with constants $ c_1$ and $ c_2$ if for all $ s>0$ the sequence $ (X_n^{(\le s)})$ has the Lévy property with constants $ c_1$ and $ c_2$.

Here are examples of sequences with the strong Lévy property. (In each case it is in fact sufficient to verify the Lévy property itself, since truncation at any level preserves positivity, symmetry, and identical distributions.)

  1. Positive sequences, with constants $ 1$ and $ 1$.
  2. Sequences of symmetric random variables with constants $ 1$ and $ 2$. This ``reflection property'' plays a major role in results attributed to Lévy, hence the name of the property.
  3. Sequences of identically distributed random variables. This was shown independently by Montgomery-Smith (1993) with constants $ 10$ and $ 3$, and by Latała (1993) with constants $ 5$ and $ 4$, or $ 7$ and $ 2$.
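The symmetric case in example 2 is easy to probe numerically. The following is a minimal Monte Carlo sketch (not part of the original text; the Rademacher steps, the index sets, and the threshold are illustrative choices) checking that $ \Pr(\vert S_A\vert > t) \le 2 \Pr(\vert S_B\vert > t)$ for $ A \subseteq B$:

```python
import random

random.seed(0)

def estimate_levy(n_trials=200_000, t=2.0):
    """Estimate Pr(|S_A| > t) and Pr(|S_B| > t) for symmetric
    (Rademacher, i.e. +/-1) variables, with A = {0,...,4} a subset
    of B = {0,...,9}."""
    hits_a = hits_b = 0
    for _ in range(n_trials):
        x = [random.choice((-1, 1)) for _ in range(10)]
        hits_a += abs(sum(x[:5])) > t   # S_A: sum over the subset A
        hits_b += abs(sum(x)) > t       # S_B: sum over all of B
    return hits_a / n_trials, hits_b / n_trials

p_a, p_b = estimate_levy()
print(p_a, 2 * p_b)  # the first number should not exceed the second
```

Here the exact values are $ \Pr(\vert S_A\vert > 2) = 12/32$ and $ \Pr(\vert S_B\vert > 2) = 352/1024$, so the bound with constants $ 1$ and $ 2$ holds with room to spare.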
We see that sequences with the Lévy property satisfy a maximal inequality.

Proposition 4.1   Let $ (X_n)$ be a sequence of independent random variables satisfying the Lévy property with constants $ c_1$ and $ c_2$. Then for all $ t>0$

$\displaystyle \Pr( U > 3 c_1 t) \le3 c_2 \Pr ( {\mathopen\vert S\mathclose\vert} > t ) .$

Thus $ M^*(t) \le 6 c_1 S^*(t/(3c_2))$.


Proof: The first statement is an immediate corollary of the following result, known as the Lévy-Ottaviani inequality:

$\displaystyle \Pr(U_N > 3 t)\le3 \sup_{1 \le k \le N} \Pr( {\mathopen\vert S_k\mathclose\vert} > t ) .$

(Billingsley (1995, Theorem 22.5, p. 288) attributes this result to Etemadi (1985), who proved it with the constant 4 in both places, but the same proof gives the constant 3; see, for example, Billingsley. However, the first named author learned this result from Kwapień in 1980.)

The second statement follows from the first, since $ M \le 2 U$ (indeed, $ \vert X_k\vert = \vert S_k - S_{k-1}\vert \le 2U$ for every $ k$).
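The Lévy-Ottaviani inequality used above can likewise be checked by simulation. This sketch (an illustration, not part of the original argument; the Uniform$ (-1,1)$ steps and the parameters are arbitrary choices) estimates both sides for a short random walk, where $ U_N = \max_{1\le k\le N} \vert S_k\vert$:

```python
import random

random.seed(1)

def ottaviani_check(n_steps=20, t=2.0, n_trials=100_000):
    """Estimate Pr(U_N > 3t) and sup_k Pr(|S_k| > t) for a walk
    with independent Uniform(-1, 1) steps."""
    lhs_hits = 0
    rhs_hits = [0] * n_steps
    for _ in range(n_trials):
        s = u = 0.0
        for k in range(n_steps):
            s += random.uniform(-1.0, 1.0)
            u = max(u, abs(s))
            rhs_hits[k] += abs(s) > t
        lhs_hits += u > 3 * t
    return lhs_hits / n_trials, max(rhs_hits) / n_trials

lhs, rhs = ottaviani_check()
print(lhs, 3 * rhs)  # the first number should not exceed the second
```

Note that both sides are far from tight here; the point of the inequality is that the bound holds uniformly, with no dependence on the distribution of the steps.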

We end with a lemma that lists some elementary properties. Part (i) of the lemma might be thought of as a kind of reduced comparison principle.

Lemma 4.2   Let $ (X_n)$ be a sequence of random variables satisfying the strong Lévy property.
  1. There exist positive constants $ c_1$ and $ c_2$, depending only upon the Lévy constants of $ (X_n)$, such that if $ s \le 1/2$ and $ 0 \le t \le 1$, then

    $\displaystyle (S^{(\le M^*(s))})^*(t) \le c_1 S^*(c_2^{-1} t) .$

  2. There exist positive constants $ c_1$ and $ c_2$, depending only upon the strong Lévy constants of $ (X_n)$, such that if $ r \le s \le 1/2$ and $ 0 \le t \le 1$, then $ (S^{(\le M^*(s))})^*(t) \le c_1 (S^{(\le M^*(r))})^*(c_2^{-1}t)$.
  3. If $ 0 \le s \le t \le 1$, then $ S^*(t) \le (S^{(\le M^*(s))})^*(t-s)$ and $ (S^{(\le M^*(s))})^*(t) \le S^*(t-s)$. In particular, $ S^*(t) \le (S^{(\le M^*(t/2))})^*(t/2)$ and $ (S^{(\le M^*(t/2))})^*(t) \le S^*(t/2)$.
  4. For $ \alpha,\beta > 0$, we have that

    $\displaystyle (S^{(\le M^*(t))})^*(t) \mathrel{\mathop{\approx}\limits_{t}} (S^{(\le M^*(\alpha t))})^*(\beta t) $

    where the constants of approximation depend only upon $ \alpha$, $ \beta$ and the strong Lévy constants of $ (X_n)$.
  5. We have that

    $\displaystyle S^*(t) \mathrel{\mathop{\approx}\limits_{t}} (S^{(\le M^*(t))})^*(t) \mathrel{\mathop{\approx}\limits_{t}} (S^{(\le \ell(t))})^*(t) ,$

    where the constants of approximation depend only upon the strong Lévy constants of $ (X_n)$.


Proof: Let us start with part (i). For each set $ A \subseteq {\mathbb{N}}$, define the event

$\displaystyle E_A = \{\, \vert X_n\vert \le M^*(s) \text{ if and only if } n \in A \,\} .$

Note that the whole probability space is the disjoint union of these events. Also

$\displaystyle \{ \vert S^{(\le M^*(s))}\vert > x \} \cap E_A = \{ \vert S_A\vert > x \} \cap E_A .$

Furthermore, by independence, we see that
$\displaystyle \Pr(\vert S_A\vert > x \text{ and } E_A) = \Pr(\vert S_A\vert > x \text{ and } \vert X_n\vert \le M^*(s) \text{ for } n \in A) \, \Pr(\vert X_n\vert > M^*(s) \text{ for } n \notin A) .$

Hence
$\displaystyle \Pr(\vert S^{(\le M^*(s))}\vert > x) = \sum_{A \subseteq {\mathbb{N}}} \Pr(\vert S_A\vert > x \text{ and } \vert X_n\vert \le M^*(s) \text{ for } n \in A) \, \Pr(\vert X_n\vert > M^*(s) \text{ for } n \notin A)$

$\displaystyle \le 2 \sum_{A \subseteq {\mathbb{N}}} \Pr(\vert S_A\vert > x) \, \Pr(\vert X_n\vert \le M^*(s) \text{ for } n \in A) \, \Pr(\vert X_n\vert > M^*(s) \text{ for } n \notin A)$

$\displaystyle \le c_2 \Pr(\vert S\vert > c_1^{-1} x) ,$

where in the first inequality we have used the fact that

$\displaystyle \Pr(\vert X_n\vert \le M^*(s) \text{ for } n \in A) \ge \Pr(M \le M^*(s)) \ge 1-s \ge 1/2 ,$

and in the second we have used the Lévy property together with the fact that $ \sum_{A} \Pr(E_A) = 1$ (absorbing the factor $ 2$ into $ c_2$).

Part (ii) follows by applying part (i) to $ S^{(\le M^*(r))}$.

Part (iii) follows from the observation that

$\displaystyle \Pr(S \ne S^{(\le M^*(s))})\le\Pr(M > M^*(s)) \le s .$

Hence, if $ \Pr(\vert S\vert > \alpha) \ge t$, then $ \Pr(\vert S^{(\le M^*(s))}\vert > \alpha) \ge t-s$; and conversely, if $ \Pr(\vert S^{(\le M^*(s))}\vert > \alpha) \ge t$, then $ \Pr(\vert S\vert > \alpha) \ge t-s$.
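The coupling bound $ \Pr(S \ne S^{(\le M^*(s))}) \le s$ used here can also be seen empirically. The following is a sketch only (the Gaussian steps and sample sizes are arbitrary choices, not from the original text), where $ M^*(s)$ is estimated as the upper $ s$-quantile of $ M = \max_n \vert X_n\vert$:

```python
import random

random.seed(2)

def truncation_check(s=0.1, n=15, n_trials=50_000):
    """Estimate how often the sum S of n Gaussian variables differs
    from its truncation at level M^*(s), the upper s-quantile of
    M = max_n |X_n|.  The answer should be at most about s."""
    # First pass: estimate the quantile M^*(s) from samples of M.
    maxima = sorted(
        max(abs(random.gauss(0.0, 1.0)) for _ in range(n))
        for _ in range(n_trials)
    )
    level = maxima[int((1.0 - s) * n_trials)]   # Pr(M > level) ~ s
    # Second pass: truncating at `level` changes the sum exactly on
    # the event {M > level}.
    changed = 0
    for _ in range(n_trials):
        x = [random.gauss(0.0, 1.0) for _ in range(n)]
        changed += sum(x) != sum(v for v in x if abs(v) <= level)
    return changed / n_trials

freq = truncation_check()
print(freq)
```

The observed frequency should hover near $ s = 0.1$, matching the inclusion $ \{S \ne S^{(\le M^*(s))}\} \subseteq \{M > M^*(s)\}$.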

To show part (iv), we may suppose without loss of generality that $ \alpha=1$ and $ \beta > 1$. Clearly $ (S^{(\le M^*(t))})^*(t) \ge (S^{(\le M^*(t))})^*(\beta t)$, so we need only show the opposite inequality. From part (ii), there are positive constants $ c_1$ and $ c_2$, depending only upon the strong Lévy constants of $ (X_n)$, such that for $ 0\le t \le 1/2$

$\displaystyle (S^{(\le M^*(t))})^*(t) \le c_1 (S^{(\le M^*(c_2^{-1}\beta^{-1} t))})^*(c_2^{-1} t) \le c_1 (S^{(\le M^*(c_3^{-1} t))})^*(c_3^{-1} \beta t) ,$

where $ c_3 = c_2 \beta$.

Part (v) follows easily by combining part (iii), part (iv), and Proposition 2.1.


Stephen Montgomery-Smith 2002-10-30