Stationary increments

In probability theory, a stochastic process is said to have stationary increments if its change over a time interval depends only on the length of that interval, not on when the observation started. Many large families of stochastic processes have stationary increments, either by definition (e.g. Lévy processes) or by construction (e.g. random walks).

Definition

A stochastic process $X=(X_t)_{t\geq 0}$ has stationary increments if for all $t\geq 0$ and $h>0$, the distribution of the random variables

$$Y_{t,h}:=X_{t+h}-X_t$$

depends only on $h$ and not on $t$.[1][2]
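The definition can be checked empirically for a concrete process. The following sketch (the function name `simulate_wiener_paths` is our own, not from any library) simulates standard Wiener process paths and compares the increment over a span $h$ started at two different times $t$; by stationarity of increments, both samples should follow the same $N(0,h)$ distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_wiener_paths(n_paths, n_steps, dt, rng):
    """Simulate standard Wiener process paths on a grid of step size dt."""
    steps = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    # Prepend X_0 = 0 and accumulate the independent Gaussian steps.
    return np.concatenate([np.zeros((n_paths, 1)), steps.cumsum(axis=1)], axis=1)

dt = 0.01
paths = simulate_wiener_paths(n_paths=50_000, n_steps=1_000, dt=dt, rng=rng)

# Increment over a span h = 1.0 (100 grid steps), started at two different t.
h_steps = 100
inc_t0 = paths[:, 0 + h_steps] - paths[:, 0]      # t = 0
inc_t5 = paths[:, 500 + h_steps] - paths[:, 500]  # t = 5

# Both increment samples are N(0, h): same mean and variance, regardless of t.
assert abs(inc_t0.mean()) < 0.05 and abs(inc_t5.mean()) < 0.05
assert abs(inc_t0.var() - 1.0) < 0.05 and abs(inc_t5.var() - 1.0) < 0.05
```

The tolerances are loose Monte Carlo bounds; with 50,000 paths the standard error of each estimate is well below 0.05.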

Examples

Having stationary increments is a defining property for many large families of stochastic processes, such as the Lévy processes. As special Lévy processes, both the Wiener process and the Poisson process have stationary increments. Other families of stochastic processes, such as random walks, have stationary increments by construction.

An example of a stochastic process with stationary increments that is not a Lévy process is given by $X=(X_t)$, where the $X_t$ are independent and identically distributed random variables following a normal distribution with mean zero and variance one. Then the increments $Y_{t,h}=X_{t+h}-X_t$ are independent of $t$, as they have a normal distribution with mean zero and variance two. In this special case, the increments are even independent of the duration of observation $h$ itself.
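The claim about this i.i.d. example can be verified directly: for a sequence of independent $N(0,1)$ variables, the difference of any two terms is $N(0,2)$, whatever $t$ and $h$ are. A minimal sketch:

```python
import numpy as np

# Increments of an i.i.d. standard-normal sequence (X_t). For any t and h,
# Y_{t,h} = X_{t+h} - X_t is a difference of two independent N(0,1)
# variables, hence N(0, 2) regardless of both t and h.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, size=1_000_000)

# Increments for two different spans h; each array covers all starting times t.
y_a = x[3:] - x[:-3]   # h = 3
y_b = x[7:] - x[:-7]   # h = 7

# Var(X_{t+h} - X_t) = Var(X_{t+h}) + Var(X_t) = 2 in both cases.
assert abs(y_a.var() - 2.0) < 0.02
assert abs(y_b.var() - 2.0) < 0.02
```

Note that this process is not a Lévy process: increments over overlapping intervals (e.g. $Y_{t,2}$ and $Y_{t+1,2}$) share a common $X$ term and so are not independent.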

Generalized definition for complex index sets

The concept of stationary increments can be generalized to stochastic processes with more complex index sets $T$. Let $X=(X_t)_{t\in T}$ be a stochastic process whose index set $T\subset\mathbb{R}$ is closed with respect to addition. Then it has stationary increments if for any $p,q,r\in T$, the random variables

$$Y_1=X_{p+q+r}-X_{q+r}$$

and

$$Y_2=X_{p+r}-X_r$$

have identical distributions. If $0\in T$, it is sufficient to consider $r=0$.[1]
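The generalized definition can be illustrated with $T=\{0,1,2,\dots\}$ and a simple ±1 random walk (the helper name `walk_increment` is illustrative, not standard). Both $Y_1$ and $Y_2$ are sums of $p$ independent steps, so their distributions agree:

```python
import numpy as np

rng = np.random.default_rng(2)

def walk_increment(start, length, n_samples, rng):
    """Sample X_{start+length} - X_{start} for a ±1 random walk with X_0 = step 1."""
    steps = rng.choice([-1, 1], size=(n_samples, start + length))
    walks = steps.cumsum(axis=1)
    if start == 0:
        return walks[:, length - 1]
    # The increment depends only on the `length` steps inside the window,
    # so its distribution cannot depend on `start`.
    return walks[:, start + length - 1] - walks[:, start - 1]

# Y_1 = X_{p+q+r} - X_{q+r} and Y_2 = X_{p+r} - X_r with p = 4, q = 3, r = 2.
p, q, r = 4, 3, 2
y1 = walk_increment(start=q + r, length=p, n_samples=200_000, rng=rng)
y2 = walk_increment(start=r, length=p, n_samples=200_000, rng=rng)

# Both are sums of p independent ±1 steps: mean 0, variance p.
assert abs(y1.mean()) < 0.05 and abs(y2.mean()) < 0.05
assert abs(y1.var() - p) < 0.1 and abs(y2.var() - p) < 0.1
```

Since $0\in T$ here, comparing against the $r=0$ case alone would already suffice, as noted above.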

References

  1. ^ a b Klenke, Achim (2008). Probability Theory. Berlin: Springer. p. 190. doi:10.1007/978-1-84800-048-3. ISBN 978-1-84800-047-6.
  2. ^ Kallenberg, Olav (2002). Foundations of Modern Probability (2nd ed.). New York: Springer. p. 290.