Chapter 1: Basic Concepts of Stochastic Processes
Section 1: The Definition of a Stochastic Process
Part 1: Random Variables
Before introducing stochastic processes, let us first recall how random variables are defined. This material belongs to probability theory; here we only briefly review the concepts that will be useful later.
A basic concept in probability theory is the random experiment, an experiment whose outcome cannot be determined in advance. The set of all possible elementary outcomes of a random experiment is called the sample space of the experiment, denoted by \(S\). A random variable is a real-valued, single-valued function defined on the sample space \(S\): it assigns a real number to each outcome in \(S\).
Definition of a random variable: Let the sample space of a random experiment be \(S\). If \(X=X(e)\) is a real-valued, single-valued function defined on the sample space \(S\), then \(X=X(e)\) is called a random variable. Here \(e\in S\) denotes an element of the sample space, called a sample point.
In mapping notation, we can write a random variable as \(X(e):S\to \mathbb{R}\).
Part 2: Stochastic Processes
A stochastic process is a family of random variables; it is mainly used to describe random phenomena that evolve over time.
Definition of a stochastic process: Let \(S\) be a sample space and \(T\subset\mathbb{R}\). If for every \(t\in T\), \(X(t)\) is a random variable on \(S\), then \(\{X(t):t\in T\}\) is called a stochastic process on \(S\), and \(T\) is called the time parameter space.
In mapping notation, we can write the stochastic process as \(X(t,e):T\times S\to\mathbb{R}\). A stochastic process is therefore a real-valued, single-valued function of two arguments. Such a definition is somewhat abstract, however; it is easier to understand by fixing each argument separately.
- For any fixed \(t\in T\), \(X(t,\cdot)\) is a function from \(S\) to \(\mathbb{R}\), that is, a random variable on \(S\); it represents the state of the process at time \(t\).
- For any fixed \(e\in S\), \(X(\cdot,e)\) is a function from \(T\) to \(\mathbb{R}\), called a sample path (or sample curve) of the process; it represents one realization of the process over \(T\). Both views are illustrated in the sketch after this list.
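To make these two viewpoints concrete, here is a minimal simulation sketch (an added illustration, not part of the original notes), assuming a Gaussian random walk as the example process; each row of the array is a sample path, and each column collects the states \(X(t)\) across outcomes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example process: a Gaussian random walk
# X(t) = Z_1 + ... + Z_t with i.i.d. standard normal increments Z_i.
n_paths = 5    # number of simulated outcomes e
n_steps = 10   # time parameter space T = {1, 2, ..., 10}

increments = rng.standard_normal((n_paths, n_steps))
paths = np.cumsum(increments, axis=1)  # row i is the sample path X(., e_i)

# Fixing e: one row is a sample path, a real-valued function of t.
print("one sample path X(., e):", np.round(paths[0], 3))

# Fixing t: one column is the random variable X(t), one value per outcome e.
t_index = 3  # the time t = 4
print("states X(4) across outcomes:", np.round(paths[:, t_index], 3))
```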
We call the set of all possible values of \(X(t)\) the state space, denoted by \(I\). According to whether the time parameter space \(T\) and the state space \(I\) are discrete or continuous, stochastic processes can be divided into four cases: discrete-time discrete-state, discrete-time continuous-state, continuous-time discrete-state, and continuous-time continuous-state processes.
Section 2: Finite-Dimensional Distributions and Numerical Characteristics
Part 1: Finite-Dimensional Distributions
Because the state of a stochastic process at any time is a random variable, we can also characterize the statistical properties of a stochastic process by probability distributions and numerical characteristics. We first introduce the probability distribution of a stochastic process, which requires the concept of finite-dimensional distributions.
One-dimensional distribution function: Given a stochastic process \(\{X(t):t\in T\}\), for each fixed \(t\in T\) the distribution function of the random variable \(X(t)\) generally depends on \(t\); we write it as
\[F_X(x;t)=P\{X(t)\le x\},\qquad x\in\mathbb{R},\]
and call it the one-dimensional distribution function of the stochastic process \(\{X(t):t\in T\}\). As \(t\) ranges over all elements of \(T\), we obtain a collection of distribution functions; the set of all of them is called the one-dimensional distribution function family, written \(\left\{F_X(x;t):t\in T\right\}\). The one-dimensional distribution function family characterizes the statistical properties of the process at each individual time.
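As a simple illustrative example (added here, not taken from the original notes), suppose \(X(t)=tY\) for \(t>0\), where \(Y\) is uniformly distributed on \([0,1]\). Then for each fixed \(t>0\),
\[F_X(x;t)=P\{X(t)\le x\}=P\{Y\le x/t\}=\begin{cases}0, & x<0,\\ x/t, & 0\le x<t,\\ 1, & x\ge t,\end{cases}\]
so the one-dimensional distribution function genuinely depends on \(t\), and the one-dimensional distribution function family is \(\{F_X(x;t):t>0\}\).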
Two-dimensional distribution function: For fixed \(s,t\in T\), the joint distribution function of the two-dimensional random variable \((X(s),\,X(t))\) is written as
\[F_{s,t}(x_1,x_2)=P\{X(s)\le x_1,\,X(t)\le x_2\},\]
and is called the two-dimensional distribution function of the stochastic process \(\{X(t):t\in T\}\).
\(n\)-dimensional distribution function: To describe the relationship between the states of the process at different times, for any \(n\) distinct times \(t_1,t_2,\cdots,t_n\in T\) we introduce the distribution function of the \(n\)-dimensional random variable \((X(t_1),X(t_2),\cdots,X(t_n))\), written as
\[F_{t_1,t_2,\cdots,t_n}(x_1,x_2,\cdots,x_n)=P\{X(t_1)\le x_1,\,X(t_2)\le x_2,\cdots,X(t_n)\le x_n\},\]
and called the \(n\)-dimensional distribution function of \(\{X(t):t\in T\}\). For fixed \(n\), the set of all \(n\)-dimensional distribution functions is likewise called the \(n\)-dimensional distribution function family, written \(\{F_{t_1,t_2,\cdots,t_n}(x_1,x_2,\cdots,x_n):t_i\in T,\,i=1,2,\cdots,n\}\). When \(n\) is sufficiently large, the \(n\)-dimensional distribution functions can describe the statistical properties of the process approximately. All of the \(n\)-dimensional distributions, for every \(n\), are collectively referred to as the finite-dimensional distributions.
If we further let \(n\) range over all positive integers and collect the elements of the distribution function families of every dimension, we obtain a larger set. This is the family of finite-dimensional distribution functions of the stochastic process \(\{X(t):t\in T\}\), written as
\[\{F_{t_1,t_2,\cdots,t_n}(x_1,x_2,\cdots,x_n):t_1,t_2,\cdots,t_n\in T,\ n\in\mathbb{Z}^+\}.\]
To aid understanding, we can also express the family of finite-dimensional distribution functions as a union of sets:
\[\bigcup_{n=1}^{\infty}\{F_{t_1,t_2,\cdots,t_n}(x_1,x_2,\cdots,x_n):t_1,t_2,\cdots,t_n\in T\}.\]
Kolmogorov's theorem: The family of finite-dimensional distribution functions completely determines the statistical properties of a stochastic process.
The above covers the probability distribution of a stochastic process. The ideas are fairly abstract and need to be practiced through exercises. One point deserves emphasis: the random variables of a stochastic process at different times are not necessarily independent, and their joint distribution must be computed from the properties of the specific process, as the simulation below illustrates.
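The following small Monte Carlo sketch (an added illustration; the Gaussian random walk is an assumed example, not from the original text) compares a two-dimensional probability with the product of the corresponding one-dimensional probabilities, showing that the states at times \(1\) and \(2\) are not independent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed example: a Gaussian random walk observed at times t = 1 and t = 2,
# i.e. X(1) = Z1 and X(2) = Z1 + Z2 with Z1, Z2 i.i.d. standard normal.
n = 200_000
z1 = rng.standard_normal(n)
z2 = rng.standard_normal(n)
x1, x2 = z1, z1 + z2

# Probability from the two-dimensional distribution versus the product of
# the one-dimensional (marginal) probabilities.
p_joint = np.mean((x1 <= 0) & (x2 <= 0))
p_product = np.mean(x1 <= 0) * np.mean(x2 <= 0)
print(f"P(X(1)<=0, X(2)<=0)   ~ {p_joint:.3f}")    # roughly 0.375
print(f"P(X(1)<=0)*P(X(2)<=0) ~ {p_product:.3f}")  # roughly 0.25
```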
Part 2: Numerical Characteristics
Kolmogorov's theorem tells us that the family of finite-dimensional distribution functions contains all the information about a stochastic process. In practice, however, it is difficult to determine the complete family of finite-dimensional distribution functions, so we introduce some numerical characteristics that reflect the main properties of the process. In general, the numerical characteristics of a stochastic process are functions defined on the time parameter space \(T\).
For a stochastic process \(\{X(t):t\in T\}\), we mainly study its mean function, variance function, autocovariance function, and autocorrelation function. We also introduce a few other numerical characteristics induced by these, such as the second-order moment (mean-square) function and the standard deviation function.
- Mean function: \(\mu_X(t)={\rm E}[X(t)]\).
- Second-order moment (mean-square) function: \(\psi_X^2(t)={\rm E}\left[X^2(t)\right]\).
- Variance function: \(\sigma_X^2(t)={\rm Var}(X(t))\).
- Standard deviation function: \(\sigma_X(t)=\sqrt{\sigma_X^2(t)}\).
- Autocorrelation function: \(r_X(s,t)={\rm E}[X(s)X(t)]\).
- Autocovariance function: \(C_X(s,t)={\rm Cov}(X(s),X(t))\).
Using the algebraic properties of expectation, variance, and covariance, we obtain the following relationships among the numerical characteristics of a stochastic process:
- The second-order moment function and the autocorrelation function satisfy \(\psi_X^2(t)=r_X(t,t)\).
- The variance function and the autocovariance function satisfy \(\sigma_X^2(t)=C_X(t,t)\).
- The autocovariance function and the autocorrelation function satisfy \(C_X(s,t)=r_X(s,t)-\mu_X(s)\,\mu_X(t)\).
- The autocorrelation function and the autocovariance function are both symmetric: \(r_X(s,t)=r_X(t,s)\) and \(C_X(s,t)=C_X(t,s)\).
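As a worked example of these relationships (an added illustration under assumed conditions, not taken from the original notes), consider \(X(t)=A\cos\omega t+B\sin\omega t\), where \(A\) and \(B\) are uncorrelated random variables with \({\rm E}[A]={\rm E}[B]=0\) and \({\rm Var}(A)={\rm Var}(B)=\sigma^2\). Then
\[\mu_X(t)=\cos(\omega t)\,{\rm E}[A]+\sin(\omega t)\,{\rm E}[B]=0,\]
\[r_X(s,t)={\rm E}[X(s)X(t)]=\sigma^2\bigl(\cos\omega s\cos\omega t+\sin\omega s\sin\omega t\bigr)=\sigma^2\cos\omega(s-t),\]
and consequently \(C_X(s,t)=r_X(s,t)-\mu_X(s)\mu_X(t)=\sigma^2\cos\omega(s-t)\), \(\psi_X^2(t)=r_X(t,t)=\sigma^2\), and \(\sigma_X^2(t)=C_X(t,t)=\sigma^2\).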
Part 3: Special Stochastic Processes
Among the numerical characteristics of a stochastic process, we pay the most attention to the mean function and the autocovariance function. On the one hand, the other numerical characteristics can be derived from these two; on the other hand, the mean function and the autocovariance function already summarize the core properties of the process. Starting from these two numerical characteristics, we introduce some special stochastic processes.
Second-order moment process: If for every \(t\in T\) the second moment \({\rm E}[X^2(t)]\) of the stochastic process \(\{X(t):t\in T\}\) exists, then the process is called a second-order moment process. Here the existence of the second moment \({\rm E}[X^2(t)]\) means \({\rm E}[X^2(t)]<\infty\). One can show that the mean function, variance function, autocorrelation function, and autocovariance function of a second-order moment process all exist; for the autocorrelation function this follows from the Cauchy-Schwarz inequality, \(\left|{\rm E}[X(s)X(t)]\right|\le\sqrt{{\rm E}[X^2(s)]\,{\rm E}[X^2(t)]}<\infty\).
Normal process: For a stochastic process \(\{X(t):t\in T\}\), if every finite-dimensional distribution is normal, the process is called a normal process or Gaussian process. A normal process is a special second-order moment process. The statistical properties of a normal process are completely determined by its mean function and autocovariance function; that is, the finite-dimensional distributions of a normal process are completely determined by these two functions.
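Because the finite-dimensional distributions of a normal process on any finite set of times \(t_1,\cdots,t_n\) are multivariate normal with mean vector \((\mu_X(t_i))\) and covariance matrix \((C_X(t_i,t_j))\), a sample path can be simulated on a time grid directly from these two functions. Below is a minimal sketch (added for illustration), assuming the zero mean function and the covariance function \(C_X(s,t)=\min(s,t)\) of standard Brownian motion as the example:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_normal_process(mean_fn, cov_fn, times, rng):
    """Draw one sample path of a normal (Gaussian) process on a finite time grid.

    On the grid, the finite-dimensional distribution is multivariate normal with
    mean vector (mean_fn(t_i)) and covariance matrix (cov_fn(t_i, t_j)).
    """
    mu = np.array([mean_fn(t) for t in times])
    cov = np.array([[cov_fn(s, t) for t in times] for s in times])
    return rng.multivariate_normal(mu, cov)

# Assumed example: mean function 0 and C_X(s, t) = min(s, t) (standard Brownian motion).
times = np.linspace(0.02, 1.0, 50)
path = sample_normal_process(lambda t: 0.0, lambda s, t: min(s, t), times, rng)
print(np.round(path[:5], 3))
```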
White noise process: Let \(\{X(t):t\in T\}\) be a zero-mean stochastic process. If \(r_X(s,t)=0\) for any \(s\neq t\), the process is called a white noise process. Since the mean function is zero, this is equivalent to requiring \(C_X(s,t)=0\) for \(s\neq t\); for example, a sequence of independent zero-mean random variables is a white noise process.
Section 3: Two-Dimensional Stochastic Processes
Part 1: Two-Dimensional Stochastic Processes
In practice, we sometimes need to study two or more stochastic processes and the statistical relationships among them. Besides studying the statistical properties of each process separately, we also need to treat the processes as a whole and study their joint statistical properties. Here we mainly discuss the definition of a two-dimensional stochastic process and the independence of stochastic processes.
Definition of a two-dimensional stochastic process: Let \(\{X(t):t\in T\}\) and \(\{Y(t):t\in T\}\) be stochastic processes depending on the same time parameter \(t\in T\). If for every \(t\in T\), \((X(t),Y(t))\) is a two-dimensional random vector, then \(\{(X(t),Y(t)):t\in T\}\) is called a two-dimensional stochastic process.
Independence of stochastic processes: For stochastic processes \(\{X(t):t\in T\}\) and \(\{Y(t):t\in T\}\), if for any positive integers \(n\) and \(m\) and any times \(t_1,t_2,\cdots,t_n\in T\) and \(t_1',t_2',\cdots,t_m'\in T\), the \(n\)-dimensional random vector \((X(t_1),X(t_2),\cdots,X(t_n))\) and the \(m\)-dimensional random vector \((Y(t_1'),Y(t_2'),\cdots,Y(t_m'))\) are independent of each other, then the processes \(\{X(t)\}\) and \(\{Y(t)\}\) are said to be mutually independent.
Part 2: Numerical Characteristics of Two-Dimensional Stochastic Processes
The statistical properties of a two-dimensional stochastic process can also be characterized by probability distributions and numerical characteristics. Because the finite-dimensional distributions of a two-dimensional process are not often used, and are usually hard to find in practical problems, we only introduce its numerical characteristics. Besides the mean function and autocorrelation function of each individual process, we also need the cross-correlation function and the cross-covariance function, which describe the relationship between the two component processes of a two-dimensional stochastic process.
- Cross-correlation function: \(r_{XY}(s,t)={\rm E}\left[X(s)Y(t)\right]\).
- Cross-covariance function: \(C_{XY}(s,t)={\rm Cov}(X(s),Y(t))\).
Uncorrelated stochastic processes: For stochastic processes \(\{X(t):t\in T\}\) and \(\{Y(t):t\in T\}\), if \(C_{XY}(s,t)=0\) for any \(s,t\in T\), the processes \(\{X(t)\}\) and \(\{Y(t)\}\) are said to be uncorrelated.
In general, if the processes \(\{X(t)\}\) and \(\{Y(t)\}\) are uncorrelated, we cannot conclude that they are mutually independent. But if \(\{X(t)\}\) and \(\{Y(t)\}\) are mutually independent and both are second-order moment processes, then they must be uncorrelated, since independence gives \({\rm E}[X(s)Y(t)]={\rm E}[X(s)]\,{\rm E}[Y(t)]\) and hence \(C_{XY}(s,t)=0\).
Last but not least, before using the cross-correlation function and cross-covariance function of two processes \(\{X(t)\}\) and \(\{Y(t)\}\), we must make sure that each process is itself a second-order moment process, i.e. for every \(t\in T\), \({\rm E}[X^2(t)]<\infty\) and \({\rm E}[Y^2(t)]<\infty\).
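As a closing illustration (an added sketch using assumed example processes, not part of the original notes), the cross-covariance function can be estimated by averaging over many simulated realizations of a two-dimensional process:

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed example: X is a Gaussian random walk and Y(t) = X(t) + independent noise,
# so the two component processes are dependent and hence correlated.
n_paths, n_steps = 50_000, 5
x = np.cumsum(rng.standard_normal((n_paths, n_steps)), axis=1)
y = x + rng.standard_normal((n_paths, n_steps))

def cross_cov(x, y, s_index, t_index):
    """Monte Carlo estimate of C_XY(s, t) = Cov(X(s), Y(t)) across realizations."""
    xs, yt = x[:, s_index], y[:, t_index]
    return np.mean((xs - xs.mean()) * (yt - yt.mean()))

# Times s = 2 and t = 4 correspond to array indices 1 and 3.
print(f"estimated C_XY(2, 4) = {cross_cov(x, y, 1, 3):.3f}")  # close to min(2, 4) = 2
```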