Zhejiang University Probability (4th Edition): proof that uncorrelatedness of jointly normal random variables is equivalent to independence
2022-07-24 13:58:00 【Blue field soldier】
Proposition to prove:

For random variables $X, Y$ that each follow a normal distribution and whose joint distribution is also normal, $X, Y$ uncorrelated $\Leftrightarrow$ $X, Y$ independent.
1. If $X, Y$ are independent, then $X, Y$ are uncorrelated.
Since $X, Y$ are independent, $E(XY)=E(X)E(Y)$, so the covariance $Cov(X,Y)=E(XY)-E(X)E(Y)$ equals $0$. The correlation coefficient is $\rho = \dfrac{Cov(X,Y)}{\sqrt{D(X)}\sqrt{D(Y)}}$, so $\rho=0$, i.e. $X, Y$ are uncorrelated.
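As a quick numerical illustration of part 1 (a minimal sketch, not part of the proof; the distribution parameters below are arbitrary choices), independently drawn normal samples have sample covariance and correlation close to $0$:

```python
import numpy as np

# Sanity check: independent normal samples should have sample covariance ~ 0.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=100_000)   # X ~ N(1, 4)
y = rng.normal(loc=-3.0, scale=0.5, size=100_000)  # Y ~ N(-3, 0.25), drawn independently of X

cov_xy = np.cov(x, y)[0, 1]        # sample covariance Cov(X, Y)
rho = np.corrcoef(x, y)[0, 1]      # sample correlation coefficient
print(f"sample Cov(X,Y) = {cov_xy:.4f}, sample rho = {rho:.4f}")  # both close to 0
```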
2. Next, we prove that if jointly normal $X, Y$ are uncorrelated, then they are independent.

Start from the joint probability density function of a multidimensional normal random vector:
$$
f\left( \overrightarrow {N}\right) =\dfrac {1}{\left( 2\pi \right) ^{n/2}\left| \Sigma \right| ^{1/2}}\, e^{-\frac {1}{2}\left( \overrightarrow {N}-\mu \right) ^{T} \Sigma ^{-1}\left( \overrightarrow {N}-\mu \right)}
$$
where $\overrightarrow{N}$ is the random vector formed by the random variables ($n$ rows, $1$ column), $\Sigma$ is the covariance matrix of the random vector, and $\mu$ is the expectation vector of $\overrightarrow{N}$ ($n$ rows, $1$ column).
For a two-dimensional random vector, the covariance matrix is a $2 \times 2$ matrix, and it is not hard to compute its inverse.
The formula for the covariance matrix:
$$
\Sigma=\mathrm{E} \left[ \left( \mathbf{N} - \mathrm{E}[\mathbf{N}] \right) \left( \mathbf{N} - \mathrm{E}[\mathbf{N}] \right)^{T} \right],\qquad \mathbf{N}=[X,Y]^{T}
$$
The inverse can be computed from the formula $A^{-1} = \dfrac{A^{*}}{|A|}$, where $A^{*}$ is the adjugate matrix of $A$, i.e. the transpose of the cofactor matrix; each entry of the cofactor matrix is the algebraic cofactor of the corresponding entry of $A$, namely the determinant of the matrix obtained by deleting that entry's row and column, multiplied by $(-1)^{i+j}$. This involves several linear-algebra concepts; the formula below lets you carry out the derivation step by step (a worked bivariate case follows).
$$
\begin{aligned} A&=\begin{bmatrix} a & b \\ c & d \end{bmatrix} \\ A^{-1}&=\dfrac {1}{ad-bc}\begin{bmatrix} d & -b \\ -c & a \end{bmatrix} \end{aligned}
$$
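As a worked intermediate step (added here; the original leaves it to the reader), write $D(X)=\sigma_1^{2}$, $D(Y)=\sigma_2^{2}$ and $Cov(X,Y)=\rho\sigma_1\sigma_2$, the notation used below. Then:

$$
\Sigma=\begin{bmatrix} \sigma_1^{2} & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^{2} \end{bmatrix},\qquad
|\Sigma|=\sigma_1^{2}\sigma_2^{2}\left(1-\rho^{2}\right),\qquad
\Sigma^{-1}=\dfrac{1}{\sigma_1^{2}\sigma_2^{2}\left(1-\rho^{2}\right)}\begin{bmatrix} \sigma_2^{2} & -\rho\sigma_1\sigma_2 \\ -\rho\sigma_1\sigma_2 & \sigma_1^{2} \end{bmatrix}
$$

Substituting $n=2$, $\overrightarrow{N}=[x,y]^{T}$, $\mu=[\mu_1,\mu_2]^{T}$ and this $\Sigma^{-1}$ into the general density above gives exactly the $f(x,y)$ below.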
Therefore, the probability density function $f(x,y)$ of a two-dimensional normal random variable (i.e. a $2 \times 1$ random vector) is:
$$
\begin{aligned} f\left( x,y\right) &=\dfrac {1}{2\pi \sigma _{1}\sigma _{2}\sqrt {1-\rho ^{2}}}\,e^{q} \\ q&=\dfrac {-1}{2\left( 1-\rho ^{2}\right) }\left[ \dfrac {\left( x-\mu _{1}\right) ^{2}}{\sigma_1^{2}}-2\rho \dfrac {\left( x-\mu _{1}\right) \left( y-\mu _{2}\right) }{\sigma _{1}\sigma _{2}}+\dfrac {\left( y-\mu _2\right) ^{2}}{\sigma ^2_{2}}\right] \\ \rho &=\dfrac {Cov\left( X,Y\right) }{\sigma _{1}\sigma _{2}} \end{aligned}
$$
where $\sigma_1, \sigma_2$ are the standard deviations of $X, Y$, and $\mu_1, \mu_2$ are the expectations of $X, Y$.
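Optionally, the explicit bivariate formula can be checked numerically against scipy's multivariate normal density; this is a minimal sketch with arbitrarily chosen parameter values (not from the original post):

```python
import numpy as np
from scipy.stats import multivariate_normal

# Arbitrary example parameters (assumed for illustration only)
mu1, mu2 = 1.0, -2.0
s1, s2, rho = 2.0, 0.5, 0.3
x, y = 0.7, -1.4

# Explicit bivariate normal density f(x, y) from the formula above
q = -1.0 / (2 * (1 - rho**2)) * (
    (x - mu1)**2 / s1**2
    - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
    + (y - mu2)**2 / s2**2
)
f_formula = np.exp(q) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# Same density via scipy, built from the covariance matrix
cov = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])
f_scipy = multivariate_normal(mean=[mu1, mu2], cov=cov).pdf([x, y])

print(f_formula, f_scipy)  # the two values agree
```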
Meanwhile, the marginal densities and their product are:
$$
\begin{aligned} f\left( x\right) &= \dfrac {1}{\sqrt {2\pi }\,\sigma_1}\, e^{-\frac {\left( x-\mu_1 \right) ^{2}}{2\sigma_1^{2}}}\\ f\left( y\right) &= \dfrac {1}{\sqrt {2\pi }\,\sigma_2}\, e^{-\frac {\left( y-\mu_2 \right) ^{2}}{2\sigma_2^{2}}}\\ f(x)f(y) &= \dfrac {1}{2\pi \sigma _{1}\sigma _{2}}\, e^{-\left[ \frac {\left( x-\mu _{1}\right) ^{2}}{2\sigma_1^{2}}+\frac {\left( y-\mu _2\right) ^{2}}{2\sigma ^2_{2}}\right] } \end{aligned}
$$
It can now be seen that if $\rho=0$, the factor $\sqrt{1-\rho^{2}}$ equals $1$ and the cross term in $q$ vanishes, so $q$ reduces to $-\left[ \dfrac{(x-\mu_1)^2}{2\sigma_1^2}+\dfrac{(y-\mu_2)^2}{2\sigma_2^2}\right]$ and hence $f(x,y)=f(x)f(y)$. That is, for jointly normal $X, Y$, uncorrelated implies independent.
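The factorization at $\rho=0$ can also be verified symbolically; below is a minimal sympy sketch (not from the original post):

```python
import sympy as sp

# Symbolic check (a sketch, not part of the textbook proof): with rho = 0 the
# joint bivariate normal density factorizes into the product of the marginals.
x, y, mu1, mu2 = sp.symbols("x y mu1 mu2", real=True)
s1, s2 = sp.symbols("sigma1 sigma2", positive=True)

rho = 0  # the uncorrelated case
q = sp.Rational(-1, 2) / (1 - rho**2) * (
    (x - mu1)**2 / s1**2
    - 2 * rho * (x - mu1) * (y - mu2) / (s1 * s2)
    + (y - mu2)**2 / s2**2
)
f_xy = sp.exp(q) / (2 * sp.pi * s1 * s2 * sp.sqrt(1 - rho**2))

f_x = sp.exp(-(x - mu1)**2 / (2 * s1**2)) / (sp.sqrt(2 * sp.pi) * s1)
f_y = sp.exp(-(y - mu2)**2 / (2 * s2**2)) / (sp.sqrt(2 * sp.pi) * s2)

print(sp.simplify(f_xy - f_x * f_y))  # expected output: 0
```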
Combining 1 and 2, the left side of the proposition implies the right and the right side implies the left, so the proposition holds.