[The Nine Yang Manual] 2022 Fudan University Applied Statistics Exam: Problems and Solutions
2022-07-06 13:31:00 【Elder martial brother statistics】
Contents
Problems
1. (20 points) A bag contains $a$ red balls, $a$ yellow balls, and $b$ blue balls. Three balls are drawn with replacement. Let $A=\{$a yellow ball and a red ball are both drawn, and a red ball is drawn before a yellow ball$\}$. Find:
(1) (10 points) $P(A)$;
(2) (10 points) If the event $\{$no blue ball is drawn$\}$ has the same probability as $A$, find $a:b$.
2. (10 points) A discrete random variable $X$ has distribution
$$P(X=a)=P(X=b)=P(X=a+1)=\frac{1}{3},$$
where $a<b<a+1$. Find the range of its variance.
3. (20 points) A discrete random variable $X$ takes only the two values $x$ and $x+a$, where $a>0$, and $\mathrm{Var}(X)=1$. Find the range of $a$ and the distribution of $X$.
4. (20 points) The random vector $\left(\begin{array}{c}X\\ Y\end{array}\right)$ is such that, for every rotation, the transformed vector $\left(\begin{matrix}\cos\alpha & \sin\alpha\\ -\sin\alpha & \cos\alpha\end{matrix}\right)\left(\begin{array}{c}X\\ Y\end{array}\right)$ still has the same distribution as $\left(\begin{array}{c}X\\ Y\end{array}\right)$. Solve the following:
(1) Find $P(0<Y<X)$;
(2) Find the distribution of $\frac{Y}{X}$.
5. (20 points) Given $(X,Y)\sim N\left(0,0;1,1;\frac{1}{2}\right)$, find $P(X>0,Y>0)$.
6. (10 points) $X_1,\cdots,X_n,\cdots$ are i.i.d. random variables with finite second moment, and $Y_n=\sum_{i=1}^{n}X_i$. Determine whether $\{\frac{Y_n}{n^2}\}$ obeys the law of large numbers.
7. (10 points) $X_1,\cdots,X_n$ are i.i.d. $N(\mu,\sigma^2)$ random variables with common distribution function $F$. Find the distribution of $-2\sum_{i=1}^{n}\ln F(X_i)$.
8. (10 points) $X_1,\cdots,X_6$ are i.i.d. $U(0,1)$ random variables. Find $\mathrm{Var}\left(2X_{(2)}+3X_{(3)}\right)$.
9. (10 points) $X_1,\cdots,X_n$ is an i.i.d. random sample from $U(0,\theta)$, and $aX_{(1)}$, $bX_{(3)}$ are unbiased estimators of $\theta$. Find $a$ and $b$, and determine which estimator is more efficient.
10. (10 points) Let $X_1,\cdots,X_n$ be an i.i.d. random sample from $N(\mu,16)$, and let the prior distribution of $\mu$ be $N(a,b^2)$. Find the posterior distribution.
11. (10 points) Let $X_1,\cdots,X_n$ be an i.i.d. random sample from $U(0,\theta)$. Consider the hypothesis testing problem
$$H_0:\theta\le 1 \quad \text{vs} \quad H_1:\theta>1$$
with rejection region $W=\{X_{(n)}\ge c\}$. Answer the following:
(1) (5 points) For $\alpha=0.05$, find $c$;
(2) (5 points) When $\theta=1.5$, find the minimum sample size needed so that the probability of a type II error satisfies $\beta\le 0.1$.
Solutions
1. (20 points) A bag contains $a$ red balls, $a$ yellow balls, and $b$ blue balls. Three balls are drawn with replacement. Let $A=\{$a yellow ball and a red ball are both drawn, and a red ball is drawn before a yellow ball$\}$. Find:
(1) (10 points) $P(A)$;
(2) (10 points) If the event $\{$no blue ball is drawn$\}$ has the same probability as $A$, find $a:b$.
Solution:
[Note]: The problem statement can be read either as $A_1=\{$a red ball is drawn before a yellow ball$\}$ or as $A_2=\{$all red balls are drawn before all yellow balls$\}$.
(1) Consider $A_1$ first. We have
$$A_1=\{\text{red-red-yellow},\ \text{red-yellow-red},\ \text{red-yellow-yellow},\ \text{blue-red-yellow},\ \text{red-blue-yellow},\ \text{red-yellow-blue}\},$$
so
$$P(A_1)=\frac{a^3+a^3+a^3+3a^2b}{(2a+b)^3}=\frac{3a^2(a+b)}{(2a+b)^3}.$$
Now consider $A_2$. We have
$$A_2=\{\text{red-red-yellow},\ \text{red-yellow-yellow},\ \text{blue-red-yellow},\ \text{red-blue-yellow},\ \text{red-yellow-blue}\},$$
so
$$P(A_2)=\frac{a^3+a^3+3a^2b}{(2a+b)^3}=\frac{a^2(2a+3b)}{(2a+b)^3}.$$
(2) Let $B=\{$no blue ball is drawn$\}$. Then
$$P(B)=\left(\frac{2a}{2a+b}\right)^3=\frac{8a^3}{(2a+b)^3}.$$
If $P(B)=P(A_1)$, then
$$8a^3=3a^2(a+b)\;\Longrightarrow\;8a=3(a+b)\;\Longrightarrow\;\frac{a}{b}=\frac{3}{5}.$$
If $P(B)=P(A_2)$, then
$$8a^3=a^2(2a+3b)\;\Longrightarrow\;8a=2a+3b\;\Longrightarrow\;\frac{a}{b}=\frac{1}{2}.$$
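These probabilities are easy to sanity-check numerically. The sketch below is an addition for verification, not part of the exam solution; it assumes Python with only the standard library, and the test values `a = 3, b = 5` are simply the ratio found in part (2) for $A_1$.

```python
import random
from fractions import Fraction

def simulate(a, b, trials=200_000, seed=0):
    """Estimate P(A1), P(A2), P(B) for 3 draws with replacement from
    a bag of `a` red, `a` yellow and `b` blue balls."""
    rng = random.Random(seed)
    balls = ["R"] * a + ["Y"] * a + ["B"] * b
    hit_a1 = hit_a2 = hit_b = 0
    for _ in range(trials):
        draw = [rng.choice(balls) for _ in range(3)]
        if "R" in draw and "Y" in draw:
            first_r, first_y = draw.index("R"), draw.index("Y")
            last_r = max(i for i, c in enumerate(draw) if c == "R")
            hit_a1 += first_r < first_y      # a red appears before a yellow
            hit_a2 += last_r < first_y       # all reds appear before all yellows
        hit_b += "B" not in draw
    return hit_a1 / trials, hit_a2 / trials, hit_b / trials

a, b = 3, 5                                  # ratio from part (2) for A1
denom = (2 * a + b) ** 3
print(simulate(a, b))
print(float(Fraction(3 * a**2 * (a + b), denom)),      # exact P(A1)
      float(Fraction(a**2 * (2 * a + 3 * b), denom)),  # exact P(A2)
      float(Fraction(8 * a**3, denom)))                # exact P(B)
```

With $a:b=3:5$ the first and third printed values should essentially coincide, matching $P(B)=P(A_1)$.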
2. (10 points) A discrete random variable $X$ has distribution
$$P(X=a)=P(X=b)=P(X=a+1)=\frac{1}{3},$$
where $a<b<a+1$. Find the range of its variance.
Solution:
Since $\mathrm{Var}(X)=\mathrm{Var}(X-a)$, we may assume that $X$ has distribution
$$P(X=0)=P(X=c)=P(X=1)=\frac{1}{3},$$
where $c=b-a\in(0,1)$. Then $EX=\frac{c+1}{3}$, and
$$EX^2=\frac{c^2+1}{3},\qquad \mathrm{Var}(X)=\frac{3c^2+3-(c+1)^2}{9}=\frac{2}{9}\left[\left(c-\frac{1}{2}\right)^2+\frac{3}{4}\right]\in\left[\frac{1}{6},\frac{2}{9}\right).$$
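As a quick numerical check (an addition, assuming NumPy is available), evaluating the variance formula on a grid of $c$ values reproduces the range $\left[\frac{1}{6},\frac{2}{9}\right)$:

```python
import numpy as np

# Evaluate Var(X) for c on a fine grid in (0,1); the values fill [1/6, 2/9).
c = np.linspace(1e-6, 1 - 1e-6, 100_001)
var = (3 * c**2 + 3 - (c + 1) ** 2) / 9      # Var for P(X=0)=P(X=c)=P(X=1)=1/3
print(var.min(), 1 / 6)                      # minimum 1/6 attained at c = 1/2
print(var.max(), 2 / 9)                      # supremum 2/9, approached but not attained
```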
3. (20 points) A discrete random variable $X$ takes only the two values $x$ and $x+a$, where $a>0$, and $\mathrm{Var}(X)=1$. Find the range of $a$ and the distribution of $X$.
Solution:
The problem constrains only the variance, and $\mathrm{Var}(X)=\mathrm{Var}(X-x)$, so we may assume that $X$ takes only the two values $0$ and $a$. Then
$$\mathrm{Var}(X)=a^2p-a^2p^2=a^2p(1-p)=1,$$
where $p=P(X=a)$. Since $p(1-p)\le\frac{1}{4}$, we need $a^2\ge 4$, i.e. $a\ge 2$. For any fixed such $a$, the value of $p$ is obtained from
$$p^2-p+\frac{1}{a^2}=0\;\Longrightarrow\;p=\frac{1\pm\sqrt{1-\frac{4}{a^2}}}{2}.$$
Hence the distribution of $X$ is
$$P(X=x)=\frac{1+\sqrt{1-\frac{4}{a^2}}}{2},\qquad P(X=x+a)=\frac{1-\sqrt{1-\frac{4}{a^2}}}{2},$$
or
$$P(X=x)=\frac{1-\sqrt{1-\frac{4}{a^2}}}{2},\qquad P(X=x+a)=\frac{1+\sqrt{1-\frac{4}{a^2}}}{2}.$$
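A small verification of the relation $a^2p(1-p)=1$ (an addition; the value $a=3$ is an arbitrary choice with $a\ge 2$):

```python
import math

# Check that the two-point distribution built from the formula has variance 1.
a = 3.0
p = (1 - math.sqrt(1 - 4 / a**2)) / 2        # one of the two admissible roots
x = 0.0                                      # the location x does not affect the variance
values, probs = [x, x + a], [1 - p, p]
mean = sum(v * q for v, q in zip(values, probs))
var = sum((v - mean) ** 2 * q for v, q in zip(values, probs))
print(p, var)                                # var prints 1.0 up to rounding
```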
4. (20 points) The random vector $\left(\begin{array}{c}X\\ Y\end{array}\right)$ is such that, for every rotation, the transformed vector $\left(\begin{matrix}\cos\alpha & \sin\alpha\\ -\sin\alpha & \cos\alpha\end{matrix}\right)\left(\begin{array}{c}X\\ Y\end{array}\right)$ still has the same distribution as $\left(\begin{array}{c}X\\ Y\end{array}\right)$. Solve the following:
(1) Find $P(0<Y<X)$;
(2) Find the distribution of $\frac{Y}{X}$.
Solution:
(1) Suppose $(X,Y)$ has joint density $f_{X,Y}(x,y)$. For the rotation $(U,V)=(X,Y)A^{T}$, the change-of-variables formula gives
$$f_{U,V}(u,v)=f_{X,Y}\big((u,v)A\big)\,|A|=f_{X,Y}\big((u,v)A\big).$$
On the other hand, $(U,V)$ has the same distribution as $(X,Y)$, so $f_{U,V}(u,v)=f_{X,Y}(u,v)$. Combining the two,
$$f_{X,Y}(x,y)=f_{X,Y}\big((x,y)A\big)$$
for every rotation $A$, which shows that there is a function $u$ with $f_{X,Y}(x,y)=u\big(\sqrt{x^2+y^2}\big)$. Now apply the polar transformation
$$\begin{cases}X=R\cos\Theta,\\ Y=R\sin\Theta.\end{cases}$$
By the change-of-variables formula, the joint density of $(R,\Theta)$ is
$$f_{R,\Theta}(r,\theta)=f_{X,Y}(r\cos\theta,r\sin\theta)\,r=r\,u(r),\qquad r\in(0,+\infty),\ \theta\in(0,2\pi),$$
which factorizes, so $R$ and $\Theta$ are independent and $f_{\Theta}(\theta)$ is constant, i.e. $\Theta\sim U(0,2\pi)$. Therefore
$$P(0<Y<X)=P\left(\Theta\in\left(0,\frac{\pi}{4}\right)\right)=\frac{1}{8}.$$
(2) $T=\frac{Y}{X}=\tan\Theta$ follows the standard Cauchy distribution $C(0,1)$. This can be shown by the distribution-function method; note that $\Theta$ takes values in $(0,2\pi)$, not in the range of $\arctan$, so the cases must be handled carefully. For any $t>0$,
$$\{T\le t\}=\{\tan\Theta\le t\}=\left\{\Theta\in\left(0,\arctan t\right]\cup\left(\frac{\pi}{2},\arctan t+\pi\right]\cup\left(\frac{3\pi}{2},2\pi\right)\right\},$$
so $P(T\le t)=\frac{\pi+2\arctan t}{2\pi}=\frac{1}{2}+\frac{\arctan t}{\pi}$ for $t>0$. Similarly, for any $t<0$,
$$\{T\le t\}=\{\tan\Theta\le t\}=\left\{\Theta\in\left(\frac{\pi}{2},\arctan t+\pi\right]\cup\left(\frac{3\pi}{2},\arctan t+2\pi\right)\right\},$$
so again $P(T\le t)=\frac{\pi+2\arctan t}{2\pi}=\frac{1}{2}+\frac{\arctan t}{\pi}$ for $t<0$. In summary,
$$F_T(t)=\frac{1}{2}+\frac{\arctan t}{\pi},\qquad t\in\mathbb{R},$$
and differentiating gives
$$f_T(t)=\frac{1}{\pi(1+t^2)},\qquad t\in\mathbb{R},$$
which is the standard Cauchy density.
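For a concrete check (an addition, assuming NumPy), one rotation-invariant example is $(X,Y)$ with i.i.d. $N(0,1)$ components; the simulation below verifies $P(0<Y<X)=\frac{1}{8}$ and compares the empirical distribution of $Y/X$ with $F_T$:

```python
import numpy as np

# Rotation-invariant example: X, Y i.i.d. standard normal.
rng = np.random.default_rng(0)
x, y = rng.standard_normal(10**6), rng.standard_normal(10**6)

print(np.mean((y > 0) & (y < x)), 1 / 8)       # P(0 < Y < X) should be near 0.125

t_grid = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
emp = [(y / x <= t).mean() for t in t_grid]    # empirical CDF of Y/X
theo = 0.5 + np.arctan(t_grid) / np.pi         # F_T(t) for the standard Cauchy
print(np.round(emp, 4), np.round(theo, 4))
```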
5. (20 points) Given $(X,Y)\sim N\left(0,0;1,1;\frac{1}{2}\right)$, find $P(X>0,Y>0)$.
Solution:
Let $W=\frac{2}{\sqrt{3}}\left(Y-\frac{1}{2}X\right)$. Then $EW=0$, $\mathrm{Var}(W)=1$, and $\mathrm{Cov}(X,W)=0$, so $X$ and $W$ are independent standard normal variables. Next, by symmetry,
$$P(X>0,Y>0)=P(-X>0,-Y>0)=P(X<0,Y<0),$$
hence
$$P(X>0,Y>0)=\frac{1}{2}P(XY>0)=\frac{1}{2}P\left(\frac{Y}{X}>0\right).$$
Rewriting this event in terms of $W$,
$$\left\{\frac{Y}{X}>0\right\}=\left\{\frac{\frac{\sqrt{3}}{2}W+\frac{1}{2}X}{X}>0\right\}=\left\{\frac{W}{X}>-\frac{1}{\sqrt{3}}\right\}.$$
Since $\frac{W}{X}$ follows the standard Cauchy distribution,
$$P(X>0,Y>0)=\frac{1}{2}\int_{-\frac{\sqrt{3}}{3}}^{+\infty}\frac{1}{\pi(1+t^2)}\,dt=\frac{1}{3}.$$
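A simulation check of the answer $\frac{1}{3}$ (an addition, assuming NumPy); $Y$ is generated as $\rho X+\sqrt{1-\rho^2}Z$ with $\rho=\frac{1}{2}$ and $Z$ an independent standard normal:

```python
import numpy as np

# Simulate (X, Y) ~ N(0, 0; 1, 1; rho = 1/2) and estimate P(X > 0, Y > 0).
rng = np.random.default_rng(1)
rho, n = 0.5, 10**6
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)
print(np.mean((x > 0) & (y > 0)), 1 / 3)       # the two numbers should be close
```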
6. (10 points) $X_1,\cdots,X_n,\cdots$ are i.i.d. random variables with finite second moment, and $Y_n=\sum_{i=1}^{n}X_i$. Determine whether $\{\frac{Y_n}{n^2}\}$ obeys the law of large numbers.
Solution:
[Method 1]: Let $Z_n=\frac{Y_n}{n^2}$ and compute covariances directly. First,
$$\mathrm{Cov}(Y_k,Y_{k+l})=\mathrm{Cov}\left(\sum_{j=1}^{k}X_j,\sum_{j=1}^{k+l}X_j\right)=k\,\mathrm{Var}(X_1).$$
Consequently, the variances $\mathrm{Var}(Z_k)=\frac{\mathrm{Var}(X_1)}{k^3}$ are uniformly bounded, and as $l\to\infty$,
$$\mathrm{Cov}(Z_k,Z_{k+l})=\frac{1}{k^2(k+l)^2}\mathrm{Cov}(Y_k,Y_{k+l})=\frac{\mathrm{Var}(X_1)}{k(k+l)^2}\to 0,$$
so by Bernstein's condition, $\{\frac{Y_n}{n^2}\}$ obeys the law of large numbers.
[Method 2]: By the strong law of large numbers, $Z_n=\frac{1}{n}\cdot\frac{Y_n}{n}\to 0\cdot EX_1=0$ a.s., and then by the Stolz theorem,
$$\lim_{n\rightarrow\infty}\frac{\sum_{k=1}^{n}Z_k}{n}=\lim_{n\rightarrow\infty}Z_n=0,\quad\text{a.s.}$$
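The conclusion can be illustrated numerically (an addition, assuming NumPy; the exponential distribution below is an arbitrary choice of a distribution with finite second moment):

```python
import numpy as np

# Z_n = Y_n / n^2 and its running average both tend to 0 as n grows.
rng = np.random.default_rng(2)
n = 10**5
x = rng.exponential(1.0, size=n)               # i.i.d. X_i with finite second moment
y = np.cumsum(x)                               # partial sums Y_1, ..., Y_n
z = y / np.arange(1, n + 1) ** 2               # Z_k = Y_k / k^2
running_avg = np.cumsum(z) / np.arange(1, n + 1)
print(z[-1], running_avg[-1])                  # both should be near 0
```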
7. (10 points) $X_1,\cdots,X_n$ are i.i.d. $N(\mu,\sigma^2)$ random variables with common distribution function $F$. Find the distribution of $-2\sum_{i=1}^{n}\ln F(X_i)$.
Solution:
First, $Y_i=F(X_i)\sim U(0,1)$, so it suffices to find the distribution of $Z_1=-2\ln Y_1$. By the distribution-function method, for $z>0$,
$$P(Z_1\le z)=P(-2\ln Y_1\le z)=P\left(Y_1\ge e^{-z/2}\right)=1-e^{-z/2},$$
which is the exponential distribution with rate $\frac{1}{2}$ (mean $2$), i.e. the $\chi^2(2)$ distribution. By the additivity of the chi-square distribution for independent summands,
$$-2\sum_{i=1}^{n}\ln F(X_i)\sim\chi^2(2n).$$
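A simulation check (an addition; it assumes SciPy is available for the normal CDF): the sample mean and variance of $-2\sum_{i}\ln F(X_i)$ should be close to $2n$ and $4n$, the mean and variance of $\chi^2(2n)$:

```python
import numpy as np
from scipy import stats

# With X_i ~ N(mu, sigma^2), -2 * sum(log F(X_i)) should follow chi-square(2n).
rng = np.random.default_rng(3)
mu, sigma, n, reps = 1.0, 2.0, 5, 10**5
x = rng.normal(mu, sigma, size=(reps, n))
t = -2 * np.log(stats.norm.cdf(x, loc=mu, scale=sigma)).sum(axis=1)
print(t.mean(), 2 * n)                         # chi2(2n) has mean 2n ...
print(t.var(), 4 * n)                          # ... and variance 4n
```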
8. (10 points) $X_1,\cdots,X_6$ are i.i.d. $U(0,1)$ random variables. Find $\mathrm{Var}\left(2X_{(2)}+3X_{(3)}\right)$.
Solution:
Compute directly:
$$\mathrm{Var}\left(2X_{(2)}+3X_{(3)}\right)=4\mathrm{Var}\left(X_{(2)}\right)+9\mathrm{Var}\left(X_{(3)}\right)+12\mathrm{Cov}\left(X_{(2)},X_{(3)}\right).$$
The marginal distributions are $X_{(2)}\sim \mathrm{Beta}(2,5)$ and $X_{(3)}\sim \mathrm{Beta}(3,4)$, so the two variance terms follow at once:
$$4\mathrm{Var}\left(X_{(2)}\right)=\frac{4\cdot 10}{7^2\cdot 8}=\frac{10}{98},\qquad 9\mathrm{Var}\left(X_{(3)}\right)=\frac{9\cdot 12}{7^2\cdot 8}=\frac{27}{98}.$$
The covariance term can be obtained from the formula $\mathrm{Cov}\left(X_{(i)},X_{(j)}\right)=\frac{i(n+1-j)}{(n+1)^2(n+2)}$ (Problem 7 of the 2019 Fudan applied statistics exam), or by writing out the joint density
$$g_{2,3}(x,y)=\frac{6!}{1!\,0!\,3!}\,x(1-y)^3=\frac{6!}{3!}\,x(1-y)^3,\qquad 0<x<y<1,$$
and computing the mixed moment
$$\begin{aligned} E\left(X_{(2)}X_{(3)}\right)&=\frac{6!}{3!}\int_0^1\int_0^y x^2y(1-y)^3\,dx\,dy\\ &=\frac{6!}{3!\cdot 3}\int_0^1 y^4(1-y)^3\,dy\\ &=\frac{6!\,4!\,3!}{3!\cdot 3\cdot 8!}=\frac{8}{56}. \end{aligned}$$
Hence the covariance is
$$\mathrm{Cov}\left(X_{(2)},X_{(3)}\right)=\frac{8}{56}-\frac{6}{49}=\frac{2}{98}.$$
Collecting all the terms,
$$\mathrm{Var}\left(2X_{(2)}+3X_{(3)}\right)=\frac{61}{98}.$$
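A simulation check of the value $\frac{61}{98}$ (an addition, assuming NumPy):

```python
import numpy as np

# Order statistics of 6 i.i.d. U(0,1) draws; check Var(2 X_(2) + 3 X_(3)).
rng = np.random.default_rng(4)
u = np.sort(rng.uniform(size=(10**6, 6)), axis=1)
stat = 2 * u[:, 1] + 3 * u[:, 2]               # X_(2) and X_(3) are columns 1 and 2
print(stat.var(), 61 / 98)                     # should agree to a few decimals
```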
9. (10 points) $X_1,\cdots,X_n$ is an i.i.d. random sample from $U(0,\theta)$, and $aX_{(1)}$, $bX_{(3)}$ are unbiased estimators of $\theta$. Find $a$ and $b$, and determine which estimator is more efficient.
Solution:
Here $n=3$, as the use of $X_{(3)}$ indicates. Since $\frac{X_{(1)}}{\theta}\sim \mathrm{Beta}(1,3)$, we have $EX_{(1)}=\frac{1}{4}\theta$ and $\mathrm{Var}\left(X_{(1)}\right)=\frac{3}{80}\theta^2$, so $a=4$ and $\mathrm{Var}\left(aX_{(1)}\right)=\frac{3}{5}\theta^2$. Similarly, $\frac{X_{(3)}}{\theta}\sim \mathrm{Beta}(3,1)$, so $EX_{(3)}=\frac{3}{4}\theta$ and $\mathrm{Var}\left(X_{(3)}\right)=\frac{3}{80}\theta^2$, which gives $b=\frac{4}{3}$ and $\mathrm{Var}\left(bX_{(3)}\right)=\frac{1}{15}\theta^2$. Hence $bX_{(3)}$ is the more efficient estimator.
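The comparison can be checked by simulation (an addition, assuming NumPy; $\theta=2$ is an illustrative choice):

```python
import numpy as np

# Compare the unbiased estimators 4*X_(1) and (4/3)*X_(3) for theta with n = 3.
rng = np.random.default_rng(5)
theta, reps = 2.0, 10**6
samples = np.sort(rng.uniform(0, theta, size=(reps, 3)), axis=1)
est1 = 4 * samples[:, 0]                       # 4 * X_(1)
est2 = (4 / 3) * samples[:, 2]                 # (4/3) * X_(3)
print(est1.mean(), est2.mean(), theta)         # both means near theta (unbiasedness)
print(est1.var(), 3 / 5 * theta**2)            # Var(4 X_(1)) = (3/5) theta^2
print(est2.var(), theta**2 / 15)               # Var((4/3) X_(3)) = theta^2 / 15
```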
10. (10 points) Let $X_1,\cdots,X_n$ be an i.i.d. random sample from $N(\mu,16)$, and let the prior distribution of $\mu$ be $N(a,b^2)$. Find the posterior distribution.
Solution:
Work with the sufficient statistic $\bar{X}$; given $\mu$, $\bar{X}\mid\mu\sim N\left(\mu,\frac{16}{n}\right)$. The joint density of $(\bar{X},\mu)$ is
$$\begin{aligned} p(\bar{x},\mu)&=p(\bar{x}\mid\mu)\,\pi(\mu)=C\cdot e^{-\frac{(\bar{x}-\mu)^2}{2\cdot\frac{16}{n}}}\cdot e^{-\frac{(\mu-a)^2}{2b^2}}\\ &=C\,e^{-\frac{b^2(\bar{x}-\mu)^2+\frac{16}{n}(\mu-a)^2}{2\cdot\frac{16}{n}b^2}}\\ &=C\,e^{-\frac{\left(\frac{16}{n}+b^2\right)\mu^2-2\left(b^2\bar{x}+\frac{16}{n}a\right)\mu+\left(b^2\bar{x}^2+\frac{16}{n}a^2\right)}{2\cdot\frac{16}{n}b^2}}\\ &=C_1(\bar{x})\,e^{-\frac{\left(\mu-\frac{b^2\bar{x}+\frac{16}{n}a}{b^2+\frac{16}{n}}\right)^2}{2\cdot\frac{\frac{16}{n}b^2}{\frac{16}{n}+b^2}}}, \end{aligned}$$
in which we recognize a normal kernel in $\mu$, so the posterior distribution is
$$\mu\mid\bar{X}\sim N\left(\frac{b^2\bar{X}+\frac{16}{n}a}{b^2+\frac{16}{n}},\ \frac{\frac{16}{n}b^2}{\frac{16}{n}+b^2}\right)=N\left(\frac{\frac{n}{16}\bar{X}+\frac{1}{b^2}a}{\frac{n}{16}+\frac{1}{b^2}},\ \frac{1}{\frac{n}{16}+\frac{1}{b^2}}\right).$$
The posterior mean is a precision-weighted average of the sample information and the prior information.
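The conjugate update is straightforward to code; the sketch below (an addition, assuming NumPy; the data and prior values are illustrative, not taken from the problem) returns the posterior mean and variance:

```python
import numpy as np

def normal_posterior(xs, a, b2, sigma2=16.0):
    """Posterior N(mean, var) of mu for N(mu, sigma2) data with prior N(a, b2)."""
    n, xbar = len(xs), np.mean(xs)
    precision = n / sigma2 + 1 / b2            # posterior precision n/16 + 1/b^2
    mean = (n / sigma2 * xbar + a / b2) / precision
    return mean, 1 / precision

# Illustrative numbers only:
rng = np.random.default_rng(6)
data = rng.normal(3.0, 4.0, size=25)           # true mu = 3, variance 16
print(normal_posterior(data, a=0.0, b2=9.0))
```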
11. (10 points) Let $X_1,\cdots,X_n$ be an i.i.d. random sample from $U(0,\theta)$. Consider the hypothesis testing problem
$$H_0:\theta\le 1 \quad \text{vs} \quad H_1:\theta>1$$
with rejection region $W=\{X_{(n)}\ge c\}$. Answer the following:
(1) (5 points) For $\alpha=0.05$, find $c$;
(2) (5 points) When $\theta=1.5$, find the minimum sample size needed so that the probability of a type II error satisfies $\beta\le 0.1$.
Solution:
(1) To make the significance level equal to $0.05$, we need
$$0.05=\sup_{\theta\le 1}P_{\theta}\left(X_{(n)}\ge c\right)=P_{\theta=1}\left(X_{(n)}\ge c\right)=1-c^n,$$
which gives $c=0.95^{\frac{1}{n}}$.
(2) The probability of a type II error is
$$\beta(1.5)=P_{\theta=1.5}\left(X_{(n)}<0.95^{\frac{1}{n}}\right)=\left(\frac{0.95^{\frac{1}{n}}}{1.5}\right)^n=\frac{0.95}{1.5^n}.$$
Requiring this to be at most $0.1$ gives
$$\frac{0.95}{1.5^n}\le 0.1 \;\Longrightarrow\; n\ge\frac{\ln 9.5}{\ln 1.5}\approx 5.55 \;\Longrightarrow\; n\ge 6.$$
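A simulation check of both the size and the type II error at $n=6$ (an addition, assuming NumPy):

```python
import numpy as np

# Test: reject H0 if X_(n) >= c with c = 0.95 ** (1/n), here n = 6.
rng = np.random.default_rng(7)
n, reps = 6, 10**6
c = 0.95 ** (1 / n)

max_under_null = rng.uniform(0, 1.0, size=(reps, n)).max(axis=1)   # theta = 1
max_under_alt = rng.uniform(0, 1.5, size=(reps, n)).max(axis=1)    # theta = 1.5
print((max_under_null >= c).mean(), 0.05)          # empirical size vs nominal level
print((max_under_alt < c).mean(), 0.95 / 1.5**n)   # type II error, about 0.083 <= 0.1
```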