
[The Nine Yang Manual] 2018 Fudan University Applied Statistics Past Exam Paper with Solutions

2022-07-06 13:30:00 Elder martial brother statistics

Part I: Problems

1. (20 points) Three numbers are drawn without replacement from 1–10. Find the probability that:
(1) (5 points) the minimum is 5;
(2) (5 points) the maximum is 5;
(3) (5 points) at least one number is less than 6;
(4) (5 points) one number is less than 5, one equals 5, and one is greater than 5.


2. (15 points) An experiment with success probability $p$ is repeated independently, stopping as soon as either two consecutive successes or two consecutive failures occur. Find the probability that the process stops with two consecutive successes.


3. (15 points) Find the expectation and variance of the binomial distribution, the uniform distribution on $(a,b)$, and the gamma distribution.


4. (20 points) Prove that $E(X^2)<\infty$ if and only if the series $\sum_n nP(|X|>n)$ converges.


5. (20 points) $X_1, X_2, X_3$ is a random sample from an exponential distribution with expectation $\alpha$. Find the probability $P(X_1<X_2<X_3)$ and the probability density of $X_{(1)}$.


6. (20 points) $P(X_i=-0.3)=P(X_i=0.4)=\frac{1}{2}$, $i=1,2,\ldots,n$, with the $X_i$ mutually independent. Define the sequence of random variables $Y_n=\prod_{i=1}^{n}(X_i+1)$. Find the limit of $Y_n$ and prove that the expectation of $Y_n$ tends to infinity.


7. (20 points) A pile of balls contains 2 red, 3 black, and 4 white. A ball is drawn at random; if it is black, you win immediately. Otherwise the ball is returned and drawing continues with replacement until either the color drawn first or black appears again: if the first color appears first you win, otherwise you lose. Find the probability of winning.


8. (20 points)
(1) (10 points) Explain what a consistent estimator is;
(2) (10 points) $X_1,\ldots,X_n$ is a sample from a common population. Write down a consistent estimator of the median and explain why it is consistent.


Part II: Solutions

1. (20 points) Three numbers are drawn without replacement from 1–10. Find the probability that:
(1) (5 points) the minimum is 5;
(2) (5 points) the maximum is 5;
(3) (5 points) at least one number is less than 6;
(4) (5 points) one number is less than 5, one equals 5, and one is greater than 5.

Solution:

(1) $\#\Omega=C_{10}^{3}=120$ and $\#A_1=1\cdot C_5^2=10$, so $P(A_1)=\frac{\#A_1}{\#\Omega}=\frac{1}{12}$.

(2) $\#A_2=1\cdot C_4^2=6$, so $P(A_2)=\frac{\#A_2}{\#\Omega}=\frac{1}{20}$.

(3) $\#\overline{A_3}=C_5^3=10$, so $P(A_3)=1-\frac{\#\overline{A_3}}{\#\Omega}=\frac{11}{12}$.

(4) $\#A_4=4\cdot 1\cdot 5=20$, so $P(A_4)=\frac{\#A_4}{\#\Omega}=\frac{1}{6}$.
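All four counts can be sanity-checked by exhaustively enumerating the $C_{10}^3=120$ unordered draws (a quick sketch, using exact rational arithmetic):

```python
from fractions import Fraction
from itertools import combinations

draws = list(combinations(range(1, 11), 3))  # all 120 unordered draws
total = len(draws)

def prob(event):
    """Exact probability of an event over the uniform sample space of draws."""
    return Fraction(sum(1 for c in draws if event(c)), total)

p1 = prob(lambda c: min(c) == 5)
p2 = prob(lambda c: max(c) == 5)
p3 = prob(lambda c: any(x < 6 for x in c))
p4 = prob(lambda c: c[0] < 5 and c[1] == 5 and c[2] > 5)  # combinations are sorted

print(p1, p2, p3, p4)  # 1/12 1/20 11/12 1/6
```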

2. (15 points) An experiment with success probability $p$ is repeated independently, stopping as soon as either two consecutive successes or two consecutive failures occur. Find the probability that the process stops with two consecutive successes.

Solution:

Let $A$ be the event "the process stops with two consecutive successes", and set $p_0=P(A)$, $p_1=P(A\mid\text{the last trial was a success})$, $p_{-1}=P(A\mid\text{the last trial was a failure})$. Conditioning on the first trial (for $p_0$) and on the next trial (for $p_1$, $p_{-1}$), the law of total probability gives
$$\left\{\begin{array}{l} p_0=p_1\cdot p+p_{-1}\cdot(1-p) \\ p_1=p+p_{-1}\cdot(1-p) \\ p_{-1}=p_1\cdot p \end{array}\right.$$
Solving the system yields $p_0=\dfrac{p^2(2-p)}{1-p(1-p)}$.
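The closed form can be checked by iterating the two-state chain forward and accumulating the probability of stopping on a success pair (a minimal sketch; the function name is ours):

```python
def stop_with_two_successes(p, steps=200):
    """Forward-iterate the chain with states 'last trial was S' / 'last trial was F',
    absorbing on SS (win) or FF (loss); returns the accumulated win probability."""
    s, f = p, 1.0 - p                 # state distribution after the first trial
    win = 0.0
    for _ in range(steps):
        win += s * p                  # success followed by success: stop, win
        s, f = f * p, s * (1.0 - p)   # surviving mass moves between states
    return win

p = 0.6
formula = p**2 * (2 - p) / (1 - p * (1 - p))
print(abs(stop_with_two_successes(p) - formula) < 1e-12)  # True
```

The surviving mass shrinks geometrically, so 200 steps are far more than enough for 12-digit agreement.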

3. (15 points) Find the expectation and variance of the binomial distribution, the uniform distribution on $(a,b)$, and the gamma distribution.

Solution:

(1) Binomial distribution: $X\sim B(n,p)$, $P(X=k)=C_n^k p^k(1-p)^{n-k}$, $k=0,1,\ldots,n$. Then
$$EX=\sum_{k=0}^{n}k\frac{n!}{k!(n-k)!}p^k(1-p)^{n-k}=np\sum_{k=1}^{n}\frac{(n-1)!}{(k-1)!(n-k)!}p^{k-1}(1-p)^{n-k}=np,$$
$$EX(X-1)=\sum_{k=0}^{n}k(k-1)C_n^k p^k(1-p)^{n-k}=n(n-1)p^2\sum_{k=2}^{n}C_{n-2}^{k-2}p^{k-2}(1-p)^{n-k}=n(n-1)p^2,$$
so $EX^2=n(n-1)p^2+np$ and $DX=EX^2-(EX)^2=n(n-1)p^2+np-n^2p^2=np(1-p)$.

(2) Uniform distribution: $X\sim U(a,b)$, $f(x)=\frac{1}{b-a}$, $a<x<b$. Then
$$EX=\int_a^b\frac{x}{b-a}\,dx=\frac{a+b}{2},\qquad DX=\int_a^b\left(x-\frac{a+b}{2}\right)^2\frac{1}{b-a}\,dx=\frac{(b-a)^2}{12}.$$

(3) Gamma distribution: $X\sim Ga(\alpha,\lambda)$, $f(x)=\frac{\lambda^\alpha}{\Gamma(\alpha)}x^{\alpha-1}e^{-\lambda x}$, $x>0$. Then
$$EX=\int_0^{+\infty}\frac{\lambda^\alpha}{\Gamma(\alpha)}x^\alpha e^{-\lambda x}\,dx=\frac{1}{\lambda\Gamma(\alpha)}\int_0^{+\infty}(\lambda x)^\alpha e^{-\lambda x}\,d(\lambda x)=\frac{\Gamma(\alpha+1)}{\lambda\Gamma(\alpha)}=\frac{\alpha}{\lambda},$$
$$EX^2=\int_0^{+\infty}\frac{\lambda^\alpha}{\Gamma(\alpha)}x^{\alpha+1}e^{-\lambda x}\,dx=\frac{1}{\lambda^2\Gamma(\alpha)}\int_0^{+\infty}(\lambda x)^{\alpha+1}e^{-\lambda x}\,d(\lambda x)=\frac{\Gamma(\alpha+2)}{\lambda^2\Gamma(\alpha)}=\frac{\alpha(\alpha+1)}{\lambda^2},$$
so $DX=EX^2-(EX)^2=\frac{\alpha(\alpha+1)}{\lambda^2}-\frac{\alpha^2}{\lambda^2}=\frac{\alpha}{\lambda^2}$.
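For the discrete case, the binomial formulas $EX=np$ and $DX=np(1-p)$ can be confirmed exactly by summing over the pmf with rational arithmetic (a small sketch; the helper name is ours):

```python
from fractions import Fraction
from math import comb

def binom_moments(n, p):
    """Exact mean and variance of B(n, p) computed straight from the pmf."""
    pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]
    mean = sum(k * w for k, w in enumerate(pmf))
    var = sum(k**2 * w for k, w in enumerate(pmf)) - mean**2
    return mean, var

n, p = 10, Fraction(3, 10)
mean, var = binom_moments(n, p)
print(mean == n * p, var == n * p * (1 - p))  # True True
```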

4. (20 points) Prove that $E(X^2)<\infty$ if and only if the series $\sum_n nP(|X|>n)$ converges.

Solution:
(1) First we show that $E|X|<+\infty$ if and only if the series $\sum_{n=1}^{\infty}P(|X|>n)$ converges. Since
$$\begin{aligned}\sum_{n=1}^{\infty}P(|X|>n)&=\sum_{n=1}^{\infty}\sum_{k=n}^{\infty}P(k<|X|\le k+1)\\&=\sum_{k=1}^{\infty}\sum_{n=1}^{k}P(k<|X|\le k+1)\\&=\sum_{k=1}^{\infty}kP(k<|X|\le k+1),\end{aligned}$$
and, by the comparison test for series of nonnegative terms, this last series converges or diverges together with $\sum_{k=1}^{\infty}(k+1)P(k<|X|\le k+1)$. Now consider $E|X|=\int_0^{+\infty}x\,dF_{|X|}(x)$. On the one hand,
$$\begin{aligned}\int_0^{+\infty}x\,dF_{|X|}(x)&=\sum_{k=0}^{\infty}\int_k^{k+1}x\,dF_{|X|}(x)\\&\le\sum_{k=0}^{\infty}\int_k^{k+1}(k+1)\,dF_{|X|}(x)\\&=\sum_{k=0}^{\infty}(k+1)P(k<|X|\le k+1);\end{aligned}$$
on the other hand,
$$\begin{aligned}\int_0^{+\infty}x\,dF_{|X|}(x)&=\sum_{k=0}^{\infty}\int_k^{k+1}x\,dF_{|X|}(x)\\&\ge\sum_{k=0}^{\infty}\int_k^{k+1}k\,dF_{|X|}(x)\\&=\sum_{k=0}^{\infty}kP(k<|X|\le k+1).\end{aligned}$$
Combining the two bounds, $E|X|<+\infty$ if and only if the series $\sum_{n=1}^{\infty}P(|X|>n)$ converges.

(2) Now we show that $EX^2<+\infty$ if and only if the series $\sum_{n=1}^{\infty}nP(|X|>n)$ converges. Since
$$\begin{aligned}\sum_{n=1}^{\infty}nP(|X|>n)&=\sum_{n=1}^{\infty}\sum_{k=n}^{\infty}nP(k<|X|\le k+1)\\&=\sum_{k=1}^{\infty}\sum_{n=1}^{k}nP(k<|X|\le k+1)\\&=\sum_{k=1}^{\infty}\frac{k(k+1)}{2}P(k<|X|\le k+1),\end{aligned}$$
and, by the comparison test for series of nonnegative terms, this series converges or diverges together with $\sum_{n=1}^{\infty}n^2P(n<|X|\le n+1)$, and likewise with $\sum_{n=1}^{\infty}(n+1)^2P(n<|X|\le n+1)$. Using the definition of the second moment, $EX^2=\int_0^{+\infty}x^2\,dF_{|X|}(x)$. On the one hand,
$$\begin{aligned}\int_0^{+\infty}x^2\,dF_{|X|}(x)&=\sum_{n=0}^{\infty}\int_n^{n+1}x^2\,dF_{|X|}(x)\\&\le\sum_{n=0}^{\infty}\int_n^{n+1}(n+1)^2\,dF_{|X|}(x)\\&=\sum_{n=0}^{\infty}(n+1)^2P(n<|X|\le n+1);\end{aligned}$$
on the other hand,
$$\begin{aligned}\int_0^{+\infty}x^2\,dF_{|X|}(x)&=\sum_{n=0}^{\infty}\int_n^{n+1}x^2\,dF_{|X|}(x)\\&\ge\sum_{n=0}^{\infty}\int_n^{n+1}n^2\,dF_{|X|}(x)\\&=\sum_{n=0}^{\infty}n^2P(n<|X|\le n+1).\end{aligned}$$
Combining the two bounds, $EX^2<+\infty$ if and only if the series $\sum_{n=1}^{\infty}nP(|X|>n)$ converges.
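As a numerical illustration (not part of the proof), take a Pareto-type tail $P(|X|>x)=x^{-a}$ for $x\ge 1$: here $EX^2<\infty$ exactly when $a>2$, and the partial sums of $\sum_n nP(|X|>n)$ behave accordingly:

```python
def partial_sums(a, N):
    """Partial sums of sum_n n * P(|X| > n) for the tail P(|X| > x) = x**(-a)."""
    s, out = 0.0, []
    for n in range(1, N + 1):
        s += n * n**(-a)
        out.append(s)
    return out

heavy = partial_sums(1.5, 10_000)  # E X^2 = infinity: the sums keep growing
light = partial_sums(3.0, 10_000)  # E X^2 < infinity: the sums stay bounded
print(heavy[-1] > 100, light[-1] < 1.65)  # True True
```

For $a=3$ the sum is $\sum n^{-2}$, which converges to $\pi^2/6$; for $a=1.5$ it is $\sum n^{-1/2}$, which grows like $2\sqrt{N}$.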

5. (20 points) $X_1, X_2, X_3$ is a random sample from an exponential distribution with expectation $\alpha$. Find the probability $P(X_1<X_2<X_3)$ and the probability density of $X_{(1)}$.

Solution:
By the symmetry of the sample (each of the $3!$ orderings is equally likely), $P(X_1<X_2<X_3)=\frac{1}{6}$. Let $Y=X_{(1)}$. For $y>0$,
$$1-F(y)=P\{Y>y\}=P^3\{X_1>y\}=e^{-\frac{3}{\alpha}y},$$
so $f(y)=\frac{3}{\alpha}e^{-\frac{3}{\alpha}y}$ for $y>0$. This is exactly the $\operatorname{Exp}\left(\frac{3}{\alpha}\right)$ distribution (rate $\frac{3}{\alpha}$, i.e. expectation $\frac{\alpha}{3}$).
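Both conclusions can be checked by a seeded Monte Carlo sketch (the parameter value $\alpha=2$ is just an arbitrary choice for illustration):

```python
import random

random.seed(42)
alpha = 2.0                 # population expectation, so the rate is 1/alpha
n = 200_000
ordered = 0
min_sum = 0.0
for _ in range(n):
    x1, x2, x3 = (random.expovariate(1 / alpha) for _ in range(3))
    ordered += (x1 < x2 < x3)
    min_sum += min(x1, x2, x3)

print(abs(ordered / n - 1 / 6) < 0.01)    # True: P(X1 < X2 < X3) ≈ 1/6
print(abs(min_sum / n - alpha / 3) < 0.02)  # True: E[X_(1)] = alpha/3
```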

6. (20 points) $P(X_i=-0.3)=P(X_i=0.4)=\frac{1}{2}$, $i=1,2,\ldots,n$, with the $X_i$ mutually independent. Define the sequence of random variables $Y_n=\prod_{i=1}^{n}(X_i+1)$. Find the limit of $Y_n$ and prove that the expectation of $Y_n$ tends to infinity.

Solution:
By the strong law of large numbers,
$$\frac{1}{n}\ln Y_n=\frac{1}{n}\sum_{i=1}^{n}\ln(X_i+1)\xrightarrow{\text{a.s.}}E\ln(X_1+1)=\frac{1}{2}(\ln 0.7+\ln 1.4)=\frac{1}{2}\ln 0.98<0,$$
so $\ln Y_n\xrightarrow{\text{a.s.}}-\infty$ and $Y_n\xrightarrow{\text{a.s.}}0$; the limit of $Y_n$ is therefore the degenerate distribution at $0$, i.e. it takes the value $0$ with probability $1$. Meanwhile, by independence,
$$EY_n=\prod_{i=1}^{n}E(X_i+1)=\prod_{i=1}^{n}\frac{0.7+1.4}{2}=1.05^n\to+\infty.$$
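The contrast between the almost-sure limit $0$ and the exploding mean $1.05^n$ can be seen in a seeded simulation (path count and horizon are arbitrary choices):

```python
import math
import random

random.seed(7)
n, paths = 2000, 1000
finals = []
for _ in range(paths):
    # ln Y_n as a sum of iid terms ln(0.7) or ln(1.4), each with probability 1/2
    log_y = sum(math.log(1 + random.choice((-0.3, 0.4))) for _ in range(n))
    finals.append(math.exp(log_y))

finals.sort()
median_y = finals[paths // 2]
print(median_y < 1e-3)  # True: a typical path of Y_n has collapsed toward 0
print(1.05 ** n)        # yet E Y_n = 1.05^n is astronomically large
```

The mean is dominated by a vanishingly rare set of huge paths, which is exactly why $Y_n\to 0$ a.s. while $EY_n\to\infty$.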

7. (20 points) A pile of balls contains 2 red, 3 black, and 4 white. A ball is drawn at random; if it is black, you win immediately. Otherwise the ball is returned and drawing continues with replacement until either the color drawn first or black appears again: if the first color appears first you win, otherwise you lose. Find the probability of winning.

Solution:
Let $A_k$ be the event of winning at the $k$-th draw; the required probability is $P(A)=P\left(\bigcup_{k=1}^{\infty}A_k\right)=\sum_{k=1}^{\infty}P(A_k)$.
(i) The probability of winning on the first draw is $P(A_1)=\frac{3}{9}=\frac{1}{3}$.
(ii) Winning at the $k$-th draw ($k>1$) means: the first draw was not black; draws $2,3,\ldots,k-1$ were neither black nor the color of the first draw; and draw $k$ matched the first color. This probability depends on the first color, so apply the law of total probability, $P(A_k)=P(R)P(A_k\mid R)+P(W)P(A_k\mid W)$, where $R$ denotes drawing red first and $W$ drawing white first. We have
$$P(A_k\mid R)=\left(\frac{4}{9}\right)^{k-2}\frac{2}{9},\qquad P(A_k\mid W)=\left(\frac{2}{9}\right)^{k-2}\frac{4}{9},$$
hence
$$P(A_k)=\frac{2}{9}\cdot\frac{2}{9}\left(\frac{4}{9}\right)^{k-2}+\frac{4}{9}\cdot\frac{4}{9}\left(\frac{2}{9}\right)^{k-2}=\frac{4}{81}\left(\frac{4}{9}\right)^{k-2}+\frac{16}{81}\left(\frac{2}{9}\right)^{k-2}.$$
Summing the geometric series,
$$P(A)=P(A_1)+\frac{4}{81\left(1-\frac{4}{9}\right)}+\frac{16}{81\left(1-\frac{2}{9}\right)}=\frac{1}{3}+\frac{4}{45}+\frac{16}{63}=\frac{71}{105}.$$
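The geometric sums above can be reproduced exactly with rational arithmetic (the variable names are ours):

```python
from fractions import Fraction

F = Fraction
p_black, p_red, p_white = F(3, 9), F(2, 9), F(4, 9)

# sum_{k>=2} miss^(k-2) * hit  =  hit / (1 - miss)
win_after_red = p_red / (1 - p_white)    # draws 2..k-1 white, draw k red
win_after_white = p_white / (1 - p_red)  # draws 2..k-1 red, draw k white

p_win = p_black + p_red * win_after_red + p_white * win_after_white
print(p_win)  # 71/105
```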

8. (20 points)
(1) (10 points) Explain what a consistent estimator is;
(2) (10 points) $X_1,\ldots,X_n$ is a sample from a common population. Write down a consistent estimator of the median and explain why it is consistent.

Solution:
(1) $\hat{g}$ is a consistent estimator of $g$ if $\hat{g}\xrightarrow{p}g$. This says $\hat{g}$ is a good estimator in the sense that, at least when the sample size $n$ is large, the probability that it deviates far from $g$ is very small. Further, $\hat{g}$ is a strongly consistent estimator of $g$ if $\hat{g}\xrightarrow{\text{a.s.}}g$.

(2) The sample median $X_{[\frac{n}{2}]}$ is a consistent estimator of the population median $x_{0.5}$. Let the population density be $f(x)$. The sample median is asymptotically normal,
$$X_{[\frac{n}{2}]}\sim N\left(x_{0.5},\frac{1}{4nf^2(x_{0.5})}\right),$$
from which, for any $\varepsilon>0$,
$$P\left(\left|X_{[\frac{n}{2}]}-x_{0.5}\right|<\varepsilon\right)=P\left(2\sqrt{n}f(x_{0.5})\left|X_{[\frac{n}{2}]}-x_{0.5}\right|<2\sqrt{n}f(x_{0.5})\varepsilon\right)\sim\Phi\left(2\sqrt{n}f(x_{0.5})\varepsilon\right)-\Phi\left(-2\sqrt{n}f(x_{0.5})\varepsilon\right)\to 1.$$
This shows that the sample median $X_{[\frac{n}{2}]}$ is a consistent estimator of the population median $x_{0.5}$.
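The consistency can be observed empirically: for an Exp(1) population, whose median is $\ln 2$, the sample median's error shrinks as $n$ grows (a seeded sketch; the population choice and helper name are ours):

```python
import math
import random
import statistics

random.seed(1)
true_median = math.log(2)  # median of Exp(1)

def median_error(n, reps=200):
    """Average absolute error of the sample median over `reps` samples of size n."""
    err = 0.0
    for _ in range(reps):
        sample = [random.expovariate(1.0) for _ in range(n)]
        err += abs(statistics.median(sample) - true_median)
    return err / reps

errors = [median_error(n) for n in (50, 500, 5000)]
print(errors[0] > errors[-1])  # True: the error shrinks as n grows
```

The observed decay matches the asymptotic standard deviation $1/(2\sqrt{n}f(x_{0.5}))$, which for Exp(1) is $1/\sqrt{n}$.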


Copyright notice
This article was written by [Elder martial brother statistics]; please include the original link when reposting.
https://yzsam.com/2022/187/202207060916368571.html