[Probability Theory_14] Week 14
Summary notes for Probability Theory 1, a first-semester graduate course (2021) in the Department of Statistics, Seoul National University.
reference: Lecture notes (Prof. Sangyeol Lee),
Probability: Theory and Examples, Rick Durrett, Version 5, January 11, 2019
$\textbf{Theorem [Lévy’s continuity theorem]}$
Let $\lbrace \mu_n : n \geq 1 \rbrace$ be a sequence of probability measures and let $\varphi_n(t) = \int e^{itx} d\mu_n(x)$.
If $\varphi_n (t) \rightarrow \varphi(t)$ for all $t$ as $n \rightarrow \infty$ and $\varphi$ is continuous at $t=0$, then $\varphi$ is a ch.f. and there exists a probability measure $\mu$ s.t. $\mu_n \xrightarrow{w} \mu$ and $\varphi(t) = \int e^{itx}d\mu(x)$.
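The theorem invites a numerical sanity check: the empirical ch.f. of the standardized Binomial$(n, 1/2)$ (a sequence $\mu_n$ converging weakly to $N(0,1)$) should approach $e^{-t^2/2}$ pointwise. A minimal sketch in Python/NumPy; the helper `empirical_chf` and all sample sizes are illustrative choices, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-3, 3, 13)

def empirical_chf(sample, t):
    # phi_hat(t) = (1/N) * sum_j exp(i t X_j), a Monte Carlo estimate of E[e^{itX}]
    return np.exp(1j * np.outer(t, sample)).mean(axis=1)

# mu_n: law of the standardized Binomial(n, 1/2), which converges weakly to N(0,1)
n, N = 400, 20000
s = rng.binomial(n, 0.5, size=N)
x = (s - n / 2) / np.sqrt(n / 4)

phi_n = empirical_chf(x, t)
phi_limit = np.exp(-t**2 / 2)          # ch.f. of N(0,1)
max_err = np.abs(phi_n - phi_limit).max()
```

With $n = 400$ the pointwise gap on $|t| \leq 3$ is dominated by Monte Carlo noise of order $N^{-1/2}$.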
$\textbf{Lemma}$
Let $\mu$ be a probability measure and $\varphi(t) = \int e^{itx} d\mu(x)$. Then for $u > 0$,
$$\mu\left(\lbrace x : \vert x \vert > 2/u \rbrace\right) \leq \frac{1}{u} \int_{-u}^{u} \left(1 - \varphi(t)\right) dt.$$
$\textbf{Theorem}$
Assume $X$ and $Y$ are independent, identically distributed r.v.s with mean 0 and variance 1. If $X+Y$ and $X-Y$ are independent, then $X$ and $Y$ are normal r.v.s.
Central Limit Theorem
Let $S_n = X_1 + \cdots + X_n$, where the $X_i$ are iid with mean $0$ and variance $\sigma^2 \in (0,\infty)$. Then $S_n / (\sigma \sqrt{n}) \xrightarrow{d} N(0,1)$.
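The classical iid statement can be checked by simulation. This sketch (sample sizes are illustrative) sums centered Uniform$(0,1)$ variables, for which $\sigma^2 = 1/12$, and compares a probability of $S_n/(\sigma\sqrt{n})$ with the standard normal value $\Phi(1) \approx 0.8413$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, N = 1000, 50000
# X_i iid Uniform(0,1) centered to mean 0; variance sigma^2 = 1/12
x = rng.random((N, n)) - 0.5
z = x.sum(axis=1) / np.sqrt(n / 12)    # S_n / (sigma * sqrt(n))

# compare P(Z <= 1) with Phi(1) ~ 0.8413
p_hat = (z <= 1.0).mean()
```

With $N = 50000$ replications the Monte Carlo standard error of `p_hat` is roughly $0.002$.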
$\textbf{Definition [Lindeberg’s condition]}$
Let $\lbrace X_{nk} : k=1, \cdots , r_n \rbrace$, $r_n \in \mathbb{Z}^+$, $r_n \nearrow \infty$, be a double array of r.v.s s.t.
- $X_{n1} , \cdots , X_{nr_n}$ are independent
- $EX_{nk} = 0 , \forall n, k$
- $\sigma_{nk}^2 := EX^2_{nk} < \infty$

Set $s_n^2 = \sigma_{n1}^2 + \cdots + \sigma_{nr_n}^2$. Then $\lbrace X_{nk} \rbrace$ satisfies Lindeberg’s condition if, for every $\varepsilon > 0$,
$$\lim_{n \rightarrow \infty} \frac{1}{s_n^2} \sum_{k=1}^{r_n} E\left[ X_{nk}^2 \mathbf{1}_{\lbrace \vert X_{nk} \vert > \varepsilon s_n \rbrace} \right] = 0.$$
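The condition can be computed concretely for a simple array. The sketch below (the function name `lindeberg_ratio` and all constants are illustrative) takes $X_{nk} = U_k$ iid Uniform$(-1,1)$ with $r_n = n$, so $\sigma_{nk}^2 = 1/3$, $s_n^2 = n/3$, and the indicator $\mathbf{1}_{\lbrace \vert X_{nk} \vert > \varepsilon s_n \rbrace}$ vanishes identically once $\varepsilon s_n > 1$:

```python
import numpy as np

rng = np.random.default_rng(2)

def lindeberg_ratio(n, eps=0.1, N=200000):
    # X_{nk} = U_k iid Uniform(-1,1), k = 1..n: sigma_{nk}^2 = 1/3, s_n^2 = n/3
    s_n = np.sqrt(n / 3)
    u = rng.uniform(-1, 1, size=N)
    # Monte Carlo estimate of E[X^2 1{|X| > eps*s_n}]; the r_n terms are identical,
    # so the Lindeberg sum equals n * E[...] / s_n^2
    tail = np.mean(u**2 * (np.abs(u) > eps * s_n))
    return n * tail / s_n**2

r_small = lindeberg_ratio(10)    # eps*s_n < 1: the indicator can fire
r_big = lindeberg_ratio(10000)   # eps*s_n ~ 5.77 > 1: the sum is exactly 0
```

Bounded summands with $s_n \rightarrow \infty$ always satisfy the condition, which this computation makes visible.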
$\textbf{Theorem}$
Let $S_n = X_{n1} + \cdots + X_{nr_n}$ and $s_n^2 = \sigma_{n1}^2 + \cdots + \sigma_{nr_n}^2$. If $\lbrace X_{nk} \rbrace$ satisfies Lindeberg’s condition, then $S_n / s_n \xrightarrow{d} N(0,1)$.
$\textbf{Definition [Lyapounov’s condition]}$
$\lbrace X_{nk} \rbrace$ satisfies Lyapounov’s condition, if for some $\delta > 0$
- $EX_{nk} = 0$
- $E\vert X_{nk}\vert^{2+\delta} < \infty$
- $\lim_{n \rightarrow \infty} \sum_{k=1}^{r_n} E\vert X_{nk} \vert ^{2+ \delta} / s_n^{2+\delta} = 0$
- Lyapounov’s condition $\Longrightarrow$ Lindeberg’s condition
- Lindeberg’s condition $\not\Longrightarrow$ Lyapounov’s condition
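The first implication can be seen in action for iid Uniform$(-1,1)$ with $\delta = 1$: $E\vert X \vert^3 = 1/4$ and $s_n^{3} = (n/3)^{3/2}$, so the Lyapounov ratio decays like $n^{-1/2}$. A small deterministic sketch (the function name is illustrative):

```python
import numpy as np

def lyapounov_ratio(n, delta=1.0):
    # X_{nk} iid Uniform(-1,1): with delta = 1, E|X|^{2+delta} = E|X|^3 = 1/4,
    # sigma^2 = 1/3, so s_n^{2+delta} = (n/3)^{3/2}
    e_abs3 = 0.25
    s_n = np.sqrt(n / 3)
    return n * e_abs3 / s_n**(2 + delta)

r1 = lyapounov_ratio(100)      # ~ 0.130
r2 = lyapounov_ratio(10000)    # ~ 0.013, a factor sqrt(100) smaller
```

The ratio shrinks by $\sqrt{100} = 10$ when $n$ grows by $100$, matching the $n^{-1/2}$ rate.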
$\textbf{Theorem [Feller’s theorem]}$
Lindeberg’s condition holds $\Longleftrightarrow$ the CLT holds ($S_n/s_n \xrightarrow{d} N(0,1)$) and $\dfrac{\max_{1\leq k \leq r_n} \sigma_{nk}^2 }{s_n^2} \rightarrow 0$
$\textbf{Theorem}$
For $n \geq 1 , X_{nm}, m=1, \cdots , n $ are indep r.v.s with $P(X_{nm}=1) = P_{nm} = 1 - P(X_{nm} = 0 )$. Assume
- $\sum^n_{m=1} P_{nm} \rightarrow \lambda \in (0,\infty)$
- $\max_{1 \leq m \leq n} P_{nm} \rightarrow 0$
Then $S_n = X_{n1} + \cdots + X_{nn} \xrightarrow{d} Poisson(\lambda)$
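This theorem covers the classical Poisson approximation of the binomial: taking $P_{nm} = \lambda/n$ for every $m$ gives $\sum_m P_{nm} = \lambda$ and $\max_m P_{nm} \rightarrow 0$. A simulation sketch (all constants are illustrative):

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(3)
lam, n, N = 3.0, 2000, 100000
# S_n = sum of n independent Bernoulli(lam/n), i.e. S_n ~ Binomial(n, lam/n);
# here P_{nm} = lam/n for all m, so sum_m P_{nm} = lam and max_m P_{nm} -> 0
s = rng.binomial(n, lam / n, size=N)

k = 3
p_hat = (s == k).mean()
p_pois = exp(-lam) * lam**k / factorial(k)   # Poisson(3) pmf at k = 3, ~ 0.2240
```

For $n = 2000$ the binomial and Poisson pmfs at $k = 3$ agree well within the Monte Carlo error of `p_hat`.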
$\textbf{Theorem}$
Let $X_{nm}, m=1, \cdots , n $ be indep r.v.s taking values $0,1,2, \cdots $ with $P(X_{nm}=1) = P_{nm}$, $P(X_{nm}\geq 2) = \varepsilon _{nm}$ s.t.
- $\sum^n_{m=1} P_{nm} \rightarrow \lambda \in (0,\infty)$
- $\max_{1 \leq m \leq n} P_{nm} \rightarrow 0$
- $\sum^n_{m=1} \varepsilon_{nm} \rightarrow 0$
Then $S_n = X_{n1} + \cdots + X_{nn} \xrightarrow{d} Poisson(\lambda)$