Content
Lecture 1 Intro
- LLN (Law of large numbers)
- CLT (Central Limit Theorem)
- CMT (Continuous Mapping Theorem)
- ST (Slutsky’s Theorem)
- Delta Method
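A minimal simulation sketch of how the CLT and the Delta Method fit together (the exponential example, the function g, and all names below are my own illustrative choices, not from the lecture): if $\sqrt{n}(\bar{X}_n - \mu) \to N(0, \sigma^2)$, then for smooth $g$, $\sqrt{n}(g(\bar{X}_n) - g(\mu)) \to N(0, g'(\mu)^2 \sigma^2)$.

```python
import numpy as np

# Delta Method check by simulation (illustrative: X ~ Exponential(rate=2),
# so mu = 1/2, sigma^2 = 1/4, and g(x) = x**2 with g'(mu) = 2*mu = 1).
rng = np.random.default_rng(0)
rate, n, reps = 2.0, 1_000, 5_000
mu, sigma2 = 1 / rate, 1 / rate**2

xbar = rng.exponential(scale=1 / rate, size=(reps, n)).mean(axis=1)

# CLT: sqrt(n) * (xbar - mu) should have variance close to sigma2
clt_var = np.var(np.sqrt(n) * (xbar - mu))

# Delta Method: sqrt(n) * (g(xbar) - g(mu)) should have variance close to g'(mu)^2 * sigma2
g = lambda x: x**2
gprime_mu = 2 * mu
delta_var = np.var(np.sqrt(n) * (g(xbar) - g(mu)))

print(f"CLT   variance: simulated {clt_var:.4f} vs theoretical {sigma2:.4f}")
print(f"Delta variance: simulated {delta_var:.4f} vs theoretical {gprime_mu**2 * sigma2:.4f}")
```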
Lecture 2 Estimation
Estimation:
- unbiased
- consistent
- Accuracy of an estimator: $MSE = Var(\hat{\theta}) + [bias(\hat{\theta}, \theta)]^2 = Var(\hat{\theta}) + [E(\hat{\theta}) - \theta]^2$ (a simulation check follows this list)
- Relative efficiency
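A quick numerical check of the MSE decomposition above (the biased variance estimator and all parameter values are my own illustrative choices, not from the lecture): the direct Monte Carlo MSE should match $Var(\hat{\theta}) + bias^2$.

```python
import numpy as np

# Illustrative check of MSE = Var(theta_hat) + bias^2, using the biased estimator
# theta_hat = (1/n) * sum (X_i - xbar)^2 of the variance of a normal sample.
rng = np.random.default_rng(1)
mu, sigma2, n, reps = 0.0, 4.0, 20, 100_000

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
theta_hat = samples.var(axis=1, ddof=0)        # divides by n, so E[theta_hat] = (n-1)/n * sigma2

mse  = np.mean((theta_hat - sigma2) ** 2)      # direct Monte Carlo MSE
var  = np.var(theta_hat)                       # Var(theta_hat)
bias = np.mean(theta_hat) - sigma2             # E[theta_hat] - theta

print(f"MSE directly:              {mse:.4f}")
print(f"Var + bias^2:              {var + bias**2:.4f}")
print(f"bias vs theory (-sigma2/n): {bias:.4f} vs {-sigma2 / n:.4f}")
```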
Two estimation methods
Method of Moments
- Theorem
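A minimal Method of Moments sketch (the Gamma example and parameter values are my own illustrative choices, not from the lecture): equate the first two sample moments to their population counterparts and solve for the parameters.

```python
import numpy as np

# Method of Moments for X ~ Gamma(shape=k, scale=s): E[X] = k*s and Var(X) = k*s^2,
# so matching the first two sample moments gives k_hat = xbar^2 / s2 and s_hat = s2 / xbar.
rng = np.random.default_rng(2)
k_true, s_true, n = 3.0, 2.0, 10_000

x = rng.gamma(shape=k_true, scale=s_true, size=n)
xbar = x.mean()
s2 = x.var()          # second central sample moment (divides by n)

k_hat = xbar**2 / s2
s_hat = s2 / xbar

print(f"shape: MoM {k_hat:.3f} vs true {k_true}")
print(f"scale: MoM {s_hat:.3f} vs true {s_true}")
```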
Maximum Likelihood Estimator
- Fisher information
- Theorem (consistent, asymptotically unbiased)
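A minimal MLE sketch (the exponential example and all values below are my own illustrative choices, not from the lecture): the log-likelihood $\ell(\lambda) = n\log\lambda - \lambda\sum x_i$ is maximized at $\hat{\lambda} = 1/\bar{x}$, and the Fisher information per observation is $I(\lambda) = 1/\lambda^2$, so the sampling variance of $\hat{\lambda}$ should be close to $1/(nI(\lambda))$.

```python
import numpy as np

# MLE for X_1..X_n iid Exponential(rate=lam): lam_hat = 1 / xbar,
# with Fisher information I(lam) = 1 / lam^2 per observation.
rng = np.random.default_rng(3)
lam_true, n, reps = 1.5, 500, 20_000

x = rng.exponential(scale=1 / lam_true, size=(reps, n))
lam_hat = 1 / x.mean(axis=1)                    # MLE computed for each replication

asymptotic_var = lam_true**2 / n                # 1 / (n * I(lam))
print(f"mean of lam_hat:     {lam_hat.mean():.4f}  (true {lam_true})")
print(f"variance of lam_hat: {lam_hat.var():.6f}")
print(f"asymptotic 1/(n*I):  {asymptotic_var:.6f}")
```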
Optimality in estimation
- Cramér-Rao lower bound
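A minimal Cramér-Rao check (the Poisson example is my own illustrative choice, not from the lecture): for $X \sim Poisson(\theta)$, $I(\theta) = 1/\theta$ per observation, so any unbiased estimator has $Var(\hat{\theta}) \ge 1/(nI(\theta)) = \theta/n$; the sample mean attains this bound.

```python
import numpy as np

# Empirical check that the sample mean of Poisson data attains the CRLB theta/n.
rng = np.random.default_rng(4)
theta, n, reps = 3.0, 200, 50_000

theta_hat = rng.poisson(lam=theta, size=(reps, n)).mean(axis=1)   # unbiased estimator of theta

crlb = theta / n
print(f"Var(theta_hat): {theta_hat.var():.6f}")
print(f"CRLB theta/n:   {crlb:.6f}")
```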