Bayes Estimation for the Variance of a Normal Distribution
7. Bayesian Estimation - General Linear Model
8. Sufficient Statistic
9. Fisher-Neyman Factorization Theorem
10. One-to-One Functions of Sufficient Statistics
11. Sufficient Statistics - Examples
12. Jointly Sufficient Statistics - Examples
13. Distribution of a Sufficient Statistic from a One-Parameter Exponential Family
14. Minimal Sufficient Statistics
15. Minimal Sufficient Statistic and Maximum Likelihood Estimation
16. Ancillary Statistic
17. Ancillary Statistic: Example
18. Complete Statistics
19. Basu's Theorem
20. Basu's Theorem: Examples
21. Unbiased Estimate and Mean Squared Error
22. Unbiased Estimates for the Population Standard Deviation Using the Sample Mean Absolute Deviation and the Sample Standard Deviation
23. Normal Unbiased Estimator Implies the Mean Absolute and Root Mean Squared Losses Are Proportional
24. Rao-Blackwell Theorem
25. Lehmann-Scheffé Theorem
26. Fisher's Information: Examples
27. Fisher's Information: Cauchy Distribution
28. Cramér-Rao Lower Bound / Inequality
29. Exponential Family: Cramér-Rao Lower Bound
Description:
Explore a comprehensive playlist on parameter estimation techniques, covering empirical substitution, maximum likelihood estimation, Bayesian methods, sufficient statistics, and optimality properties. Learn about the Fisher-Neyman factorization theorem, ancillary statistics, complete statistics, unbiased estimates, and the Cramér-Rao lower bound. Dive into specific examples and theorems, including Basu's theorem, the Rao-Blackwell theorem, and the Lehmann-Scheffé theorem, to gain a thorough understanding of point estimation in statistics.
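
As a concrete anchor for the final topics in the list, here is a minimal Python sketch, not taken from the playlist itself, that checks numerically that the sample mean of a normal sample attains the Cramér-Rao lower bound; the parameter values (mu = 2, sigma = 3, n = 50) are arbitrary choices for the demonstration.

```python
# Hypothetical illustration (assumed setup, not playlist material): for X_1, ..., X_n
# i.i.d. N(mu, sigma^2) with sigma known, the per-observation Fisher information is
# I(mu) = 1 / sigma^2, so the Cramér-Rao lower bound on the variance of any unbiased
# estimator of mu is 1 / (n * I(mu)) = sigma^2 / n. The sample mean is unbiased and
# attains this bound; the Monte Carlo check below compares the two numbers.
import numpy as np

mu, sigma, n, reps = 2.0, 3.0, 50, 200_000
rng = np.random.default_rng(0)

# Sample mean of each of `reps` independent samples of size n
xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)

crlb = sigma**2 / n  # Cramér-Rao lower bound for unbiased estimators of mu
print(f"Monte Carlo Var(sample mean) ~ {xbar.var():.5f}")
print(f"Cramér-Rao lower bound       = {crlb:.5f}")
# The two values agree up to Monte Carlo error, i.e. the sample mean is efficient here.
```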