Unit 1: Sufficiency principle, factorization theorem; minimal sufficiency, minimal sufficient
partition, minimal sufficient statistics; minimal sufficient statistic for the exponential,
power series, curved exponential, and Pitman families; completeness, bounded
completeness, ancillary statistics; Basu's theorem and its applications.
(12L + 3T)
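The independence asserted by Basu's theorem can be checked by simulation. The sketch below (illustrative values: θ = 2, n = 10, 100,000 replications) uses the N(θ, 1) family, where the sample mean is complete sufficient and the sample variance is ancillary, so their sample correlation should be near zero:

```python
import numpy as np

# Basu's theorem for N(theta, 1): the sample mean (complete sufficient)
# is independent of the sample variance (ancillary), so their empirical
# correlation over many replications should be close to 0.
rng = np.random.default_rng(0)
theta = 2.0                                   # illustrative parameter value
samples = rng.normal(theta, 1.0, size=(100_000, 10))
xbar = samples.mean(axis=1)                   # complete sufficient statistic
s2 = samples.var(axis=1, ddof=1)              # ancillary statistic
corr = np.corrcoef(xbar, s2)[0, 1]
print(round(corr, 4))                         # near 0
```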
Unit 2: Problem of point estimation, unbiased estimators, minimum variance unbiased
estimator (UMVUE); Rao-Blackwell and Lehmann-Scheffé theorems and their applications; a
necessary and sufficient condition for an estimator to be UMVUE; Fisher information and
information matrix; Cramér-Rao inequality, Chapman-Robbins-Kiefer bound, Bhattacharyya
bounds, and their applications.
(12L + 3T)
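Attainment of the Cramér-Rao lower bound can likewise be illustrated numerically. In the sketch below (illustrative values: λ = 3, n = 20), the sample mean of a Poisson(λ) sample is the UMVUE of λ, its variance λ/n equals 1/(n·I₁(λ)) with per-observation information I₁(λ) = 1/λ, and the simulated variance should match the bound:

```python
import numpy as np

# Cramer-Rao bound for Poisson(lam): X-bar is UMVUE of lam and its
# variance lam/n attains the bound 1/(n * I_1(lam)), I_1(lam) = 1/lam.
rng = np.random.default_rng(1)
lam, n, reps = 3.0, 20, 200_000               # illustrative values
xbar = rng.poisson(lam, size=(reps, n)).mean(axis=1)
empirical_var = xbar.var()
crlb = lam / n                                # Cramer-Rao lower bound = 0.15
print(round(empirical_var, 3), crlb)
```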
Unit 3: Maximum likelihood estimator (MLE), properties of MLE, MLE in nonregular
families; method of scoring and its applications; method of moments, method of minimum
chi-square; U-statistics for expectation and variance and their simple properties.
(12L + 3T)
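The method of scoring can be sketched on a family where the MLE is also available in closed form, so the iteration can be verified. Assuming an Exponential sample with rate θ (score U(θ) = n/θ − Σxᵢ, Fisher information I(θ) = n/θ²), the scoring iterates θ ← θ + U(θ)/I(θ) should converge to the closed-form MLE 1/x̄:

```python
import numpy as np

# Fisher scoring for the rate theta of an Exponential(theta) sample.
# Score: U(theta) = n/theta - sum(x); information: I(theta) = n/theta^2.
def scoring_mle(x, theta=1.0, tol=1e-10, max_iter=100):
    n, s = len(x), float(sum(x))
    for _ in range(max_iter):
        step = (n / theta - s) / (n / theta**2)   # U(theta) / I(theta)
        theta += step
        if abs(step) < tol:
            break
    return theta

rng = np.random.default_rng(2)
x = rng.exponential(scale=1 / 2.5, size=1000)     # illustrative true rate 2.5
theta_hat = scoring_mle(x)
print(abs(theta_hat - 1 / x.mean()) < 1e-8)       # matches closed-form MLE
```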
Unit 4: The concepts of prior and posterior distributions; conjugate, Jeffreys, and improper
priors with examples; Bayes estimation under squared error and absolute error loss functions.
(12L + 3T)
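Conjugate updating and the two loss functions above can be illustrated with the Beta-Binomial pair (hyperparameters and data below are assumed for illustration). A Beta(a, b) prior with a Binomial(n, p) likelihood gives a Beta(a + x, b + n − x) posterior; the Bayes estimator is the posterior mean under squared error loss and the posterior median under absolute error loss (approximated here by sampling):

```python
import numpy as np

# Conjugate Bayes for a binomial proportion: Beta(a, b) prior plus
# x successes in n trials gives a Beta(a + x, b + n - x) posterior.
a, b = 2.0, 2.0                       # illustrative prior hyperparameters
n, x = 20, 14                         # illustrative data
post_a, post_b = a + x, b + n - x     # posterior parameters

post_mean = post_a / (post_a + post_b)        # Bayes estimate, squared error loss
rng = np.random.default_rng(3)
draws = rng.beta(post_a, post_b, size=100_000)
post_median = np.median(draws)                # Bayes estimate, absolute error loss
print(round(post_mean, 3), round(post_median, 3))
```

For a symmetric prior and moderate data the two estimates are close but not equal; the posterior mean here is (a + x)/(a + b + n) = 16/24 ≈ 0.667.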
References
1. Rohatgi, V. K. and Saleh, A. K. Md. E. (2015). An Introduction to Probability and
Statistics, 3rd edition, John Wiley & Sons.
2. Lehmann, E. L. (1983). Theory of Point Estimation, John Wiley & Sons.
3. Rao, C. R. (1973). Linear Statistical Inference and its Applications, 2nd edition, Wiley.
4. Kale, B. K. and Muralidharan, K. (2015). Parametric Inference: An Introduction, Alpha
Science International Ltd.
5. Mukhopadhyay, P. (2015). Mathematical Statistics, Books and Allied (P) Ltd.
6. Dudewicz, E. J. and Mishra, S. N. (1988). Modern Mathematical Statistics, John Wiley
& Sons.
7. Casella, G. and Berger, R. L. (2002). Statistical Inference, 2nd edition, Duxbury Press.
Teacher: S. B. Mahadik