BS2 Statistical Inference I, MT04

Overview of Lectures

In Michaelmas term the lectures will be concerned with frequentist inference. The term refers to the fact that uncertainty statements and judgements are based on a long-run frequency interpretation of the sampling distribution of the observations.
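
As an informal illustration of this idea (a sketch only, not part of the course material), the short Python simulation below repeatedly samples from a normal distribution with known mean and variance and records how often the standard 95% confidence interval for the mean covers the true value; in the long run the empirical coverage settles near the nominal 0.95. The sample size, number of replications and distribution are arbitrary choices made purely for illustration.

    import math
    import random
    import statistics

    random.seed(1)
    true_mean, sigma = 0.0, 1.0     # known population parameters (illustrative choice)
    n, replications = 30, 10000     # sample size and number of repeated experiments
    z = 1.96                        # normal quantile for a nominal 95% interval

    covered = 0
    for _ in range(replications):
        sample = [random.gauss(true_mean, sigma) for _ in range(n)]
        xbar = statistics.mean(sample)
        half_width = z * sigma / math.sqrt(n)   # known-variance interval for the mean
        if xbar - half_width <= true_mean <= xbar + half_width:
            covered += 1

    print("empirical coverage:", covered / replications)   # close to 0.95 in the long run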

There will inevitably be some overlap with previous statistics courses, but the material is approached from a slightly different and more general perspective.

The preliminary content of each of the lectures is as follows:

  1. Inference
  2. Properties of estimators
  3. Unbiased estimators and sufficiency
  4. Methods of estimation; ancillarity
  5. Exponential families of distributions
  6. Maximum likelihood in exponential families
  7. Asymptotic properties of the MLE
  8. More on asymptotics and computation of the MLE
  9. The method of scoring and the EM algorithm
  10. The EM algorithm - Example
  11. Hypothesis testing
  12. Locally most powerful and large sample tests
  13. The maximized likelihood ratio test
  14. The sequential probability ratio test
  15. More on the sequential probability ratio test
  16. Course overview


The synopsis recommends reading

K. Knight, Mathematical Statistics, Chapman and Hall/CRC, 1999.

P. Garthwaite, I. Jolliffe and B. Jones, Statistical Inference, OUP, 2002.

and so do I. In addition, I suggest the lovely little book

S. D. Silvey, Statistical Inference, Chapman and Hall, 1975.

For more advanced and comprehensive material, consult e.g.

C. R. Rao, Linear Statistical Inference and its Applications, 2nd ed., Wiley, 1973.

D. R. Cox and D. V. Hinkley, Theoretical Statistics, Chapman and Hall, 1974.



Last updated: Friday, 03 December 2004 11:39 by Steffen L. Lauritzen