LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034
B.Sc. DEGREE EXAMINATION – STATISTICS
FIFTH SEMESTER – APRIL 2012
ST 5504/ST 5500 – ESTIMATION THEORY
Date: 25-04-2012    Time: 9:00 – 12:00    Max.: 100 Marks
PART-A
Answer ALL questions: (10×2=20 marks)
1) Define consistent estimator.
2) State the characteristics of a good estimator.
3) Define sufficient statistic.
4) State the factorization theorem.
5) Mention any two properties of MLE.
6) Explain the concept of the method of moments estimation.
7) Define prior and posterior probability distributions.
8) State the Gauss-Markov linear model.
9) Define BLUE.
10) Write down the normal equations of a simple linear regression model.
PART-B
Answer any FIVE questions: (5×8=40 marks)
11) Show that the sample variance is a consistent estimator of the population variance
of a normal distribution.
12) Define Completeness with an example. Also, give an example of a family which
is not complete.
13) Let x1, x2, …, xn be a random sample from a N(µ, σ²) population. Find sufficient
estimators for µ and σ².
14) Explain the method of minimum chi-square estimation.
15) Find the MLE for the parameter λ of a Poisson distribution on the basis of a sample
of size n and hence obtain the MLE of P[X ≤ 1].
16) State and prove the factorization theorem on sufficient statistics in the one-parameter
discrete case.
17) Obtain the method of moments estimators for the uniform distribution U(a, b).
18) Obtain the Bayes estimator using a random sample of size n when
f(x; θ) = …, x = 0, 1, 2, …
and the p.d.f. of θ is a two-parameter gamma distribution.
PART-C
Answer any TWO questions: (2×20=40 marks)
19) (a) State and prove the Chapman-Robbins inequality and mention its importance.
(b) Obtain the minimum variance bound estimator for µ in a normal population
N(µ, σ²) where σ² is known.
20) (a) State and prove the Rao-Blackwell theorem.
(b) Let x1, x2, …, xn be a random sample from a U(0, θ) population. Obtain the
UMVUE for θ.
21) (a) Explain the concept of the maximum likelihood estimator.
(b) In random sampling from a normal population N(µ, σ²), find the MLE for
(i) µ when σ² is known and (ii) σ² when µ is known.
22) (a) Explain the concept of the method of least squares.
(b) State and prove the necessary and sufficient condition for a parametric function
to be linearly estimable.
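For reference alongside questions 8, 9, and 22(a), a minimal sketch of the Gauss-Markov linear model and its least-squares normal equations; the matrix notation Y, X, β, ε used below is one common convention and is not fixed by the paper.

% Gauss-Markov linear model: Y is the n x 1 vector of observations, X the n x p known
% design matrix, beta the p x 1 vector of unknown parameters, epsilon the random errors.
Y = X\beta + \varepsilon, \qquad E(\varepsilon) = 0, \qquad \operatorname{Var}(\varepsilon) = \sigma^{2} I_{n}
% The method of least squares minimises (Y - X\beta)'(Y - X\beta); setting the derivative
% with respect to \beta equal to zero gives the normal equations, and when X'X is
% non-singular their solution is the least-squares (BLUE) estimator of \beta.
X'X\hat{\beta} = X'Y, \qquad \hat{\beta} = (X'X)^{-1} X'Y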