LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034
B.Sc. DEGREE EXAMINATION – STATISTICS
FIFTH SEMESTER – NOVEMBER 2004
ST 5500/STA 505/S 515 – ESTIMATION THEORY
25.10.2004                                                   Max: 100 marks
9.00 – 12.00 Noon
SECTION – A
Answer ALL the questions. (10 × 2 = 20 marks)
1. State the problem of point estimation.
2. Define the 'bias' of an estimator in estimating a parametric function.
3. Define a 'consistent estimator'.
4. Define 'efficiency' of an estimator.
5. Explain 'Uniformly Minimum Variance Unbiased Estimator' (UMVUE).
6. What is the Cramér–Rao lower bound?
7. Define 'bounded completeness'.
8. Examine whether the family {N(0, σ²) : σ² > 0} is complete.
9. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
10. Explain BLUE (Best Linear Unbiased Estimator).
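As an illustrative aside on the sufficiency question above (a sketch, not part of the paper): with X1, X2 ∈ {0, 1}, the statistic T = X1 + 3X2 takes a distinct value at each of the four sample points, so T determines the sample exactly and the conditional distribution of (X1, X2) given T is free of θ, which is why T is sufficient. A quick enumeration confirms the map is one-to-one:

```python
from itertools import product

# T = X1 + 3*X2 takes a distinct value for every (x1, x2) pair in {0,1}^2,
# so T determines the whole sample and is trivially sufficient for theta.
samples = list(product([0, 1], repeat=2))
t_values = [x1 + 3 * x2 for x1, x2 in samples]
assert len(set(t_values)) == len(samples)  # the map (x1, x2) -> t is injective
print(dict(zip(samples, t_values)))  # → {(0, 0): 0, (0, 1): 3, (1, 0): 1, (1, 1): 4}
```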
SECTION – B
Answer any FIVE questions. (5 × 8 = 40 marks)
11. Show that the sample variance is a biased estimator of the population variance σ². Suggest a UBE of σ².
12. If Tn is a consistent estimator of φ(θ), show that there exist infinitely many consistent estimators of φ(θ).
13. State and derive the Cramér–Rao inequality.
14. Show that a UMVUE is essentially unique.
15. Give an example to show that bounded completeness does not imply completeness.
16. Show that the sample mean is a complete sufficient statistic in the case of P(θ), θ > 0.
17. State and establish the Lehmann–Scheffé theorem.
18. State and prove the invariance property of the MLE.
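The question on the bias of the sample variance can be seen numerically. The following simulation (an illustrative sketch, not part of the paper) shows that the divisor-n sample variance has expectation ((n−1)/n)·σ², so rescaling by n/(n−1), i.e. using divisor n−1, yields an unbiased estimator:

```python
import numpy as np

# With divisor n, the sample variance has expectation ((n-1)/n) * sigma^2,
# so it is biased; dividing by n-1 instead removes the bias.
rng = np.random.default_rng(0)
n, sigma2, reps = 5, 4.0, 200_000
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
biased = ((x - x.mean(axis=1, keepdims=True)) ** 2).mean(axis=1)
print(biased.mean())                  # ≈ (n-1)/n * sigma2 = 3.2
print(biased.mean() * n / (n - 1))    # ≈ sigma2 = 4.0
```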
SECTION – C
Answer any TWO questions. (2 × 20 = 40 marks)
19. a) Let f(x; θ) = …, and 0 otherwise. Based on a random sample of size n, suggest a UBE of
       (i) θ when σ is known, and
       (ii) σ when θ is known. (5 + 5)
    b) Obtain the CRLB for estimating θ in the case of f(x; θ) = …, x ∈ R, θ ∈ R, based on a random sample of size n. (10)
20. a) State and establish the factorization theorem in the discrete case.
    b) Obtain a sufficient statistic for θ = (μ, σ²) based on a random sample of size n from N(μ, σ²), μ ∈ R, σ² > 0. (12 + 8)
21. a) Explain the method of maximum likelihood.
    b) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of θ = (μ, σ²). (5 + 15)
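For the MLE question above, the standard result is μ̂ = X̄ and σ̂² = (1/n)Σ(Xi − X̄)². A small numerical check (illustrative only, with made-up data) confirms that the log-likelihood is maximised at exactly these values:

```python
import numpy as np

# For N(mu, sigma^2), the MLEs are mu_hat = sample mean and
# sigma2_hat = (1/n) * sum((x - x_bar)^2). We verify that the
# log-likelihood at (mu_hat, sigma2_hat) beats nearby parameter values.
def loglik(x, mu, s2):
    n = len(x)
    return -0.5 * n * np.log(2 * np.pi * s2) - ((x - mu) ** 2).sum() / (2 * s2)

rng = np.random.default_rng(1)
x = rng.normal(2.0, 3.0, size=1000)          # true mu = 2, sigma^2 = 9
mu_hat, s2_hat = x.mean(), ((x - x.mean()) ** 2).mean()
best = loglik(x, mu_hat, s2_hat)
for dm, ds in [(0.1, 0), (-0.1, 0), (0, 0.5), (0, -0.5)]:
    assert loglik(x, mu_hat + dm, s2_hat + ds) < best  # any perturbation lowers it
print(mu_hat, s2_hat)
```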
22. a) Describe the method of minimum chi-square and the method of modified minimum chi-square.
    b) Describe the linear model in the Gauss–Markov set-up.
    c) Let y = Aβ + ε be the linear model, where E(ε) = 0. Show that a necessary and sufficient condition for a linear function of the parameters β to be linearly estimable is that rank(A) = rank(…). (10 + 6 + 4)