Loyola College B.Sc. Statistics Nov 2004 Estimation Theory Question Paper

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – NOVEMBER 2004

ST 5500/STA 505/S 515 – ESTIMATION THEORY

25.10.2004                                                                                                           Max: 100 marks

9.00 – 12.00 Noon

 

SECTION – A

 

Answer ALL the questions                                                                            (10 × 2 = 20 marks)

 

  1. State the problem of point estimation.
  2. Define ‘bias’ of an estimator in estimating a parametric function.
  3. Define a ‘consistent estimator’.
  4. Define ‘efficiency’ of an estimator.
  5. Explain ‘Uniformly Minimum Variance Unbiased Estimator’ (UMVUE).
  6. What is the Cramér–Rao lower bound?
  7. Define ‘bounded completeness’.
  8. Examine whether the family {N(0, σ²), σ² > 0} is complete.
  9. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
  10. Explain BLUE (Best Linear Unbiased Estimator).
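As a study aid (not part of the original paper), the ‘bias’ asked about in question 2 can be checked numerically. The sketch below, assuming Python with NumPy and illustrative numbers chosen here, estimates the bias of the sample maximum as an estimator of θ for Uniform(0, θ), where the exact bias is −θ/(n + 1):

```python
import numpy as np

# Monte Carlo check of the definition bias(T) = E[T] - g(theta).
# For X ~ Uniform(0, theta), T = max(X_1, ..., X_n) has
# E[T] = n*theta/(n+1), so bias(T) = -theta/(n+1).
rng = np.random.default_rng(0)
theta, n, reps = 2.0, 10, 200_000

samples = rng.uniform(0.0, theta, size=(reps, n))
T = samples.max(axis=1)              # one estimate per simulated sample

est_bias = T.mean() - theta          # Monte Carlo estimate of the bias
exact_bias = -theta / (n + 1)        # closed-form value: -2/11
print(est_bias, exact_bias)
```

With 200,000 replications the simulated bias agrees with −θ/(n + 1) to about two decimal places.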

SECTION – B

 

Answer any FIVE questions.                                                                          (5 × 8 = 40 marks)

 

  1. Show that the sample variance is a biased estimator of the population variance σ². Suggest an unbiased estimator of σ².
  2. If Tn is a consistent estimator of φ(θ), show that there exist infinitely many consistent estimators of φ(θ).
  3. State and derive the Cramér–Rao inequality.
  4. Show that the UMVUE is essentially unique.
  5. Give an example to show that bounded completeness does not imply completeness.
  6. Show that the sample mean is a complete sufficient statistic in the case of P(θ), θ > 0.
  7. State and establish the Lehmann–Scheffé theorem.
  8. State and prove the ‘invariance property’ of the MLE.
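As a study aid (not part of the original paper), the claim in question 1 of this section can be verified by simulation. The sketch below, assuming Python with NumPy and illustrative numbers chosen here, shows that the divisor-n sample variance averages to ((n − 1)/n)σ² while the divisor-(n − 1) version averages to σ²:

```python
import numpy as np

# The sample variance with divisor n has expectation (n-1)/n * sigma^2,
# so it is biased; dividing by n-1 instead gives an unbiased estimator.
rng = np.random.default_rng(1)
sigma2, n, reps = 4.0, 5, 200_000

x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
biased = x.var(axis=1, ddof=0).mean()     # ~ (n-1)/n * sigma^2 = 3.2
unbiased = x.var(axis=1, ddof=1).mean()   # ~ sigma^2 = 4.0
print(biased, unbiased)
```

NumPy's `ddof` argument switches the divisor between n (`ddof=0`) and n − 1 (`ddof=1`), which is exactly the correction the question asks for.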

 

SECTION – C

 

Answer any TWO questions                                                                          (2 × 20 = 40 marks)

 

 

  1. a) Let f(x; θ, σ) = (1/σ) exp{−(x − θ)/σ}, x > θ; 0, otherwise (σ > 0).

Based on a random sample of size n, suggest an unbiased estimator of

  1. θ when σ is known, and
  2. σ when θ is known.          (5+5)

 

  1. b) Obtain the CRLB for estimating θ in the case of f(x; θ), x ∈ R, θ ∈ R,

based on a random sample of size n.                                                                          (10)

  2. a) State and establish the factorization theorem in the discrete case.

  2. b) Obtain a sufficient statistic for θ = (μ, σ²) based on a random sample of size n from

N(μ, σ²), μ ∈ R, σ² > 0.                                                                                     (12 + 8)

 

  3. a) Explain the method of maximum likelihood.

  3. b) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of

θ = (μ, σ²).                                                                                                           (5 + 15)
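As a study aid (not part of the original paper), the answer to part b) above has the well-known closed form μ̂ = x̄ and σ̂² = (1/n)Σ(xᵢ − x̄)². The sketch below, assuming Python with NumPy and illustrative numbers chosen here, computes these and checks that nearby parameter values give a smaller log-likelihood:

```python
import numpy as np

def normal_loglik(x, mu, s2):
    # Log-likelihood of a N(mu, s2) sample.
    return -0.5 * len(x) * np.log(2 * np.pi * s2) - ((x - mu) ** 2).sum() / (2 * s2)

rng = np.random.default_rng(2)
x = rng.normal(10.0, 3.0, size=1000)   # true mu = 10, sigma^2 = 9

# Closed-form MLEs: sample mean and divisor-n sample variance.
mu_hat = x.mean()
s2_hat = ((x - mu_hat) ** 2).mean()

# The MLE should dominate perturbed parameter values.
for dmu, ds2 in [(0.1, 0.0), (0.0, 0.5), (-0.2, -0.3)]:
    assert normal_loglik(x, mu_hat, s2_hat) >= normal_loglik(x, mu_hat + dmu, s2_hat + ds2)
print(mu_hat, s2_hat)
```

Note the MLE of σ² uses divisor n, so it is the biased estimator from Section B, question 1; the invariance property (Section B, question 8) then gives, e.g., the MLE of σ as its square root.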

 

  4. a) Describe the method of minimum chi-square and the method of modified minimum

chi-square.

  4. b) Describe the linear model in the Gauss–Markov set-up.

  4. c) Let y = Aβ + ε be the linear model, where E(ε) = 0. Show that a necessary and

sufficient condition for the linear function λ′β of the parameters to be linearly

estimable is that rank(A) = rank(A′ ⋮ λ).                                                           (10 + 6 + 4)
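As a study aid (not part of the original paper), the rank condition in part c) says λ′β is estimable exactly when λ lies in the row space of A. The sketch below, assuming Python with NumPy and a design matrix chosen here for illustration, checks the condition numerically for a one-way layout:

```python
import numpy as np

def is_estimable(A, lam):
    # lambda' beta is estimable in y = A beta + e (E(e) = 0) iff
    # lambda is in the row space of A, i.e. appending lam as a column
    # to A' does not raise the rank.
    A = np.asarray(A, dtype=float)
    lam = np.asarray(lam, dtype=float).reshape(-1, 1)
    return bool(np.linalg.matrix_rank(A) == np.linalg.matrix_rank(np.hstack([A.T, lam])))

# Over-parameterised one-way layout, columns = (overall mean, group 1, group 2).
A = np.array([[1, 1, 0],
              [1, 1, 0],
              [1, 0, 1]])

print(is_estimable(A, [0, 1, -1]))  # group contrast: estimable
print(is_estimable(A, [0, 1, 0]))   # a single group effect alone: not estimable
```

Here rank(A) = 2, the contrast (0, 1, −1) equals row 1 minus row 3 of A and so is estimable, while (0, 1, 0) is not in the row space, matching the theorem.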
