Loyola College M.Sc. Statistics April 2004 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

M.Sc., DEGREE EXAMINATION – STATISTICS

SECOND SEMESTER – APRIL 2004

ST 2801 – ESTIMATION THEORY

05.04.2004                                                                                                           Max: 100 marks

1.00 – 4.00

 

SECTION – A

 

Answer ALL the questions                                                                          (10 × 2 = 20 marks)

 

  1. What is the problem of point estimation?
  2. Show that the UMVUE of a given parametric function is unique almost surely.
  3. Define the QA-optimality criterion.
  4. Let X ~ N(0, σ²), σ > 0. Find a minimal sufficient statistic.
  5. Classify the following as location, scale, or none of the two:
     a) BVN(0, 0, θ, θ, 1/2)   b) BVN(θ, 0, 1, 1, 0.6).
  6. State the Rao-Blackwell theorem.
  7. Define the exponential family.
  8. Let X ~ B(n, p), n = 2, 3 and p = . Obtain the MLE of (n, p) based on X.
  9. Define a scale equivariant estimator.
  10. Explain minimax estimation.

 

SECTION – B

 

Answer any FIVE questions                                                                          (5 × 8 = 40 marks)

 

  11. Find the jackknifed estimator of μ² in the case of f(x) = , x ≥ μ; μ ∈ R.

 

  12. State and establish Basu’s theorem.

 

  13. Let X1, X2, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Find the UMRUE of (μ, μ/σ) with respect to any loss function which is convex in its second argument.

 

  14. Let X1, X2, …, Xn be iid U(θ − 1/2, θ + 1/2), θ ∈ R. Find a minimal sufficient statistic and examine whether it is boundedly complete.

 

  15. Given a random sample of size n from N(μ, σ²), μ ∈ R, σ > 0, find the Cramér-Rao lower bound for estimating μ/σ². Compare it with the variance of the UMVUE.
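A brief sketch of the bound, assuming the target parametric function is g(μ, σ²) = μ/σ² and the usual regularity conditions hold:

\[
I(\mu,\sigma^2)=\begin{pmatrix}1/\sigma^2 & 0\\ 0 & 1/(2\sigma^4)\end{pmatrix},\qquad
\nabla g=\begin{pmatrix}1/\sigma^2\\ -\mu/\sigma^4\end{pmatrix},\qquad
\mathrm{CRLB}=\nabla g^{\top}(nI)^{-1}\nabla g=\frac{1}{n\sigma^2}+\frac{2\mu^2}{n\sigma^4}.
\]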

 

  16. State and establish the invariance property of a CAN estimator.
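A one-line sketch of the property, under the usual delta-method conditions (g differentiable at θ with g′(θ) ≠ 0):

\[
\sqrt{n}\,(T_n-\theta)\xrightarrow{d}N\big(0,\sigma^2(\theta)\big)
\;\Longrightarrow\;
\sqrt{n}\,\big(g(T_n)-g(\theta)\big)\xrightarrow{d}N\big(0,[g'(\theta)]^2\,\sigma^2(\theta)\big).
\]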

 

  17. Given a random sample from a location family with location parameter ξ, show that d*(X) = d0(X) − v*(Y) is the MREE of ξ with respect to any invariant loss function ρ, where d0 is an LEE, Y = (X1 − Xn, …, Xn−1 − Xn), and v*(y) minimizes E0{ρ(d0(X) − v) | Y = y} with respect to v.

 

  18. Let X ~ N(θ, 1), θ ∈ R. Find the Bayes estimator of θ with respect to squared error loss if the prior of θ is N(0, 1).
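A short sketch of the standard normal-normal conjugate calculation for this setting:

\[
\pi(\theta\mid x)\propto e^{-(x-\theta)^2/2}\,e^{-\theta^2/2}
\;\Longrightarrow\;
\theta\mid x\sim N\!\left(\tfrac{x}{2},\tfrac{1}{2}\right),
\qquad
\delta_{\pi}(x)=E(\theta\mid x)=\tfrac{x}{2}.
\]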

 

 

 

SECTION – C

 

Answer any TWO questions                                                                        (2 × 20 = 40 marks)

 

  19. a) Give an example for each of the following:
      i) Ug is empty;   ii) Ug is a singleton.

 

  b) Let X be DU{1, 2, …, N}, N = 1, 2, 3, 4, … . Find the QA-optimal estimator of (N, N²).

(12+8)

  20. a) Show that a vector unbiased estimator is D-optimal if and only if each of its components is a UMVUE.

 

  b) State and establish the Lehmann-Scheffé theorem.                                        (12+8)

 

  21. a) Let X1, X2, …, Xn be iid N(0, σ²), σ > 0. Find the MREE of σ^r with respect to standardized squared error loss.

 

  b) Let (Xi, Yi), i = 1, 2, …, n be a random sample from the ACBVE distribution with pdf

f(x, y) = {(2α + β)(α + β)/2} exp{−α(x + y) − β max(x, y)},  x, y > 0.

Find (i) the MLE of (α, β) and (ii) examine whether the MLE is consistent.           (8+8+4)

 

  22. Write short notes on:
      a) Jackknifing method
      b) Fisher information
      c) Location-scale family                                                  (10+5+5)
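For part a), the jackknife in its usual bias-reducing form (notation here is illustrative): given leave-one-out recomputations of an estimator θ̂,

\[
\hat\theta_{(i)}=\hat\theta\big(X_1,\dots,X_{i-1},X_{i+1},\dots,X_n\big),\qquad
\hat\theta_{J}=n\,\hat\theta-\frac{n-1}{n}\sum_{i=1}^{n}\hat\theta_{(i)},
\]

which removes the leading O(1/n) term of the bias of θ̂.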

 
