Loyola College B.Sc. Statistics April 2008 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

           B.Sc. DEGREE EXAMINATION – STATISTICS

NO 24

FIFTH SEMESTER – APRIL 2008

ST 5500 – ESTIMATION THEORY

 

 

 

Date : 28-04-08                  Dept. No.                                        Max. : 100 Marks

Time : 1:00 – 4:00

PART-A

 

Answer ALL the questions:                                                             (10×2=20)

 

  1. Define ‘bias’ of an estimator.
  2. When do you say an estimator is consistent?
  3. Define a sufficient statistic.
  4. What do you mean by bounded completeness?
  5. Describe the method of moments in estimation.
  6. State the invariance property of the maximum likelihood estimator.
  7. Define a loss function and give an example.
  8. Explain ‘prior distribution’ and ‘posterior distribution’.
  9. Explain least squares estimation.
  10. Mention any two properties of the least squares estimator.
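
For revision purposes, the first two definitions asked for above are usually formalised as follows; the notation $T_n$ for an estimator of a parameter $\theta$ based on a sample of size $n$ is chosen only for this note and is not taken from the paper:

\[
\operatorname{Bias}(T_n) \;=\; E_\theta(T_n) - \theta,
\qquad
T_n \text{ is consistent for } \theta \;\Longleftrightarrow\; \lim_{n\to\infty} P\bigl(\lvert T_n - \theta\rvert > \varepsilon\bigr) = 0 \ \text{for every } \varepsilon > 0 .
\]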

 

PART-B

Answer any FIVE questions:                                                           (5×8=40)

 

  1. If Tn is asymptotically unbiased with variance approaching 0 as n → ∞, then show that Tn is consistent.
  2. Show that is an unbiased estimate of , based on a random sample drawn from .
  3. Let be a random sample of size n from  population. Examine if is complete.
  4. State and prove the Rao-Blackwell theorem.
  5. Estimate by the method of moments in the case of Pearson’s Type III distribution with p.d.f .
  6. State and establish the Bhattacharyya inequality.
  7. Describe the method of modified minimum chi-square.
  8. Write a note on Bayes estimation.
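
As an illustration of the method of moments (Question 5 of Part A and Question 5 above), the following minimal Python sketch fits a two-parameter gamma family, a common form of Pearson's Type III distribution, by matching the first two sample moments. The shape/scale parametrisation, the simulated data and the helper name gamma_moment_estimates are assumptions made only for this example and are not part of the question paper.

import numpy as np

def gamma_moment_estimates(x):
    # Method-of-moments estimates for a gamma(shape=k, scale=theta) sample.
    # Matching the population moments E[X] = k*theta and Var(X) = k*theta**2
    # to the sample mean and variance gives:
    #     theta_hat = var / mean,   k_hat = mean**2 / var
    x = np.asarray(x, dtype=float)
    mean = x.mean()
    var = x.var()                       # second central sample moment (ddof=0)
    return mean ** 2 / var, var / mean  # (k_hat, theta_hat)

# Example usage on simulated data (true shape 3, true scale 2, chosen arbitrarily).
rng = np.random.default_rng(0)
sample = rng.gamma(shape=3.0, scale=2.0, size=5000)
k_hat, theta_hat = gamma_moment_estimates(sample)
print(f"k_hat = {k_hat:.3f}, theta_hat = {theta_hat:.3f}")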

 

PART-C

Answer any TWO questions:                                                           (2×20=40)

  1. a) Let x1, x2, x3 be a random sample of size 3 from a population with mean value  and variance .  T1, T2 and T3 are the estimators used to estimate the mean value, where  and .
       i) Are T1 and T2 unbiased estimators?
       ii) Find the value of the constant in T3 such that T3 is an unbiased estimator.
       iii) With this value, is T3 a consistent estimator?
       iv) Which is the best estimator?
  1. b) If x1, x2, …, xn are random observations on a Bernoulli variate X taking the value 1 with probability p and the value 0 with probability (1 − p), show that  is a consistent estimator of p(1 − p).
  2. a) State and prove the Cramér-Rao inequality.
  2. b) Given the probability density function

show that the Cramér-Rao lower bound of the variance of an unbiased estimator of  is 2/n, where n is the size of the random sample from this distribution.          [12+8]

  3. a) State and prove the Lehmann-Scheffé theorem.
  3. b) Obtain the MLE of  in  based on an independent sample of size n. Examine whether this estimate is sufficient for .                   [12+8]
  4. a) Show that a necessary and sufficient condition for the linear parametric function  to be linearly estimable is that

rank(A) = rank

where  and

  4. b) Describe the Gauss-Markov model.                   [12+8]
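
For reference alongside Question 2(a) above, the Cramér-Rao inequality in its usual regular-case form (stated under the standard regularity conditions, with notation chosen only for this note) is:

\[
\operatorname{Var}_\theta(T) \;\ge\; \frac{\bigl[\tfrac{d}{d\theta}\,E_\theta(T)\bigr]^{2}}{\,n\,E_\theta\!\Bigl[\bigl(\tfrac{\partial}{\partial\theta}\log f(X;\theta)\bigr)^{2}\Bigr]\,},
\]

which reduces to $\operatorname{Var}_\theta(T) \ge 1/\bigl(n\,I(\theta)\bigr)$ when $T$ is unbiased for $\theta$, where $I(\theta)$ denotes the Fisher information per observation.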

 

 
