Loyola College B.Sc. Statistics Nov 2010 Estimation Theory Question Paper

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

   B.Sc. DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – NOVEMBER 2010

ST 5504/ST 5500 – ESTIMATION THEORY

 

 

 

Date : 29-10-10                     Dept. No.                                        Max. : 100 Marks

Time : 9:00 – 12:00

PART – A

Answer ALL the questions                                                                                                              [10×2=20]

 

  1. Define an unbiased estimator and give an example.
  2. Define the consistency of an estimator.
  3. When is an estimator called the most efficient estimator?
  4. Define UMVUE.
  5. State any two regularity conditions.
  6. State the invariance property of ML estimator.
  7. Explain ‘posterior’ distribution.
  8. Define Bayes estimator.
  9. State the normal equations associated with the simple regression model.
  10. Define the best linear unbiased estimator.
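A quick numerical illustration of the ideas behind questions 1 and 2: the sketch below is a hypothetical simulation (an assumed normal population, offered only as a study aid) checking that the sample mean averages out to the true mean (unbiasedness) and that its sampling variability shrinks as n grows, which is the behaviour consistency describes.

    # Minimal sketch (hypothetical simulation): the sample mean of N(mu, sigma^2)
    # observations has expectation mu (unbiasedness), and its sampling standard
    # deviation shrinks as n grows, illustrating consistency.
    import numpy as np

    rng = np.random.default_rng(0)
    mu, sigma = 5.0, 2.0          # assumed true parameters for the illustration

    for n in (10, 100, 1000):
        # 5000 replicated samples of size n, one sample mean per row
        means = rng.normal(mu, sigma, size=(5000, n)).mean(axis=1)
        print(f"n={n:4d}  mean of X-bar = {means.mean():.3f}  sd of X-bar = {means.std():.3f}")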

PART – B

 Answer any FIVE questions                                                                                                            [5×8=40]

  11. If X1, X2, . . ., Xn are random observations on a Bernoulli variate X taking the value 1 with
      probability p and the value 0 with probability (1 − p), show that X̄(1 − X̄), where
      X̄ = (X1 + X2 + . . . + Xn)/n, is a consistent estimator of p(1 − p).

  12. If X1, X2, . . ., Xn are random observations from a Poisson population with parameter λ,
      obtain the UMVUE for λ using the Lehmann–Scheffé theorem.

  13. State and prove the Neyman factorization theorem.
  14. Write short notes on the method of minimum chi-square estimation.
  15. Estimate α and p for the following distribution by the method of moments:

          f(x; α, p) = (α^p / Γ(p)) e^(−αx) x^(p−1),   0 ≤ x < ∞.

  16. Let X1, X2, . . ., Xn be a random sample of size n from a uniform population with p.d.f.

          f(x; θ) = 1,   θ − 1/2 ≤ x ≤ θ + 1/2,   −∞ < θ < ∞.

      Obtain the M.L.E. of θ.

  17. State and prove the Gauss–Markov theorem.
  18. Explain the method of least squares with an illustration.
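Questions 9, 17 and 18 all turn on the least-squares normal equations for the simple regression model y = a + b·x. The sketch below is an illustrative aid with made-up data values (not a prescribed solution): it sets up the two normal equations and solves them numerically.

    import numpy as np

    # Fit y = a + b*x by solving the least-squares normal equations:
    #   sum(y)   = n*a + b*sum(x)
    #   sum(x*y) = a*sum(x) + b*sum(x^2)
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical regressor values
    y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])   # hypothetical responses

    n = len(x)
    A = np.array([[n,       x.sum()],
                  [x.sum(), (x ** 2).sum()]])
    rhs = np.array([y.sum(), (x * y).sum()])
    a, b = np.linalg.solve(A, rhs)
    print(f"intercept a = {a:.4f}, slope b = {b:.4f}")

The same coefficients come out of np.polyfit(x, y, 1), which can serve as a cross-check.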

 

 

PART – C

Answer any TWO questions                                                                                                     [2×20=40]

  19. a) Obtain the Cramér–Rao lower bound for an unbiased estimator of           if

              f(x, θ) =

      b) Establish the Chapman–Robbins inequality.
  20. a) State and prove the Rao–Blackwell theorem and mention its importance.

      b) Let X1, X2, . . ., Xn be iid observations from {f(x; θ), θ ∈ Θ}, where each Xi, i = 1, 2, 3, 4,
         has mean μ and variance σ². Find the minimum variance unbiased estimator among the following:

              T1(x) = X1;
              T2(x) = (X1 + X2)/2;
              T3(x) = (X1 + 2X2 + 3X3)/6;
              T4(x) = (X1 + X2 + X3 + X4)/4.

  21. a) Show that the UMVUE is unique.

      b) State and prove a sufficient condition for the consistency of an estimator.

  22. a) Establish a necessary and sufficient condition for a linear parametric function to be
         estimable.

      b) In sampling from {b(1, θ), 0 < θ < 1}, obtain the Bayes estimator of θ, taking a suitable
         prior distribution.
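For question 22 b), a common choice (an assumption here, since the question leaves the prior open) is a Beta(α, β) prior, under which the posterior is Beta(α + Σxᵢ, β + n − Σxᵢ) and the Bayes estimator under squared-error loss is the posterior mean. A minimal sketch:

    def bayes_estimate(x, alpha=1.0, beta=1.0):
        """Posterior mean of theta for Bernoulli data x under an assumed Beta(alpha, beta) prior."""
        n, s = len(x), sum(x)
        return (alpha + s) / (alpha + beta + n)

    # Example with hypothetical data and the uniform Beta(1, 1) prior:
    # posterior mean = (1 + 6) / (1 + 1 + 10) = 7/12 ≈ 0.583
    sample = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    print(bayes_estimate(sample))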

 

 
