LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034
B.Sc. DEGREE EXAMINATION – STATISTICS
FIFTH SEMESTER – NOVEMBER 2010
ST 5504/ST 5500 – ESTIMATION THEORY
Date : 29-10-10 Dept. No. Max. : 100 Marks
Time : 9:00 – 12:00
PART – A
Answer ALL the questions [10×2=20]
- Define an unbiased estimator and give an example.
- Define the consistency of an estimator.
- When is an estimator said to be the most efficient estimator?
- Define UMVUE.
- State any two regularity conditions.
- State the invariance property of ML estimator.
- Explain ‘posterior’ distribution.
- Define Bayes estimator.
- State the normal equations associated with the simple regression model.
- Define the best linear unbiased estimator (BLUE).
PART – B
Answer any FIVE questions [5×8=40]
- If X1, X2, . . ., Xn are random observations on a Bernoulli variate X taking the value 1 with
probability p and the value 0 with probability (1−p), show that X̄(1 − X̄), where X̄ = (X1 + . . . + Xn)/n,
is a consistent estimator of p(1−p).
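A quick simulation (not part of the paper; a sketch assuming the statistic intended here is the standard one, X̄(1 − X̄)) illustrates the consistency claim numerically:

```python
import random

# Sketch: empirical check that T = xbar * (1 - xbar) converges to p*(1 - p)
# as the Bernoulli sample size n grows (consistency).
random.seed(0)
p = 0.3
for n in (100, 10_000, 200_000):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    xbar = sum(xs) / n
    t = xbar * (1 - xbar)
    print(n, round(t, 4))  # approaches p*(1 - p) = 0.21
```

As n grows, the printed values settle near p(1 − p) = 0.21, which is what consistency asserts; a formal proof would use the weak law of large numbers plus continuity of t ↦ t(1 − t).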
- If X1, X2, . . ., Xn are random observations from a Poisson population with parameter λ,
obtain the UMVUE for λ using the Lehmann–Scheffé Theorem.
- State and prove Neyman's Factorization Theorem.
- Write short notes on the method of minimum chi-square estimation.
- Estimate α and p for the following distribution by the method of moments:
[the density was not reproduced in the source].
- Let X1, X2, . . ., Xn be a random sample of size n from a uniform population with p.d.f.:
f(x; θ) = 1, θ − 1/2 ≤ x ≤ θ + 1/2, −∞ < θ < ∞.
Obtain the M.L.E. for θ.
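For this uniform family the M.L.E. is famously non-unique: the likelihood equals 1 exactly when max(x) − 1/2 ≤ θ ≤ min(x) + 1/2, so every θ in that interval maximizes it. A small sketch (not part of the paper) computes the interval and the midrange, a conventional choice:

```python
# Sketch: the likelihood for U(theta - 1/2, theta + 1/2) is 1 on the interval
# [max(x) - 1/2, min(x) + 1/2] and 0 elsewhere, so any theta in that
# interval is an MLE; the midrange is one common representative.
def mle_interval(xs):
    return (max(xs) - 0.5, min(xs) + 0.5)

xs = [2.1, 2.4, 1.9, 2.3]
lo, hi = mle_interval(xs)
midrange = (min(xs) + max(xs)) / 2  # one conventional MLE
print(lo, hi, midrange)
```

Any value in [lo, hi] is a valid answer; the midrange always lies inside it.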
- State and prove the Gauss–Markov Theorem.
- Explain the method of Least Squares with an illustration.
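As an illustration of least squares (not part of the paper; the data below are made up for demonstration), the normal equations for the simple regression y = a + b·x have the familiar closed-form solution:

```python
# Sketch: solve the normal equations for simple linear regression
# y = a + b*x, giving the usual closed-form slope and intercept.
def fit_line(xs, ys):
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sxy / sxx            # slope
    a = ybar - b * xbar      # intercept
    return a, b

a, b = fit_line([1, 2, 3, 4], [2.1, 4.0, 6.2, 7.9])
print(a, b)  # close to intercept 0 and slope 2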
PART – C
Answer any TWO questions [2×20=40]
- a) Obtain the Cramér–Rao lower bound for an unbiased estimator of the parameter when
f(x, θ) = [the density was not reproduced in the source].
- b) Establish Chapman-Robbins Inequality.
- a) State and prove the Rao–Blackwell Theorem and mention its importance.
- b) Let X1, X2, X3, X4 be iid observations from {f(·; θ), θ ∈ Θ}, where each Xi has mean μ and
variance σ². Find the minimum variance unbiased estimator among the following:
T1(x) = X1;
T2(x) = (X1 + X2)/2;
T3(x) = (X1 + 2X2 + 3X3)/6;
T4(x) = (X1 + X2 + X3 + X4)/4.
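A numeric check of part (b) (a sketch, not part of the paper): for iid Xi with variance σ², a linear combination Σ cᵢXᵢ has variance σ²·Σ cᵢ², and it is unbiased for μ exactly when the weights sum to 1.

```python
# Sketch: compare the four linear unbiased candidates by their
# variance coefficients sum(c_i^2); the smallest one wins.
candidates = {
    "T1": [1.0],
    "T2": [1/2, 1/2],
    "T3": [1/6, 2/6, 3/6],
    "T4": [1/4, 1/4, 1/4, 1/4],
}
var_coeff = {}
for name, c in candidates.items():
    assert abs(sum(c) - 1.0) < 1e-12         # unbiasedness: weights sum to 1
    var_coeff[name] = sum(w * w for w in c)  # Var(T) / sigma^2
print(var_coeff)  # T4 attains the smallest coefficient, 1/4
```

T4, the full sample mean, gives the minimum variance (σ²/4), in line with the general fact that equal weights minimize Σ cᵢ² subject to Σ cᵢ = 1.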
- a) Show that UMVUE is unique.
- b) State and prove a sufficient condition for consistency of an estimator.
- a) Establish a necessary and sufficient condition for a linear parametric function to be
estimable.
- b) In sampling from {b(1, θ), 0 < θ < 1}, obtain the Bayes estimator of θ, taking a suitable
prior distribution.
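A common solution to part (b) takes a conjugate Beta(a, b) prior, under which (with squared-error loss) the Bayes estimator is the posterior mean. A sketch (not part of the paper; the prior choice and loss function are the usual assumptions, not stated in the question):

```python
# Sketch: Bayes estimator for theta in b(1, theta) under a Beta(a, b) prior
# and squared-error loss. The posterior is Beta(a + s, b + n - s) for
# s = sum(x), so the Bayes estimate is the posterior mean (a + s)/(a + b + n).
def bayes_estimate(xs, a=1.0, b=1.0):
    s, n = sum(xs), len(xs)
    return (a + s) / (a + b + n)

xs = [1, 0, 1, 1, 0]
theta_hat = bayes_estimate(xs)       # uniform prior: (1 + 3) / (2 + 5) = 4/7
print(theta_hat)
```

With the uniform Beta(1, 1) prior this reduces to (Σx + 1)/(n + 2), a shrunk version of the sample proportion.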