LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034
M.Sc. DEGREE EXAMINATION – STATISTICS
SECOND SEMESTER – APRIL 2006
ST 2808 – ESTIMATION THEORY
Date & Time: 19-04-2006 / FORENOON          Max.: 100 Marks
PART – A
Answer ALL questions. Each carries TWO marks. (10 x 2 = 20 marks)
1. If the class of unbiased estimators of a parametric function is neither empty nor a singleton, then show that the class is uncountable.
2. Prove or disprove the uniqueness of the UMVUE.
3. If δ is a UMVUE and is bounded, then show that any polynomial in δ is also a UMVUE.
4. State the Chapman–Robbins inequality.
5. Suppose δ is sufficient for P and P0 ⊂ P. Show that δ is sufficient for P0.
6. Let S be a sufficient statistic. If likelihood equivalence of x and y in the support A of the random variable X implies S(x) = S(y) for all x, y ∈ A, then show that S is minimal sufficient.
7. Let X1, X2 be a random sample from N(θ, 1), θ ∈ R. Verify whether or not (X1, X2) is complete.
8. Give two examples of a location-scale family of distributions.
9. Define an ancillary statistic and give an example.
10. Give an example of an M-estimator Tn of θ which can be thought of as a weighted average of the sample values, with the weights depending on the data (a computational sketch follows this part).
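A minimal computational sketch related to question 10, assuming NumPy (the tuning constant k = 1.345, the function names and the simulated data are illustrative choices): the Huber M-estimator of location solves Σ ψ((xi − T)/s) = 0, which can be rearranged as T = Σ wi xi / Σ wi with data-dependent weights wi = ψ(ui)/ui and iterated to convergence.

```python
import numpy as np

def huber_weight(u, k=1.345):
    """Weight w(u) = psi(u)/u for the Huber psi function."""
    au = np.abs(u)
    return np.where(au <= k, 1.0, k / np.maximum(au, 1e-12))

def huber_m_location(x, k=1.345, tol=1e-8, max_iter=200):
    """Huber M-estimator of location via iteratively reweighted averaging:
    at each step T = sum(w_i * x_i) / sum(w_i), with weights w_i computed
    from the current standardised residuals."""
    x = np.asarray(x, dtype=float)
    t = np.median(x)                        # robust starting value
    s = np.median(np.abs(x - t)) / 0.6745   # MAD scale estimate
    if s == 0:
        s = 1.0
    for _ in range(max_iter):
        w = huber_weight((x - t) / s, k)
        t_new = np.sum(w * x) / np.sum(w)   # weighted average of the sample
        if abs(t_new - t) < tol:
            return t_new
        t = t_new
    return t

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(8.0, 1.0, 5)])
    print(huber_m_location(data))  # stays close to 0 despite the outlying cluster
```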
PART – B
Answer any FIVE questions. Each carries EIGHT marks. (5 x 8 = 40 marks)
11. Let X be a discrete random variable with pmf pθ(x) = θ if x = −1 and pθ(x) = (1 − θ)² θ^x if x = 0, 1, 2, …, where 0 < θ < 1. Find the class U0 of unbiased estimators of zero and hence find the class Ug of unbiased estimators of g(θ) = θ, 0 < θ < 1.
12. Give an example where only constant estimable parametric functions have UMVUEs.
13. Give an example of a UMVUE whose variance is greater than the Chapman–Robbins lower bound.
14. Let X1, …, Xn be a random sample of size n from N(θ, 1), θ ∈ R. Using Fisher information, show that Σ αi Xi is sufficient if and only if the αi are equal for all i.
15. Let X1, …, Xn be a random sample of size n from U(0, θ), θ > 0. Show that S(X1, …, Xn) = X(n) is minimal sufficient.
16. Show that a complete sufficient statistic, if it exists, is minimal sufficient.
17. Let X1, …, Xn be a random sample of size n from B(m, θ), with m known and θ unknown. Show that the joint distribution of (X1, …, Xn) belongs to an exponential family. Hence find the mgf of Xi.
18. Let X ~ N(θ, 1), θ ∈ R, and let the prior distribution of θ be N(0, 1). Find the Bayes estimator of θ when the loss function is (a) squared error and (b) absolute error. (A numerical sketch follows this part.)
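As a numerical companion to question 18, a grid approximation of the posterior recovers both Bayes rules at once (the posterior mean for squared error loss and the posterior median for absolute error loss); a minimal sketch assuming NumPy, with the grid settings, the observed value x = 1.8 and the function name chosen purely for illustration:

```python
import numpy as np

def posterior_mean_median(x_obs, half_width=10.0, n_grid=200001):
    """Grid approximation of the posterior of theta in the model
    X | theta ~ N(theta, 1) with prior theta ~ N(0, 1).
    Returns the posterior mean and the posterior median."""
    theta = np.linspace(-half_width, half_width, n_grid)
    dx = theta[1] - theta[0]
    # log of (likelihood * prior), up to an additive constant
    log_post = -0.5 * (x_obs - theta) ** 2 - 0.5 * theta ** 2
    post = np.exp(log_post - log_post.max())
    post /= post.sum() * dx                    # normalise to a density
    mean = np.sum(theta * post) * dx
    cdf = np.cumsum(post) * dx
    median = theta[np.searchsorted(cdf, 0.5)]
    return mean, median

if __name__ == "__main__":
    x = 1.8
    mean, median = posterior_mean_median(x)
    print(mean, median, x / 2)  # all three agree: the posterior here is N(x/2, 1/2)
```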
PART – C
Answer any TWO questions. Each carries TWENTY marks. (2 x 20 = 40 marks)
19(a). State and prove a necessary and sufficient condition for an estimator in the class Ug to be a UMVUE. (10)
19(b). Derive the Chapman–Robbins inequality using the covariance inequality. (10)
20(a). Give an example of a family which is not boundedly complete. (10)
20(b). Let X1, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Show that the distribution of (X1, …, Xn) belongs to a two-parameter exponential family. Hence, using Basu's theorem, establish the independence of the sample mean X̄ and s². (10)
21(a). Prove that δ* is a D-optimal estimator of g(θ) if and only if each component of δ* is a UMVUE. (14)
21(b). Let X1, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Obtain the jackknife estimator of the variance σ². (6) (A computational sketch follows Part C.)
22(a). State and prove the Lehmann–Scheffé theorem for a convex loss function. (8)
22(b). Let X have the p.d.f. f(x − ξ). If δ is a location equivariant estimator, then show that the bias, risk and variance of δ do not depend on ξ. (6)
22(c). Let X1, …, Xn be a random sample from N(ξ, 1), ξ ∈ R. Find the MRE estimator of ξ when the loss is squared error. (6)
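A computational sketch for question 21(b), assuming NumPy (the function names and the simulated data are illustrative): the generic jackknife combines the full-sample estimate with the leave-one-out estimates into a bias-corrected value, and applying it to the divide-by-n plug-in variance reproduces the usual unbiased s².

```python
import numpy as np

def jackknife(x, statistic):
    """Jackknife bias-corrected estimate:
    theta_jack = n * theta_hat - (n - 1) * mean of the leave-one-out estimates."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta_hat = statistic(x)
    loo = np.array([statistic(np.delete(x, i)) for i in range(n)])
    return n * theta_hat - (n - 1) * loo.mean()

def plug_in_variance(x):
    """Divide-by-n (biased) estimator of the variance."""
    return np.mean((x - np.mean(x)) ** 2)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    sample = rng.normal(5.0, 2.0, size=30)
    print(jackknife(sample, plug_in_variance))  # matches the unbiased s^2 ...
    print(sample.var(ddof=1))                   # ... computed directly
```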