Loyola College M.Sc. Statistics April 2004 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

M.Sc., DEGREE EXAMINATION – STATISTICS

SECOND SEMESTER – APRIL 2004

ST 2801 – ESTIMATION THEORY

05.04.2004                                                                                                           Max:100 marks

1.00 – 4.00

 

SECTION – A

 

Answer ALL the questions                                                                          (10 × 2 = 20 marks)

 

  1. What is the problem of point estimation?
  2. Show that the UMVUE of a given parametric function is unique almost surely.
  3. Define the QA-optimality criterion.
  4. Let X ~ N(0, σ²), σ > 0. Find a minimal sufficient statistic.
  5. Classify the following as location family, scale family, or neither of the two:
     a) BVN(0, 0, θ, θ, 1/2)   b) BVN(θ, 0, 1, 1, 0.6).
  6. State the Rao-Blackwell theorem.
  7. Define exponential family.
  8. Let X ~ B(n, p), n = 2, 3 and p = . Obtain the MLE of (n, p) based on X.
  9. Define scale equivariant estimator.
  10. Explain minimax estimation.

 

SECTION – B

 

Answer any FIVE questions                                                                          (5 × 8 = 40 marks)

 

  11. Find the jackknifed estimator of μ² in the case of f(x) = … , x ≥ μ; μ ∈ R.

 

  12. State and establish Basu's theorem.

 

  13. Let X1, X2, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Find the UMRUE of (μ, μ/σ) with respect to any loss function which is convex in its second argument.

 

  14. Let X1, X2, …, Xn be iid U(θ − … , θ + …), θ ∈ R. Find a minimal sufficient statistic and examine whether it is boundedly complete.

 

  15. Given a random sample of size n from N(μ, σ²), μ ∈ R, σ > 0, find the Cramer-Rao lower bound for estimating μ/σ². Compare it with the variance of the UMVUE.

 

  16. State and establish the invariance property of a CAN estimator.

 

  17. Given a random sample from a location family with location parameter ξ, show that δ* is the MREE of ξ with respect to any invariant loss function, where δ₀ is an LEE, δ* = δ₀ − v*, and v* minimizes E₀{ρ(δ₀ − v)} with respect to v.

 

  18. Let X ~ N(θ, 1), θ ∈ R. Find the Bayes estimator of θ with respect to squared error loss if the prior of θ is N(0, 1).

 

 

 

SECTION – C

 

Answer any TWO questions                                                                        (2 × 20 = 40 marks)

 

  19. a) Give an example for each of the following: i) U_g is empty; ii) U_g is a singleton.

 

  b) Let X be DU{1, 2, …, N}, N = 1, 2, 3, 4, … . Find the QA-optimal estimator of (N, N²). (12+8)

  20. a) Show that a vector unbiased estimator is D-optimal if and only if each of its components is a UMVUE.

 

  b) State and establish the Lehmann-Scheffé theorem. (12+8)

 

  21. a) Let X1, X2, …, Xn be iid N(0, σ²), σ > 0. Find the MREE of σ^r with respect to standardized squared error loss.

 

  b) Let (Xi, Yi), i = 1, 2, …, n be a random sample from the ACBVE distribution with pdf

f(x, y) = {(2α + β)(α + β)/2} exp{−α(x + y) − β max(x, y)}, x, y > 0.

Find i) the MLE of (α, β) and ii) examine whether the MLE is consistent. (8+8+4)

 

  22. Write short notes on:
  a) Jackknifing method
  b) Fisher information
  c) Location-scale family.    (10+5+5)

 


Loyola College M.Sc. Statistics April 2006 Estimation Theory Question Paper PDF Download

             LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

SECOND SEMESTER – APRIL 2006

                                                        ST 2808 – ESTIMATION THEORY

 

 

Date & Time : 19-04-2006/FORENOON     Dept. No.                                                       Max. : 100 Marks

 

 

PART – A

 

Answer  ALL  questions.  Each  carries TWO  marks.     (10 x 2 =  20 marks)

 

  1. If the class of unbiased estimators of a parametric function is neither empty nor a singleton, then show that the class is uncountable.
  2. Prove or disprove the uniqueness of the UMVUE.
  3. If δ is a UMVUE and bounded, then show that any polynomial in δ is also a UMVUE.
  4. State the Chapman-Robbins inequality.
  5. Suppose δ is sufficient for P and P₀ ⊂ P; then show that δ is sufficient for P₀.
  6. Let S be a sufficient statistic. If likelihood equivalence of x and y in the support A of the random variable X implies S(x) = S(y) for all x, y ∈ A, then show that S is minimal sufficient.
  7. Let X1, X2 be a random sample from N(θ, 1), θ ∈ R. Verify whether or not (X1, X2) is complete.
  8. Give two examples of a location-scale family of distributions.
  9. Define ancillary statistic and give an example.
  10. Give an example of an M-estimator Tn of θ which can be thought of as a weighted average of the sample values with the weights depending on the data.

 

PART – B

 

Answer  any FIVE  questions.  Each  carries EIGHT marks.     (5 x 8 =  40 marks)

 

  11. Let X be a discrete random variable with pdf p_θ(x) = θ if x = −1 and p_θ(x) = (1 − θ)²θ^x if x = 0, 1, 2, …, where 0 < θ < 1. Find the class U₀ of unbiased estimators of '0' and hence find the class U_g of unbiased estimators of g(θ) = θ, 0 < θ < 1.

  12. Give an example where only constant estimable parametric functions have UMVUEs.

  13. Give an example of a UMVUE whose variance is greater than the Chapman-Robbins lower bound.

  14. Let X1, …, Xn be a random sample of size n from N(θ, 1), θ ∈ R. Using Fisher information, show that Σ αᵢXᵢ is sufficient iff the αᵢ are equal for all i.
  15. Let X1, …, Xn be a random sample of size n from U(0, θ), θ > 0. Then show that S(X1, …, Xn) = X(n) is minimal sufficient.
  16. Show that a complete sufficient statistic is minimal sufficient, if it exists.

 

  17. Let X1, …, Xn be a random sample of size n from B(m, θ), m known and θ unknown. Show that the joint distribution of (X1, …, Xn) belongs to an exponential family. Hence find the mgf of Σ Xᵢ.

  18. Let X ~ N(θ, 1), θ ∈ R, and let the prior distribution of θ be N(0, 1). Find the Bayes estimator of θ when the loss function is
  i) squared error;
  ii) absolute error.

 

 

PART – C

 

Answer  any TWO  questions.  Each  carries TWENTY marks.     (2 x 20 =  40 marks)

 

19(a). State and prove a necessary and sufficient condition for an estimator in the class U_g to be a UMVUE. (10)

19(b). Derive the Chapman-Robbins inequality using the covariance inequality. (10)

 

20(a). Give an example of a family which is not boundedly complete. (10)

20(b). Let X1, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Show that the distribution of (X1, …, Xn) belongs to a two-parameter exponential family. Hence, by using Basu's theorem, establish the independence of X̄ and s². (10)

21(a). Prove that δ* is a D-optimal estimator of g(θ) iff each component of δ* is a UMVUE. (14)

21(b). Let X1, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ > 0. Obtain the jackknife estimator of the variance σ². (6)

 

22(a). State and prove the Lehmann-Scheffé theorem for a convex loss function. (8)

22(b). Let X have the p.d.f. f(x − ξ). If δ is a location equivariant estimator, then show that the bias, risk and variance of δ do not depend on ξ. (6)

22(c). Let X1, …, Xn be a random sample from N(ξ, 1), ξ ∈ R. Find the MRE estimator of ξ when the loss is squared error. (6)

 

 


 

Loyola College M.Sc. Statistics April 2007 Estimation Theory Question Paper PDF Download

     LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

AC 32

SECOND SEMESTER – APRIL 2007

ST 2808/2806/2801 – ESTIMATION THEORY

 

 

 

Date & Time: 17/04/2007 / 1:00 – 4:00 Dept. No.                                              Max. : 100 Marks

 

 

SECTION – A

Answer all the questions                                                                                   (10 x 2 = 20)

 

  1. Explain the problem of point estimation.
  2. Give two examples of loss functions for simultaneous estimation.
  3. If δ is a UMVUE, then show that δ + 2 is also a UMVUE.
  4. Define Fisher information in the multi-parameter case.
  5. Define minimal sufficient statistic.
  6. Give an example of a family of distributions which is not complete.
  7. Give two examples of a scale equivariant estimator.
  8. Let X follow B(1, θ), θ = 0.1, 0.2. Find the MLE of θ.
  9. Given a random sample from DU{1, 2, …, N}, N ∈ I₊, find a consistent estimator of N.
  10. Explain Bayes estimation.

 

SECTION – B

Answer any five questions                                                                                (5 x 8 = 40)

 

  11. If δ₀ is an unbiased estimator of g, show that the class of unbiased estimators of g is {δ₀ + u │ u ∈ U₀}.

  12. Given a random sample from N(μ, σ²), μ ∈ R, σ > 0, find the Cramer-Rao lower bound for estimating σ/μ.

  13. State and establish the Bhattacharyya inequality.
  14. Let X1, X2, …, Xn be a random sample from U(θ − 1, θ + 1), θ ∈ R. Show that (X(1), X(n)) is minimal sufficient but not complete.

  15. State and establish Basu's theorem.
  16. Given a random sample from E(ξ, 1), ξ ∈ R, find the MREE of ξ with respect to i) squared error loss and ii) absolute error loss.
  17. State and prove the theorem providing the MREE of a scale parameter.
  18. Given a random sample from U(0, θ), θ > 0, show that the MLE is not CAN. Suggest a CAN estimator.

 

SECTION – C

Answer any two questions                                                                               (2 x 20 = 40)

 

19 a) State and establish the Cramer-Rao inequality for the multiparameter case.

b) Let X follow DU{1, 2, …, N}, N = 3, 4, … . Find the UMVUE of g(N). Hence find the UMVUE of N.

20 a) Show that an estimator is QA-optimal if and only if it is D-optimal.

b) Given a random sample from E(ξ, τ), ξ ∈ R, τ > 0, find the UMRUE of (ξ, ξ + τ) with respect to any loss function convex in the second argument.

 

21 a) Discuss the problem of equivariant estimation of the percentiles of a location-scale model.

b) Given a random sample of size n from N(μ, τ²), μ ∈ R, τ > 0, find the MREE of (μ + 3τ) with respect to standardized squared error loss.

22 a) State and establish the invariance property of a CAN estimator.

b) Let (Xi, Yi), i = 1, 2, …, n be a random sample from a bivariate distribution with pdf … . Find the MLE of … .


Loyola College M.Sc. Statistics April 2008 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

NO 36

SECOND SEMESTER – APRIL 2008

ST 2808 – ESTIMATION THEORY

 

 

 

Date : 17/04/2008            Dept. No.                                        Max. : 100 Marks

Time : 1:00 – 4:00

SECTION – A                            Answer all the questions                                   (10 x 2 = 20)

  1. Give an example of a parametric function for which an unbiased estimator does not exist.
  2. Define a loss function for the simultaneous estimation problem and give an example.
  3. If δ is a UMVUE, then show that 5δ is also a UMVUE.
  4. Find the Fisher information in the Bernoulli distribution with parameter θ.
  5. Define completeness and bounded completeness.
  6. Given a random sample of size 2 from N(0, σ²), σ > 0, suggest two ancillary statistics.
  7. Give two examples of a location equivariant estimator.
  8. Let X follow E(θ, 1), θ = 0.1, 0.2. Find the MLE of θ.
  9. Define a consistent estimator and give an example.
  10. Explain prior distribution and conjugate family.

 

SECTION – B                                Answer any five questions                      (5 x 8 = 40)

11. Let X follow DU{1, 2, …, N}, N = 2, 3, 4, … . Find the class of unbiased estimators of N.

  12. State and prove the Cramer-Rao inequality for the multiparameter case.
  13. Discuss the importance of the Bhattacharyya inequality with a suitable example.
  14. Let X1, X2, …, Xn be a random sample from N(θ, θ²), θ > 0. Find a minimal sufficient statistic and examine whether it is complete.

  15. Using Basu's theorem, show that the sample mean and the sample variance are independent in the case of N(θ, 1), θ ∈ R.

16. Given a random sample from E(0, τ), τ > 0, find the MREE of τ and τ² with respect to standardized squared error loss.

17. Give an example in which the MREE of a location parameter exists with respect to squared error loss but the UMVUE does not exist.

  18. Let X1, X2, …, Xn be a random sample from B(1, θ), 0 < θ < 1. If the prior distribution is U(0, 1), find the Bayes estimator of θ with respect to the squared error loss.

 

SECTION – C                   Answer any two questions                                       (2 x 20 = 40)

19 a) State and establish the Bhattacharyya inequality.

b) Let X follow DU{1, 2, …, N}, N = 3, 4, … . Find the UMVUE of N using the calculus approach.

20 a) Show that an estimator δ is QA-optimal if and only if each component of δ is a UMVUE.

b) Given a random sample from N(μ, σ²), μ ∈ R, σ > 0, find the UMRUE of (μ, μ/σ) with respect to any loss function convex in the second argument.

21 a) Discuss the problem of equivariant estimation of the scale parameter.

b) Given a random sample of size n from U(ξ, ξ + 1), ξ ∈ R, find the MREE of ξ with respect to standardized squared error loss.

22 a) Give an example of an MLE which is consistent but not CAN.

b) Stating the regularity conditions, show that the likelihood equation estimator is CAN.

 


 

 

Loyola College M.Sc. Statistics April 2009 Estimation Theory Question Paper PDF Download

    LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

YB 36

SECOND SEMESTER – April 2009

ST 2811 / 2808 – ESTIMATION THEORY

 

 

 

Date & Time: 20/04/2009 / 1:00 – 4:00       Dept. No.                                                       Max. : 100 Marks

 

 

SECTION – A                                  Answer all the questions                                   (10 x 2 = 20)

 

01. Give an example of a parametric function for which the unbiased estimator is unique.

02. State any two loss functions for the simultaneous estimation problem.

03. Show that the UMVUE of a parametric function is unique.

04. Define Fisher information for the multiparameter situation.

05. Define bounded completeness and give an example.

06. Given a random sample of size 2 from E(0, σ), σ > 0, suggest two ancillary statistics.

07. Define a scale equivariant estimator and give an example.

08. Let X follow N(θ, 1), θ = 0, 0.1. Find the MLE of θ.

09. If δ is consistent for θ, show that there exist infinitely many consistent estimators of θ.

10. Describe the conjugate family and give an example.

 

SECTION – B                              Answer any five questions                                (5 x 8 = 40)

11. Let X follow DU{1, 2, …, N}, N = 2, 3. Find the class of unbiased estimators of zero. Hence find the class of unbiased estimators of N and N².

12. State the Cramer-Rao inequality for the multiparameter case. Hence find the Cramer-Rao lower bound for estimating σ/μ based on a random sample from N(μ, σ²), μ ∈ R, σ > 0.

13. Discuss the importance of Fisher information in finding a sufficient statistic.

14. Let X1, X2, …, Xn be a random sample from U(0, θ), θ > 0. Find a minimal sufficient statistic and examine whether it is complete.

15. State and establish Basu's theorem.

16. Given a random sample from N(0, τ²), τ > 0, find the MREE of τ² with respect to standardized squared error loss. Is it unbiased?

17. Find the MREE of the location parameter with respect to absolute error loss based on a random sample from E(ξ, 1), ξ ∈ R.

18. Let X1, X2, …, Xn be a random sample from P(θ), θ > 0. If the prior distribution is E(0, 1), find the Bayes estimator of θ with respect to the squared error loss.

 

SECTION – C                              Answer any two questions                                    (2 x 20 = 40)

 

19 a) State and establish any two properties of Fisher information.

b) Let X have the pdf

P(X = x) = (1 − θ)²θ^x, x = 0, 1, …; 0 < θ < 1,
         = θ, x = −1.

Using the calculus approach, examine whether UMVUEs of the following parametric functions exist: i) θ ii) (1 − θ)².

20 a) Show that an estimator δ is D-optimal if and only if each component of δ is a UMVUE.

b) Given a random sample from E(μ, σ), μ ∈ R, σ > 0, find the UMRUE of (μ, μ + σ) with respect to any loss function convex in the second argument.

21 a) Show that the bias and the risk associated with a location equivariant estimator do not depend on the parameter.

b) Show that a location equivariant estimator δ is an MREE if and only if E₀(δu) = 0 for each invariant function u.
22 a) Given a random sample from N(μ, σ²), μ ∈ R, σ > 0, find the maximum likelihood estimator of (μ, σ²). Examine whether it is consistent.

b) Stating the regularity conditions, show that the likelihood equation admits a solution which is consistent.

 

 


Loyola College M.Sc. Statistics April 2010 Estimation Theory Question Paper PDF Download


Loyola College M.Sc. Statistics April 2011 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

SECOND SEMESTER – APRIL 2011

ST 2811 / 2808 – ESTIMATION THEORY

 

 

 

Date : 2/4/2011                Dept. No.                                        Max. : 100 Marks

Time : 1:00 – 4:00

 

SECTION – A

 

Answer all the questions                                                                                           (2×10=20)

  1. Define Minimal Sufficient Statistic.
  2. Define Efficient Estimator.
  3. Define Ancillary Statistic.
  4. State the different approaches to identify a UMVUE.
  5. Define Likelihood Equivalence.
  6. Define D-optimality.
  7. Define Location-Scale Family.
  8. Define Minimum Risk Equivariant Estimator (MREE).
  9. Define CAN estimator.
  10. Define Maximum Likelihood Estimator.

 

SECTION – B

 

Answer any five questions                                                                                        (5×8 = 40)

  11. Obtain the UMVUE of θ(1 − θ) using a random sample of size n drawn from a Bernoulli population with parameter θ.
  12. State and establish the Rao-Blackwell theorem.
  13. State and establish the Neyman-Fisher factorization theorem.
  14. i) Let L be squared error loss; show that the MREE of θ is unique. (4)
      ii) Let X1, X2, …, Xn be a random sample from N(θ, 1). Show that … . (4)
  15. Let δ be an LEE and L be invariant. Show that i) the bias of δ is free of θ and ii) the risk of δ is free of θ. (4+4)
  16. i) State and establish Basu's theorem. ii) Define UMRUE. (6+2)
  17. Determine the MREE of θ in the following cases: i) N(θ, 1), θ ∈ R; ii) E(θ, 1), θ ∈ R.
  18. Let X1, X2, …, Xn be a random sample from a population having pdf … ; obtain the MLE of P(X > 2).

 

SECTION – C

Answer any two questions                                                                                        (2×20 = 40)

  19. i) Establish: if a UMVUE exists for a parametric function Ψ(θ), it has to be essentially unique. (10)
      ii) State and establish the Cramer-Rao inequality for the multi-parameter case and hence deduce the inequality for a single parameter. (10)
  20. Establish: δ* ∈ U_g is D-optimal if and only if each component of δ* is a UMVUE.
  21. i) Let X1, X2, …, Xn be a random sample from N(µ, σ²). Obtain the Cramer-Rao lower bound for estimating i) µ ii) σ² iii) µ + σ iv) σ/µ (a sketch of these bounds follows this section). (16)
      ii) Establish: let T be a sufficient statistic such that T(x) = T(y); then … . (4)
  22. i) Establish: let δ* belong to the class of LEEs. Then δ* is an MREE with respect to squared error if and only if E(δ*u) = 0 for every invariant function u. (10)
      ii) Let X1, X2, …, Xn be a random sample drawn from a normal population with mean θ and variance σ². Find the MLE of θ and σ² when both θ and σ² are unknown. (10)

 

 


Loyola College M.Sc. Statistics April 2012 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

M.Sc. DEGREE EXAMINATION – STATISTICS

SECOND SEMESTER – APRIL 2012

ST 2811 / 2808 – ESTIMATION THEORY

 

 

Date : 17-04-2012             Dept. No.                                        Max. : 100 Marks

Time : 9:00 – 12:00

SECTION – A

Answer all the Questions:                                                                                              (2×10=20 Marks)

  1. State the methods of obtaining a UMVUE.
  2. State the invariance property of the MLE.
  3. State the Neyman-Fisher factorization theorem.
  4. Provide an example to show that an unbiased estimator need not be unique.
  5. Define sufficient statistic and provide an example.
  6. Define Bayesian estimator.
  7. State the use of the Rao-Blackwell theorem.
  8. Define T-optimality.
  9. Describe the large-sample behaviour of the maximum likelihood estimator.
  10. Define best linear unbiased estimator.

SECTION – B

Answer any Five Questions:                                                                                     (5×8=40 Marks)

  11. State and prove the necessary and sufficient condition for an unbiased estimator to be a UMVUE.
  12. State and prove the Cramer-Rao inequality for the multi-parameter case and hence establish the inequality for the case of a single parameter.
  13. State and prove the Neyman-Fisher factorization theorem.
  14. Let X1, X2, …, Xn be a random sample of size n from the uniform distribution U(0, θ) and let Y = max{X1, X2, …, Xn}. Show that … is an unbiased estimator of θ^β, where β is a positive constant.
  15. State and prove the Rao-Blackwell theorem.
  16. Let Y1, Y2, Y3, Y4 be random variables with E(Y1) = E(Y2) = θ1 + θ2 and E(Y3) = E(Y4) = θ1 + θ3. Determine the estimability of the following linear parametric functions: i) 2θ1 + θ2 + θ3; ii) θ3 − θ2; iii) θ1; iv) 3θ1 + θ2 + 2θ3.
  17. Let X1, X2, …, Xn be a random sample of size n from N(μ, σ²). Obtain a 100(1 − α)% confidence interval for σ² using the large-sample behaviour of the MLE.
  18. Find the Bayes estimator of the parameter p of a binomial distribution with X successes out of n trials, given that the prior distribution of p is a beta distribution with parameters α and β.

 

SECTION – C

Answer any two questions:                                                                                           (2×20=40Marks)

 

  19. i) Establish: if a UMVUE exists for a parametric function, it must be essentially unique.
      ii) Obtain the UMVUE of θ(1 − θ) using a random sample of size n from B(1, θ).
  20. i) Let X1, X2, …, Xn be a random sample from N(µ, σ²). Find the Cramer-Rao lower bound for estimating a) µ b) σ² c) µ + σ d) … .
      ii) Define consistent estimator and establish the sufficient condition for consistency.
  21. Establish: δ* ∈ U_g is QA-optimal if and only if each component of δ* is a UMVUE.
  22. i) Let X1, X2, …, Xn be a random sample from N(μ, σ²), μ ∈ R, σ² > 0. Obtain the MLE of (μ, σ²).
      ii) Explain the bootstrap and jackknife methods (see the sketch below).

 

 


 

 

Loyola College B.Sc. Statistics Nov 2003 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

B.Sc., DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – NOVEMBER 2003

ST-5500/STA 505/S 515 – ESTIMATION THEORY

03.11.2003                                                                                                           Max:100 marks

1.00 – 4.00

SECTION-A

Answer ALL questions.                                                                                   (10×2=20 marks)

 

  1. State the problem of point estimation.
  2. Define 'bias' of an estimator in estimating a parametric function.
  3. Define a 'Uniformly Minimum Variance Unbiased Estimator' (UMVUE).
  4. Explain the Cramer-Rao lower bound.
  5. Define completeness and bounded completeness.
  6. Examine if {N(0, σ²), σ² > 0} is complete.
  7. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
  8. Give an example where the MLE is not unique.
  9. Define BLUE.
  10. State the Gauss-Markov theorem.

 

SECTION-B

Answer any FIVE questions.                                                                           (5×8=40 marks)

 

  11. Show that the sample variance is a biased estimator of the population variance. Suggest an UBE of σ².
  12. If Tn is asymptotically unbiased with variance approaching zero as n → ∞, show that Tn is consistent.
  13. Show that the UMVUE is essentially unique.
  14. Show that the family of binomial distributions is complete.
  15. State and establish the Lehmann-Scheffé theorem.
  16. State and prove the Chapman-Robbins inequality.
  17. Give an example where the MLE is not consistent.
  18. Describe the linear model in the Gauss-Markov set-up.

 

SECTION-C

Answer any TWO questions.                                                                           (2×20=40 marks)

 

  19. a) Let X1, X2, …, Xn (n > 1) be a random sample of size n from P(θ), θ > 0. Show that the class of unbiased estimators of θ is uncountable.

b) Let X1, X2, …, Xn denote a random sample of size n from a distribution with pdf

f(x; θ) = e^−(x−θ), x ≥ θ; 0, otherwise.

Show that X(1) is a consistent estimator of θ.                                                          (10+10)

 

  20. a) Obtain the CRLB for estimating θ in the case of f(x; θ) = … , based on a random sample of size n.

b) State and establish the factorization theorem in the discrete case. (8+12)
  21. a) Explain the method of maximum likelihood.

b) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of θ = (μ, σ²). (5+15)

  22. a) Let Y = Aβ + ε be the linear model where E(ε) = 0. Show that a necessary and sufficient condition for the linear function λ′β of the parameters to be linearly estimable is that rank(A) = rank(A augmented with the row λ′).

b) Explain the Bayesian estimation procedure with an example. (10+10)


Loyola College B.Sc. Statistics April 2004 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

B.Sc., DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – APRIL 2004

ST 5500/STA 505/S 515 – ESTIMATION THEORY

03.04.2004                                                                                                           Max:100 marks

1.00 – 4.00

SECTION – A

Answer ALL the questions                                                                          (10 × 2 = 20 marks)

 

  1. Define 'bias' of an estimator in estimating a parametric function.
  2. Explain 'consistent estimator'.
  3. Describe 'efficiency' of an estimator.
  4. Define 'Uniformly Minimum Variance Unbiased Estimator'.
  5. Define the Cramer-Rao lower bound (CRLB).
  6. Explain bounded completeness.
  7. Define complete sufficient statistic.
  8. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
  9. State the Chapman-Robbins inequality.
  10. Give an example where the MLE is not unique.

 

SECTION – B

Answer any FIVE questions.                                                                                   (5 × 8 = 40 marks)

 

  11. Show that the sample variance is a biased estimator of the population variance. Suggest an UBE of σ².
  12. State and derive the Cramer-Rao inequality.
  13. Let T1 and T2 be two unbiased estimators of a parametric function with finite variances. Obtain the best unbiased linear combination of T1 and T2.
  14. State and establish the Rao-Blackwell theorem.
  15. Give an example of a UMVUE which does not take values in the range of the parametric function.
  16. State and prove the Bhattacharyya inequality.
  17. State and prove the invariance property of the MLE.
  18. Describe the method of moments and illustrate with an example.

 

SECTION – C

Answer any TWO questions                                                                       (2 × 20 = 40 marks)

 

  19. a) If Tn is consistent for ψ(θ) and g is continuous, show that g(Tn) is consistent for g(ψ(θ)).

b) Show that the UMVUE is essentially unique. (10+10)

 

  20. a) Give an example to show that bounded completeness does not imply completeness.

b) State and establish the factorization theorem in the discrete case.      (10+10)

 

  21. a) Explain the method of maximum likelihood.

b) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of θ = (μ, σ²).                                                                                                              (5+15)

 

  22. a) Describe the method of minimum chi-square and the method of modified minimum chi-square.

b) Obtain the estimate of p based on a random sample of size n from B(1, p), 0 < p < 1, by the method of i) minimum chi-square and ii) modified minimum chi-square. (10+10)

 


Loyola College B.Sc. Statistics Nov 2004 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI –600 034

B.Sc., DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – NOVEMBER 2004

ST 5500/STA 505/S 515 – ESTIMATION THEORY

25.10.2004                                                                                                           Max:100 marks

9.00 – 12.00 Noon

 

SECTION – A

 

Answer ALL the questions                                                                            (10 × 2 = 20 marks)

 

  1. State the problem of point estimation.
  2. Define 'bias' of an estimator in estimating a parametric function.
  3. Define a 'consistent estimator'.
  4. Define 'efficiency' of an estimator.
  5. Explain 'Uniformly Minimum Variance Unbiased Estimator'.
  6. What is the Cramer-Rao lower bound?
  7. Define 'bounded completeness'.
  8. Examine if {N(0, σ²), σ² > 0} is complete.
  9. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
  10. Explain BLUE.

SECTION – B

 

Answer any FIVE questions.                                                                          (5 × 8 = 40 marks)

 

  11. Show that the sample variance is a biased estimator of the population variance σ². Suggest an UBE of σ².
  12. If Tn is a consistent estimator of φ(θ), show that there exist infinitely many consistent estimators of φ(θ).
  13. State and derive the Cramer-Rao inequality.
  14. Show that the UMVUE is essentially unique.
  15. Give an example to show that bounded completeness does not imply completeness.
  16. Show that the sample mean is a complete sufficient statistic in the case of P(θ), θ > 0.
  17. State and establish the Lehmann-Scheffé theorem.
  18. State and prove the 'invariance property' of the MLE.

 

SECTION – C

 

Answer any TWO questions                                                                          (2 × 20 = 40 marks)

 

 

  19. a) Let f(x; θ) = … , and 0 otherwise. Based on a random sample of size n, suggest an UBE of

  i) θ when σ is known and
  ii) σ when θ is known.          (5+5)

b) Obtain the CRLB for estimating θ in the case of f(x; θ) = … , x ∈ R, θ ∈ R, based on a random sample of size n.                                                                          (10)

  20. a) State and establish the factorization theorem in the discrete case.

b) Obtain a sufficient statistic for θ = (μ, σ²) based on a random sample of size n from N(μ, σ²), μ ∈ R, σ² > 0.                                                                                     (12 + 8)

 

  21. a) Explain the method of maximum likelihood.

b) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of θ = (μ, σ²).                                                                                                           (5 + 15)

 

  22. a) Describe the method of minimum chi-square and the method of modified minimum chi-square.

b) Describe the linear model in the Gauss-Markov set-up.

c) Let Y = Aβ + ε be the linear model where E(ε) = 0. Show that a necessary and sufficient condition for the linear function λ′β of the parameters to be linearly estimable is that rank(A) = rank(A augmented with the row λ′).                                                           (10 + 6 + 4)

 

 

Loyola College B.Sc. Statistics Nov 2006 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

AB 13

FIFTH SEMESTER – NOV 2006

ST 5500 – ESTIMATION THEORY

(Also equivalent to STA 505)

 

 

Date & Time : 25-10-2006/9.00-12.00         Dept. No.                                                       Max. : 100 Marks

 

 

 

Part A

Answer all the questions.                                                                        10 X 2 = 20

 

 

  1. Define bias of an estimator in estimating a parametric function.
  2. Explain efficiency of an estimator.
  3. Explain Uniformly Minimum Variance Unbiased Estimator (UMVUE).
  4. What is the Cramer-Rao lower bound?
  5. Define bounded completeness.
  6. State the Bhattacharyya inequality.
  7. Let X1, X2 denote a random sample of size 2 from B(1, θ), 0 < θ < 1. Show that X1 + 3X2 is sufficient for θ.
  8. Describe sufficient statistic.
  9. Explain Bayes estimation.
  10. What is BLUE?
Part B

Answer any five questions.                                                                           5 X 8 = 40

 

  11. Let X1, X2, …, Xn denote a random sample of size n from B(1, p), 0 < p < 1. Suggest an unbiased estimator of i) p and ii) p(1 − p).
  12. If Tn is asymptotically unbiased with variance approaching zero as n → ∞, then show that Tn is consistent.
  13. State and establish the factorization theorem in the discrete case.
  14. Show that the family of Bernoulli distributions {B(1, p), 0 < p < 1} is complete.
  15. State and establish the Lehmann-Scheffé theorem.
  16. Let X1, X2, …, Xn denote a random sample of size n from a distribution with p.d.f.

f(x; θ) = e^−(x−θ), x ≥ θ, θ ∈ R; 0, otherwise.

Obtain the UMVUE of θ.

  17. Give an example where the MLE is not unique.
  18. Explain the Gauss-Markov model.

 

 

 

 

 

Part C
Answer any two questions.                                                                         2 X 20 = 40

 

  19. a) Let X1, X2, …, Xn denote a random sample of size n from P(θ), θ > 0. Suggest an unbiased estimator of i) θ ii) 5θ + 7.

b) If Tn is a consistent estimator for ψ(θ) and g is continuous, then show that g(Tn) is consistent for g(ψ(θ)).                                                                        (10 +10)

 

  20. a) Show that the UMVUE is essentially unique.

b) Obtain the CRLB for estimating θ in the case of

f(x; θ) = 1 / {π[1 + (x − θ)²]}, −∞ < x < ∞ and −∞ < θ < ∞,

based on a random sample of size n.                                                            (10 +10)

  21. a) State and establish the Chapman-Robbins inequality.

b) Describe the method of moments with an illustration.                            (12 + 8)

  22. a) Let X1, X2, …, Xn denote a random sample of size n from N(μ, σ²). Obtain the MLE of θ = (μ, σ²).

b) Illustrate the method of moments with the help of G(α, p).                   (12 + 8)

 


 

Loyola College B.Sc. Statistics April 2007 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

AC 16

FIFTH SEMESTER – APRIL 2007

ST 5500 – ESTIMATION THEORY

 

 

Date & Time: 27/04/2007 / 1:00 – 4:00          Dept. No.                                                     Max. : 100 Marks

 

 

Part A

Answer all the questions.                                                                  10 X 2 = 20

  1. State the problem of point estimation.
  2. Define asymptotically unbiased estimator.
  3. Define a consistent estimator and give an example.
  4. Explain minimum variance bound estimator.
  5. What is Fisher information?
  6. Write a note on bounded completeness.
  7. Examine if {N(0, σ²), σ² > 0} is complete.
  8. Let X1, X2 denote a random sample of size 2 from P(θ), θ > 0. Show that X1 + 2X2 is not sufficient for θ.
  9. State the Chapman-Robbins inequality.
  10. Explain linear estimation.

 

Part B

Answer any five  questions.                                                              5 X 8 = 40

 

 

  11. Let X1, X2, …, Xn denote a random sample of size n from B(1, θ), 0 < θ < 1. Show that … is an unbiased estimator of θ², where T = … .
  12. If Tn is consistent for ψ(θ) and g is continuous, show that g(Tn) is consistent for g(ψ(θ)).
  13. State and establish the Cramer-Rao inequality.
  14. Show that the family of binomial distributions {B(n, p), 0 < p < 1, n fixed} is complete.
  15. State and establish the Rao-Blackwell theorem.
  16. Let X1, X2, …, Xn denote a random sample of size n from U(0, θ), θ > 0. Obtain the UMVUE of θ.
  17. Give an example for each of the following:
      i) an MLE which is not unbiased;
      ii) an MLE which is not sufficient.
  18. Describe the method of minimum chi-square and the method of modified minimum chi-square.

Part C
Answer any two questions.                                                                     2 X 20 = 40

 

  19. a) Show that the sample variance is a biased estimator of the population variance. Suggest an unbiased estimator of σ².

b) If Tn is asymptotically unbiased with variance approaching zero as n approaches infinity, then show that Tn is consistent.                              (10 + 10)

 

 

  20. a) Let X1, X2, …, Xn denote a random sample of size n from U(θ − 1, θ + 1). Show that the mid-range U = (X(1) + X(n))/2 is an unbiased estimator of θ.

b) Obtain the estimator of p based on a random sample of size n from B(1, p), 0 < p < 1, by the method of

i) minimum chi-square;

ii) modified minimum chi-square.                                                    (12 + 8)

 

  21. a) Give an example to show that bounded completeness does not imply completeness.

b) State and prove the invariance property of the MLE.                                           (10 +10)

  22. a) State and establish the Bhattacharyya inequality.

b) Write short notes on Bayes estimation.                                                    (12 + 8)

 

 

 


Loyola College B.Sc. Statistics Nov 2007 Estimation Theory Question Paper PDF Download


Loyola College B.Sc. Statistics April 2008 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

           B.Sc. DEGREE EXAMINATION – STATISTICS

NO 24

FIFTH SEMESTER – APRIL 2008

ST 5500 – ESTIMATION THEORY

 

 

 

Date : 28-04-08                  Dept. No.                                        Max. : 100 Marks

Time : 1:00 – 4:00

PART-A

 

Answer ALL the questions:                                                             (10×2=20)

 

  1. Define ‘bias’ of an estimator.
  2. When do you say an estimator is consistent?
  3. Define a sufficient statistic.
  4. What do you mean by bounded completeness?
  5. Describe method of moments in estimation.
  6. State invariance property of maximum likelihood estimator.
  7. Define Loss function and give an example.
  8. Explain ‘prior distribution’ and ‘posterior distribution’.
  9. Explain least square estimation.
  10. Mention any two properties of least squares estimator.

 

PART-B

Answer any FIVE questions:                                                           (5×8=40)

 

  11. If Tn is asymptotically unbiased with variance approaching 0 as n → ∞, then show that Tn is consistent.
  12. Show that … is an unbiased estimate of …, based on a random sample drawn from … .
  13. Let … be a random sample of size n from … population. Examine if … is complete.
  14. State and prove the Rao-Blackwell theorem.
  15. Estimate … by the method of moments in the case of Pearson's Type III distribution with p.d.f. … .
  16. State and establish the Bhattacharyya inequality.
  17. Describe the method of modified minimum chi-square.
  18. Write a note on Bayes estimation.

 

PART-C

Answer any TWO questions:                                                           (2×20=40)

  19. a) … and … is a random sample of size 3 from a population with mean value … and variance … . T1, T2, T3 are the estimators used to estimate the mean value …, where … and … .
  i) Are T1 and T2 unbiased estimators?
  ii) Find the value of … such that T3 is an unbiased estimator.
  iii) With this value of …, is T3 a consistent estimator?
  iv) Which is the best estimator?
  b) If … are random observations on a Bernoulli variate X taking the value 1 with probability p and the value 0 with probability (1 − p), show that … is a consistent estimator of p(1 − p).

  20. a) State and prove the Cramer-Rao inequality.
  b) Given the probability density function … , show that the Cramer-Rao lower bound of the variance of an unbiased estimator of … is 2/n, where n is the size of the random sample from this distribution. [12+8]

  21. a) State and prove the Lehmann-Scheffé theorem.
  b) Obtain the MLE of θ in … based on an independent sample of size n. Examine whether this estimate is sufficient for θ.                   [12+8]

  22. a) Show that a necessary and sufficient condition for the linear parametric function λ′β to be linearly estimable is that rank(A) = rank(…), where … and … .
  b) Describe the Gauss-Markov model. [12+8]

 

 


Loyola College B.Sc. Statistics Nov 2008 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

BA 10

 

FIFTH SEMESTER – November 2008

ST 5500 – ESTIMATION THEORY

 

 

 

Date : 03-11-08                     Dept. No.                                        Max. : 100 Marks

Time : 9:00 – 12:00

 

SECTION – A

Answer ALL questions.                                                                              10 X 2 = 20

  1. Define consistent estimator. Give an example.
  2. Give two examples of an unbiased estimator.
  3. Define UMVUE.
  4. Describe the concept of bounded completeness.
  5. Describe the method of minimum chi-square estimation.
  6. State the MLE of λ based on a random sample of size n from a Poisson distribution with parameter λ.
  7. Describe the concept of Bayes estimation.
  8. Define loss function.
  9. Describe the method of least squares.
  10. Define BLUE.

SECTION – B

Answer Any FIVE questions.                                                                                       5 X 8 = 40

  11. Derive an unbiased estimator of …, based on a random sample of size n from B(1, θ).
  12. Let {Tn, n = 1, 2, 3, …} be a sequence of estimators such that … and … . Then show that Tn is consistent for … .
  13. If X1, X2, …, Xn is a random sample from P(λ), λ > 0, then show that … is a sufficient statistic for λ.
  14. Show that the family of Bernoulli distributions {B(1, θ), 0 < θ < 1} is complete.
  15. Describe estimation of parameters by the method of maximum likelihood.
  16. Describe any two properties of the MLE, with examples.
  17. Explain prior and posterior distributions.
  18. Derive the least squares estimator of β1 under the model Y = β0 + β1X + ε.

 

SECTION – C

Answer any TWO questions.                                                                          2 X 20 = 40

  19. a. State and prove the Chapman-Robbins inequality. [12]

b. Using the factorization theorem, derive a sufficient statistic for μ based on a random sample of size n from N(μ, 1), μ ∈ R.                                                                      [8]

  20. a. State and prove a necessary and sufficient condition for an unbiased estimator to be a UMVUE.                                                                                                                      [15]

b. If T1 and T2 are UMVUEs of ψ1(θ) and ψ2(θ) respectively, then show that T1 + T2 is the UMVUE of ψ1(θ) + ψ2(θ).                                                                                                                [5]

 

 

21 a. Explain the concept of estimation by the method of modified minimum chi-square.  [8]

b. Let X1, X2, …, Xn be a random sample from a distribution with density function f(x, θ) = … . Find the maximum likelihood estimator of θ and examine whether it is consistent.  [12]

22. Explain: i) Risk function ii) Method of moments iii) Completeness iv) Gauss-Markov model.                           [4 x 5]

 


Loyola College B.Sc. Statistics April 2009 Estimation Theory Question Paper PDF Download

           LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

YB 24

FIFTH SEMESTER – April 2009

ST 5500 – ESTIMATION THEORY

 

 

 

Date & Time: 16/04/2009 / 9:00 – 12:00      Dept. No.                                                Max. : 100 Marks

 

 

PART A                                Answer all the questions                             [10×2=20]

 

  1. Define the efficiency of an unbiased estimator. Give an example of a most efficient estimator.
  2. State the invariance property of consistent estimators.
  3. State the factorization theorem on sufficient statistics.
  4. Show that the exponential distribution with parameter λ is complete.
  5. List any two small-sample properties of the ML estimator.
  6. Find the ML estimator of θ in random sampling of size n from a population whose pdf is

f(x, θ) = e^−(x−θ), for x > θ; 0, otherwise.

  7. Define loss function. Is it a random variable? Justify.
  8. When do we say that a statistic is Bayesian sufficient? Give an example.
  9. Write down the normal equations of a simple linear regression model.
  10. Mention the uses of the Gauss-Markov model.

PART B                                            Answer any FIVE questions                       [5×8=40]

  11. Show that the sample variance is a consistent estimator of the population variance.
  12. If X follows a binomial distribution with parameters n and p, examine the asymptotic unbiasedness of T = … .
  13. State and prove the Rao-Blackwell theorem.
  14. Let (X1, X2, …, Xn) be a random sample from a Poisson population with parameter λ. Use the Lehmann-Scheffé theorem to obtain a UMVUE of λ.
  15. Obtain the moment estimators of the parameters of a two-parameter gamma distribution.
  16. Illustrate the invariance property of the ML estimator through an example.
  17. Explain the method of modified chi-square estimation.
  18. State and prove the Gauss-Markov theorem on the BLUE.

 

PART C                                            Answer any TWO questions                       [2×20=40]

  1. (a) State and prove Cramer Rao inequality in one parameter regular case. When

does the equality hold good?

(b) Establish a sufficient condition for a biased estimator to become a consistent

estimator.

  1. (a) State and prove Lehman Scheffe theorem on UMVUE

(b) Obtain a joint sufficient statistic of the parameters of the bi-variate normal

population.

  1. (a) Derive the moment estimators of the parameters of two parameter uniform

distribution.

(b)            Derive the ML estimators of the parameters of normal distribution by solving

simultaneous equations.

  1. (a) Establish a necessary and sufficient condition for a linear parametric function to

be estimable.

(b) Let (X1, X2, X3, …Xn) is a random sample of size n from Bernoulli population.

Obtain the Bayesian estimator of the parameter by taking a suitable prior

distribution..

 

 


 

Loyola College B.Sc. Statistics April 2009 Testing of Hypothesis Question Paper PDF Download

       LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS

YB 25

FIFTH SEMESTER – April 2009

ST 5501 – TESTING OF HYPOTHESIS

 

 

 

Date & Time: 17/04/2009 / 9:00 – 12:00       Dept. No.                                                       Max. : 100 Marks

 

 

SECTION – A

 

Answer ALL questions.                                                                              10 X 2 = 20

 

  1. Define best critical region.
  2. What is the power of a test?
  3. Define the monotone likelihood ratio property.
  4. Define a UMP level α test.
  5. Describe the stopping rule in SPRT.
  6. State any two properties of likelihood ratio tests.
  7. Write the 95% confidence interval for the population proportion based on a random sample of size n (n > 30).
  8. Describe the assumptions of the t-test for testing the equality of means of two independent samples.
  9. State any two assumptions of non-parametric tests.
  10. Describe a test based on the F-distribution.

 

 

 SECTION – B

Answer Any FIVE questions.                                                                                       5 X 8 = 40

 

  11. Let X have a Poisson distribution with λ ∈ {2, 4}. To test the null hypothesis H0: λ = 2 against the alternative simple hypothesis H1: λ = 4, let the critical region be {X1 : X1 ≤ 3}, where X1 is a random sample of size one. Find the power of the test.

 

  12. Based on a random sample of size n (n ≥ 30), construct a 95% confidence interval for the population mean.

 

  13. If X1, X2, …, Xn is a random sample from B(1, θ), θ ∈ (0, 1), derive the uniformly most powerful test for testing H0: θ = θ0 against H1: θ > θ0.

  14. Let X1, X2, …, Xn be a random sample from a binomial distribution with parameter θ. Show that the distribution has a monotone likelihood ratio in the statistic Y = Σ Xᵢ.

 

  15. Describe the procedure of the sequential probability ratio test.

  16. Based on a random sample of size n from B(1, θ), 0 < θ < 1, derive the SPRT for testing H0: θ = θ0 against the alternative hypothesis H1: θ = θ1 at level α = 0.05.

 

  17. Differentiate parametric and non-parametric testing procedures.

  18. Explain the Kolmogorov-Smirnov one-sample test.

 

SECTION – C

 

Answer any TWO questions.                                                                    2 X 20 = 40

 

  19. a. State and prove the Neyman-Pearson theorem.       [10]

b. Based on a random sample of size n from a distribution with pdf

f(x, θ) = … , 0 < x < 1; 0, otherwise,

find the best critical region for testing the null hypothesis H0: θ = 2 against the alternative simple hypothesis H1: θ = 3.                                                         [10]

  20. Based on a random sample of size n from U(0, θ), derive the likelihood ratio test for testing H0: … against the alternative hypothesis H1: … .

  21. a. Describe the procedure of testing H0: … based on a random sample of size n, using Wilcoxon's statistic.                                                                    [10]

b. In SPRT, under standard notations, prove that … and … .

  22. Explain: i) Sign test for location ii) Level of significance iii) Test of equality of two variances iv) Randomized test.      [4 x 5]

 

 

 


Loyola College B.Sc. Statistics Nov 2010 Estimation Theory Question Paper PDF Download

LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

   B.Sc. DEGREE EXAMINATION – STATISTICS

FIFTH SEMESTER – NOVEMBER 2010

ST 5504/ST 5500 – ESTIMATION THEORY

 

 

 

Date : 29-10-10                     Dept. No.                                        Max. : 100 Marks

Time : 9:00 – 12:00

PART – A

Answer ALL the questions                                                                                                              [10×2=20]

 

  1. Define an unbiased estimator and give an example.
  2. Define the consistency of an estimator.
  3. When is an estimator called most efficient estimator?   
  4. Define UMVUE.
  5. State any two regularity conditions.
  6. State the invariance property of ML estimator.
  7. Explain ‘posterior’ distribution.
  8. Define Bayes estimator.
  9. State the normal equations associated with the simple regression model.
  10. Define Best Linear unbiased estimator.

PART – B

 Answer any FIVE questions                                                                                                            [5×8=40]

  11. If X1, X2, …, Xn are random observations on a Bernoulli variate X taking the value 1 with probability p and the value 0 with probability (1 − p), show that … is a consistent estimator of p(1 − p).

  12. If X1, X2, …, Xn are random observations from a Poisson population with parameter λ, obtain the UMVUE for λ using the Lehmann-Scheffé theorem.

  13. State and prove the factorization theorem (Neyman).
  14. Write short notes on the method of minimum chi-square estimation.
  15. Estimate α and p for the following distribution by the method of moments: … .

  16. Let X1, X2, …, Xn be a random sample of size n from a uniform population with p.d.f.

f(x; θ) = 1, θ − 1/2 ≤ x ≤ θ + 1/2, −∞ < θ < ∞.

Obtain the MLE of θ.

  17. State and prove the Gauss-Markov theorem.
  18. Explain the method of least squares with an illustration.

 

 

PART – C

Answer any TWO questions                                                                                                     [2×20=40]

  19. a) Obtain the Cramer-Rao lower bound for an unbiased estimator of … if f(x, θ) = … .

b) Establish the Chapman-Robbins inequality.

  20. a) State and prove the Rao-Blackwell theorem and mention its importance.

b) Let X1, X2, X3, X4 be iid from {f_θ, θ ∈ Θ}, where each Xi, i = 1, 2, 3, 4, has mean μ and variance σ². Find the minimum variance unbiased estimator among the following:

T1(x) = X1;

T2(x) = (X1 + X2)/2;

T3(x) = (X1 + 2X2 + 3X3)/6;

T4(x) = (X1 + X2 + X3 + X4)/4.

  21. a) Show that the UMVUE is unique.

b) State and prove a sufficient condition for the consistency of an estimator.

  22. a) Establish a necessary and sufficient condition for a linear parametric function to be estimable.

b) In sampling from {b(1, θ), 0 < θ < 1}, obtain the Bayes estimator of θ taking a suitable prior distribution.

 

 


Loyola College B.Sc. Statistics April 2011 Estimation Theory Question Paper PDF Download

 

 

 

 

 

ST 5504 / ST 5500 – ESTIMATION THEORY

 

Section A

Answer all the questions                                                                            10×2=20  

 

 

1. Define an Unbiased Estimator.
2. Define consistency of an estimator. Mention any one of its properties.
3. What is the importance of the factorization theorem?
4. Define a Sufficient Statistic.
5. Briefly explain the method of moment estimation.
6. Explain prior distributions and posterior distributions with reference to Bayesian estimation.
7. List out any two properties of Maximum Likelihood estimators.
8. Define the Squared Error Loss function.
9. Define Best Linear Unbiased Estimation. Give an example.
10. Write down the normal equations of a simple linear regression model.

 

Section B

Answer any five questions                                                                    5×8=40 

 

  11. Let X1, X2, …, Xn be a random sample taken from a normal population with unknown mean and unknown variance. Examine the unbiasedness and consistency of T(x) = Σ(Xi − X̄)².
  12. State and prove a sufficient condition for an estimator to be consistent.


13. Let X1, X2, …, Xn be a random sample taken from a population whose probability density function is

f(x, θ) = … exp{− …}, θ > 0, x > 0.

Use the factorization theorem to obtain a sufficient statistic for θ.

 

  14. Show that the Poisson distribution is complete.

  15. Explain the method of minimum chi-square estimation.

  16. Obtain the maximum likelihood estimators of the parameters of a normal distribution.

  17. X1, X2, X3 and X4 are four independent Poisson random variables with mean λ. Define

T1(x) = (1/4)(X1 + 3X3),

T2(x) = (1/6)(X1 + 2X2 + 3X3),

T3(x) = (1/4)(X1 + X2 + X3 + X4).

Examine their unbiasedness and compute their variances. Which one is the best estimator among the three? Find the efficiency of T1(x) and T2(x) with respect to T3(x).

  18. Explain in detail the Gauss-Markov model.

 

 

 


 

 

Section C

Answer any two questions                                                               2×20=40 

 

19. (a) State and prove the Chapman-Robbins inequality. Bring out its importance.

(b) Let X1, X2, …, Xn be a random sample taken from a normal population with unknown mean and unit variance. Obtain the Cramer-Rao lower bound for an unbiased estimator of the mean.                                                 (12 + 8)

20. (a) Show that the UMVUE for a parametric function is unique.

(b) State and prove the Rao-Blackwell theorem.                            (10 + 10)

21. (a) Obtain the moment estimators for the uniform distribution U(a, b).

(b) Show that the maximum likelihood estimator need not be unique, with an example. Also show that when the MLE is unique, it is a function of the sufficient statistic.                                                                                       (10 + 10)

22. (a) Let X1, X2, …, Xn be a random sample from the Bernoulli distribution b(1, θ). Obtain the Bayes estimator for θ by taking a suitable prior.

(b) State and prove a necessary and sufficient condition for a parametric function to be linearly estimable.                                                     (10 + 10)

 
