Show unbiased estimator

A question is of this type if and only if it asks for a proof that a given estimator is unbiased, by showing that E(estimator) equals the parameter.

6 questions

OCR S4 2007 June Q7
7 The continuous random variable \(X\) has a uniform distribution over the interval \([ 0 , \theta ]\) so that the probability density function is given by $$f ( x ) = \begin{cases} \frac { 1 } { \theta } & 0 \leqslant x \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$ where \(\theta\) is a positive constant. A sample of \(n\) independent observations of \(X\) is taken and the sample mean is denoted by \(\bar { X }\).
  1. The estimator \(T _ { 1 }\) is defined by \(T _ { 1 } = 2 \bar { X }\). Show that \(T _ { 1 }\) is an unbiased estimator of \(\theta\). It is given that the probability density function of the largest value, \(U\), in the sample is $$g ( u ) = \begin{cases} \frac { n u ^ { n - 1 } } { \theta ^ { n } } & 0 \leqslant u \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$
  2. Find \(\mathrm { E } ( U )\) and show that \(\operatorname { Var } ( U ) = \frac { n \theta ^ { 2 } } { ( n + 1 ) ^ { 2 } ( n + 2 ) }\).
  3. The estimator \(T _ { 2 }\) is defined by \(T _ { 2 } = \frac { n + 1 } { n } U\). Given that \(T _ { 2 }\) is also an unbiased estimator of \(\theta\), show that \(T _ { 2 }\) is a more efficient estimator than \(T _ { 1 }\) for \(n > 1\).
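The comparison in this question can be illustrated with a short Monte Carlo sketch (an addition, not part of the original paper; the values `theta = 3.0` and `n = 5` are illustrative choices):

```python
import random

# For X ~ U[0, theta], T1 = 2*Xbar and T2 = (n+1)/n * max(sample) should
# both average out to theta, with T2 showing the smaller variance for n > 1.
random.seed(0)
theta, n, reps = 3.0, 5, 20000

t1_vals, t2_vals = [], []
for _ in range(reps):
    xs = [random.uniform(0, theta) for _ in range(n)]
    t1_vals.append(2 * sum(xs) / n)
    t2_vals.append((n + 1) / n * max(xs))

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(round(mean(t1_vals), 2), round(mean(t2_vals), 2))  # both close to theta = 3
print(var(t2_vals) < var(t1_vals))                       # T2 is more efficient
```

The sample variances should land near the theoretical values \(\operatorname{Var}(T_1) = \theta^2/(3n)\) and \(\operatorname{Var}(T_2) = \theta^2/\bigl(n(n+2)\bigr)\).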
OCR MEI S4 2007 June Q1
1 The random variable \(X\) has the continuous uniform distribution with probability density function $$\mathrm { f } ( x ) = \frac { 1 } { \theta } , \quad 0 \leqslant x \leqslant \theta$$ where \(\theta ( \theta > 0 )\) is an unknown parameter.
A random sample of \(n\) observations from \(X\) is denoted by \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\), with sample mean \(\bar { X } = \frac { 1 } { n } \sum _ { i = 1 } ^ { n } X _ { i }\).
  1. Show that \(2 \bar { X }\) is an unbiased estimator of \(\theta\).
  2. Evaluate \(2 \bar { X }\) for a case where, with \(n = 5\), the observed values of the random sample are 0.4, 0.2, 1.0, 0.1, 0.6. Hence comment on a disadvantage of \(2 \bar { X }\) as an estimator of \(\theta\). For a general random sample of size \(n\), let \(Y\) represent the sample maximum, \(Y = \max \left( X _ { 1 } , X _ { 2 } , \ldots , X _ { n } \right)\). You are given that the probability density function of \(Y\) is $$g ( y ) = \frac { n y ^ { n - 1 } } { \theta ^ { n } } , \quad 0 \leqslant y \leqslant \theta$$
  3. An estimator \(k Y\) is to be used to estimate \(\theta\), where \(k\) is a constant to be chosen. Show that the mean square error of \(k Y\) is $$k ^ { 2 } \mathrm { E } \left( Y ^ { 2 } \right) - 2 k \theta \mathrm { E } ( Y ) + \theta ^ { 2 }$$ and hence find the value of \(k\) for which the mean square error is minimised.
  4. Comment on whether \(k Y\) with the value of \(k\) found in part (iii) suffers from the disadvantage identified in part (ii).
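A numeric sketch of the minimisation in part (iii) (an addition, not from the paper; `theta = 2.0` and `n = 4` are illustrative choices, and the moments used are the standard results \(\mathrm{E}(Y) = n\theta/(n+1)\), \(\mathrm{E}(Y^2) = n\theta^2/(n+2)\) for this pdf):

```python
# The MSE k^2*E(Y^2) - 2k*theta*E(Y) + theta^2 is a quadratic in k,
# minimised at its vertex k = theta*E(Y)/E(Y^2) = (n+2)/(n+1).
theta, n = 2.0, 4
ey = n * theta / (n + 1)
ey2 = n * theta ** 2 / (n + 2)

def mse(k):
    return k ** 2 * ey2 - 2 * k * theta * ey + theta ** 2

k_star = theta * ey / ey2   # vertex of the quadratic
print(abs(k_star - (n + 2) / (n + 1)) < 1e-9)                     # matches (n+2)/(n+1)
print(mse(k_star) < min(mse(k_star - 0.01), mse(k_star + 0.01)))  # a genuine minimum
```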
OCR S4 2017 June Q6
6 The continuous random variable \(Z\) has probability density function $$f ( z ) = \begin{cases} \frac { 4 z ^ { 3 } } { k ^ { 4 } } & 0 \leqslant z \leqslant k \\ 0 & \text { otherwise } \end{cases}$$ where \(k\) is a parameter whose value is to be estimated.
  1. Show that \(\frac { 5 Z } { 4 }\) is an unbiased estimator of \(k\).
  2. Find the variance of \(\frac { 5 Z } { 4 }\). The parameter \(k\) can also be estimated by making observations of a random variable \(X\) which has mean \(\frac { 1 } { 2 } k\) and variance \(\frac { 1 } { 12 } k ^ { 2 }\). Let \(Y = X _ { 1 } + X _ { 2 } + X _ { 3 }\) where \(X _ { 1 } , X _ { 2 }\) and \(X _ { 3 }\) are independent observations of \(X\).
  3. \(c Y\) is also an unbiased estimator of \(k\). Find the value of \(c\).
  4. For the value of \(c\) found in part (iii), determine which of \(\frac { 5 Z } { 4 }\) and \(c Y\) is the more efficient estimator of \(k\).
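The efficiency comparison in this question can be verified exactly with rational arithmetic (an addition, not part of the paper; variances are expressed in units of \(k^2\)):

```python
from fractions import Fraction as F

# Moments of Z for f(z) = 4z^3/k^4 on [0, k], in units of k and k^2.
ez  = F(4, 5)                    # E(Z)/k
ez2 = F(2, 3)                    # E(Z^2)/k^2
var_z = ez2 - ez ** 2            # Var(Z)/k^2 = 2/75

var_5z4 = F(25, 16) * var_z      # Var(5Z/4)/k^2 = 1/24
c = F(2, 3)                      # c*E(Y) = c*(3k/2) = k  =>  c = 2/3
var_cy = c ** 2 * 3 * F(1, 12)   # Var(cY)/k^2 = (4/9)*(3k^2/12)/k^2 = 1/9
print(var_5z4, var_cy, var_5z4 < var_cy)  # 1/24 1/9 True: 5Z/4 is more efficient
```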
OCR S4 2009 June Q6
6 The continuous random variable \(X\) has probability density function given by $$\mathrm { f } ( x ) = \begin{cases} 0 & x < a , \\ \mathrm { e } ^ { - ( x - a ) } & x \geqslant a , \end{cases}$$ where \(a\) is a constant. \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\) are \(n\) independent observations of \(X\), where \(n \geqslant 4\).
  1. Show that \(\mathrm { E } ( X ) = a + 1\).
    \(T _ { 1 }\) and \(T _ { 2 }\) are proposed estimators of \(a\), where $$T _ { 1 } = X _ { 1 } + 2 X _ { 2 } - X _ { 3 } - X _ { 4 } - 1 \quad \text { and } \quad T _ { 2 } = \frac { X _ { 1 } + X _ { 2 } } { 4 } + \frac { X _ { 3 } + X _ { 4 } + \ldots + X _ { n } } { 2 ( n - 2 ) } - 1 .$$
  2. Show that \(T _ { 1 }\) and \(T _ { 2 }\) are unbiased estimators of \(a\).
  3. Determine which is the more efficient estimator.
  4. Suggest another unbiased estimator of \(a\) using all of the \(n\) observations.
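A simulation sketch of this question (an addition, not part of the paper): the pdf is that of \(X = a + W\) with \(W \sim \operatorname{Exp}(1)\), so \(\mathrm{E}(X) = a + 1\) and \(\operatorname{Var}(X) = 1\), giving \(\operatorname{Var}(T_1) = 7\) and \(\operatorname{Var}(T_2) = \tfrac18 + \tfrac{1}{4(n-2)}\). The values `a = 2.0` and `n = 6` are illustrative choices.

```python
import random

random.seed(1)
a, n, reps = 2.0, 6, 40000
t1_vals, t2_vals = [], []
for _ in range(reps):
    xs = [a + random.expovariate(1.0) for _ in range(n)]
    t1_vals.append(xs[0] + 2 * xs[1] - xs[2] - xs[3] - 1)
    t2_vals.append((xs[0] + xs[1]) / 4 + sum(xs[2:]) / (2 * (n - 2)) - 1)

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(round(mean(t1_vals), 1), round(mean(t2_vals), 1))  # both close to a = 2
print(var(t1_vals) > var(t2_vals))  # T2 is the more efficient estimator
```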
WJEC Further Unit 5 2023 June Q2
2. The random variables \(X\) and \(Y\) are independent, with \(X\) having mean \(\mu\) and variance \(\sigma ^ { 2 }\), and \(Y\) having mean \(\mu\) and variance \(k \sigma ^ { 2 }\), where \(k\) is a positive constant. Let \(\bar { X }\) denote the mean of a random sample of 20 observations of \(X\), and let \(\bar { Y }\) denote the mean of a random sample of 25 observations of \(Y\).
  1. Given that \(T _ { 1 } = \frac { 3 \bar { X } + 7 \bar { Y } } { 10 }\), show that \(T _ { 1 }\) is an unbiased estimator for \(\mu\).
  2. Given that \(T _ { 2 } = \frac { \bar { X } + a ^ { 2 } \bar { Y } } { 1 + a } , a > 0\), and \(T _ { 2 }\) is an unbiased estimator for \(\mu\), prove that \(a = 1\).
  3. Find and simplify expressions for the variances of \(T _ { 1 }\) and \(T _ { 2 }\).
  4. Show that the value of \(k\) for which \(T _ { 1 }\) and \(T _ { 2 }\) are equally good estimators is \(\frac { 5 } { 6 }\).
  5. Given that \(T _ { 3 } = ( 1 - \lambda ) \bar { X } + \lambda \bar { Y }\), find an expression for \(\lambda\), in terms of \(k\), for which \(T _ { 3 }\) has the smallest possible variance.
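A quick numerical check of part 5 (an addition, not part of the original question): differentiating \(\operatorname{Var}(T_3) = (1-\lambda)^2\sigma^2/20 + \lambda^2 k\sigma^2/25\) gives the closed form \(\lambda = 5/(5 + 4k)\), which a grid search reproduces. The values `k = 0.8` and `s2 = 1.0` are illustrative choices (\(\sigma^2\) cancels in the minimiser).

```python
k, s2 = 0.8, 1.0

def var_t3(l):
    # Var(T3) for T3 = (1 - l)*Xbar + l*Ybar with the sample sizes 20 and 25.
    return (1 - l) ** 2 * s2 / 20 + l ** 2 * k * s2 / 25

l_formula = 5 / (5 + 4 * k)
l_grid = min((i / 10000 for i in range(10001)), key=var_t3)
print(abs(l_formula - l_grid) < 1e-3)  # grid search agrees with the closed form
```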
WJEC Further Unit 5 2024 June Q5
5. The probability density function of the continuous random variable \(X\) is given by $$f ( x ) = \begin{cases} \frac { 3 x ^ { 2 } } { \alpha ^ { 3 } } & \text { for } 0 \leqslant x \leqslant \alpha \\ 0 & \text { otherwise. } \end{cases}$$ \(\bar { X }\) is the mean of a random sample of \(n\) observations of \(X\).
    1. Show that \(U = \frac { 4 \bar { X } } { 3 }\) is an unbiased estimator for \(\alpha\).
    2. If \(\alpha\) is an integer, what is the smallest value of \(n\) that gives a rational value for the standard error of \(U\)?
  1. \(\bar { X } _ { 1 }\) and \(\bar { X } _ { 2 }\) are the means of independent random samples of \(X\), each of size \(n\). The estimator \(V = 4 \bar { X } _ { 1 } - \frac { 8 } { 3 } \bar { X } _ { 2 }\) is also an unbiased estimator for \(\alpha\).
    1. Show that \(\frac { \operatorname { Var } ( U ) } { \operatorname { Var } ( V ) } = \frac { 1 } { 13 }\).
    2. Hence state, with a reason, which of \(U\) or \(V\) is the better estimator.
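The stated ratio \(\operatorname{Var}(U)/\operatorname{Var}(V) = 1/13\) can be confirmed exactly with rational arithmetic (an addition, not part of the paper): for this pdf, \(\mathrm{E}(X) = 3\alpha/4\) and \(\operatorname{Var}(X) = 3\alpha^2/80\), so \(\operatorname{Var}(\bar{X}) = 3\alpha^2/(80n)\).

```python
from fractions import Fraction as F

# Work in units of alpha^2/n, so Var(Xbar) = 3/80.
var_xbar = F(3, 80)
var_u = F(4, 3) ** 2 * var_xbar                 # Var(4*Xbar/3) = 1/15
var_v = (F(4) ** 2 + F(8, 3) ** 2) * var_xbar   # independent samples, so variances add
print(var_u / var_v)                            # 1/13
```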