Showing an estimator is unbiased

Questions that ask candidates to prove or show that a given estimator \(T\) is unbiased by verifying that \(\mathrm{E}(T) = \theta\), often for estimators built from linear combinations of sample observations.
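For orientation (an editorial note, not part of either question): an estimator \(T\) of \(\theta\) is unbiased when $$\mathrm{E}(T) = \theta \quad \text{for every admissible value of } \theta.$$ For a linear combination \(T = a_1 X_1 + \dots + a_n X_n\) of observations with common mean \(\mathrm{E}(X_i) = \mu(\theta)\), linearity of expectation gives \(\mathrm{E}(T) = \mu(\theta) \sum a_i\), so the check reduces to solving \(\mu(\theta) \sum a_i = \theta\) for the coefficients.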

2 questions

OCR S4 2018 June Q7
Two independent observations \(X_1\) and \(X_2\) are made of a continuous random variable with probability density function $$f(x) = \begin{cases} \frac{1}{\theta} & 0 \leqslant x \leqslant \theta \\ 0 & \text{otherwise} \end{cases}$$ where \(\theta\) is a parameter whose value is to be estimated.
  1. Find \(\mathrm { E } ( X )\).
  2. Show that \(S _ { 1 } = X _ { 1 } + X _ { 2 }\) is an unbiased estimator of \(\theta\).
  \(L\) is the larger of \(X_1\) and \(X_2\), or their common value if they are equal.
  3. Show that the probability density function of \(L\) is \(\frac { 2 l } { \theta ^ { 2 } }\) for \(0 \leqslant l \leqslant \theta\).
  4. Find \(\mathrm { E } ( L )\).
  5. Find an unbiased estimator \(S _ { 2 }\) of \(\theta\), based on \(L\).
  6. Determine which of the two estimators \(S _ { 1 }\) and \(S _ { 2 }\) is the more efficient.
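A sketch of the key calculations for this question, derived from the given pdf (an editorial outline, not the official mark scheme): $$\mathrm{E}(X) = \int_0^\theta \frac{x}{\theta}\,\mathrm{d}x = \frac{\theta}{2}, \qquad \mathrm{E}(S_1) = \mathrm{E}(X_1) + \mathrm{E}(X_2) = \theta,$$ so \(S_1\) is unbiased. Since \(X_1\) and \(X_2\) are independent, \(\mathrm{P}(L \leqslant l) = \left(\frac{l}{\theta}\right)^2\), and differentiating gives the stated pdf \(\frac{2l}{\theta^2}\). Then $$\mathrm{E}(L) = \int_0^\theta \frac{2l^2}{\theta^2}\,\mathrm{d}l = \frac{2\theta}{3},$$ so \(S_2 = \frac{3}{2}L\) is unbiased. For efficiency, \(\operatorname{Var}(S_1) = 2 \cdot \frac{\theta^2}{12} = \frac{\theta^2}{6}\), while \(\operatorname{Var}(L) = \frac{\theta^2}{2} - \frac{4\theta^2}{9} = \frac{\theta^2}{18}\) gives \(\operatorname{Var}(S_2) = \frac{9}{4} \cdot \frac{\theta^2}{18} = \frac{\theta^2}{8} < \frac{\theta^2}{6}\), so \(S_2\) is the more efficient estimator.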
Edexcel S4 2018 June Q6
The continuous random variable \(X\) has probability density function \(\mathrm{f}(x)\) given by $$\mathrm{f}(x) = \begin{cases} \frac{x}{2\theta^2} & 0 \leqslant x \leqslant 2\theta \\ 0 & \text{otherwise} \end{cases}$$ where \(\theta\) is a constant.
  1. Use integration to show that \(\mathrm { E } \left( X ^ { N } \right) = \frac { 2 ^ { N + 1 } } { N + 2 } \theta ^ { N }\)
  2. Hence
    1. write down an expression for \(\mathrm { E } ( X )\) in terms of \(\theta\)
    2. find \(\operatorname{Var}(X)\) in terms of \(\theta\).

  A random sample \(X_1, X_2, \ldots, X_n\), where \(n \geqslant 2\), is taken to estimate the value of \(\theta\). The random variable \(S_1 = q\bar{X}\) is an unbiased estimator of \(\theta\).
  3. Write down the value of \(q\) and show that \(S_1\) is a consistent estimator of \(\theta\).

  The continuous random variable \(Y\) is independent of \(X\) and is uniformly distributed over the interval \(\left[0, \frac{2\theta}{3}\right]\), where \(\theta\) is the same unknown constant as in \(\mathrm{f}(x)\). The random variable \(S_2 = aX + bY\) is an unbiased estimator of \(\theta\) based on one observation of \(X\) and one observation of \(Y\).
  4. Find the value of \(a\) and the value of \(b\) for which \(S _ { 2 }\) has minimum variance.
  5. Show that the minimum variance of \(S _ { 2 }\) is \(\frac { \theta ^ { 2 } } { 11 }\)
  6. Explain which of \(S _ { 1 }\) or \(S _ { 2 }\) is the better estimator for \(\theta\)
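A sketch of the key calculations for this question, using the result of part 1 (an editorial outline, not the official mark scheme): putting \(N = 1\) and \(N = 2\) gives \(\mathrm{E}(X) = \frac{4\theta}{3}\) and \(\mathrm{E}(X^2) = 2\theta^2\), so \(\operatorname{Var}(X) = 2\theta^2 - \frac{16\theta^2}{9} = \frac{2\theta^2}{9}\). Unbiasedness forces \(q \cdot \frac{4\theta}{3} = \theta\), so \(q = \frac{3}{4}\), and $$\operatorname{Var}(S_1) = \frac{9}{16} \cdot \frac{2\theta^2}{9n} = \frac{\theta^2}{8n} \to 0 \text{ as } n \to \infty,$$ which together with unbiasedness gives consistency. For \(S_2\): \(\mathrm{E}(Y) = \frac{\theta}{3}\) and \(\operatorname{Var}(Y) = \frac{1}{12}\left(\frac{2\theta}{3}\right)^2 = \frac{\theta^2}{27}\), so unbiasedness requires \(4a + b = 3\), and independence gives $$\operatorname{Var}(S_2) = a^2 \cdot \frac{2\theta^2}{9} + b^2 \cdot \frac{\theta^2}{27} = \frac{\theta^2}{27}\left(6a^2 + b^2\right).$$ Substituting \(b = 3 - 4a\) and minimising \(22a^2 - 24a + 9\) gives \(a = \frac{6}{11}\), \(b = \frac{9}{11}\), and \(\operatorname{Var}(S_2) = \frac{\theta^2}{27} \cdot \frac{297}{121} = \frac{\theta^2}{11}\). Both estimators are unbiased, and for \(n \geqslant 2\), \(\operatorname{Var}(S_1) = \frac{\theta^2}{8n} \leqslant \frac{\theta^2}{16} < \frac{\theta^2}{11}\), so \(S_1\) is the better estimator.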