Showing an estimator is unbiased

Questions that ask you to prove or show that a given estimator \(T\) is unbiased by showing \(\mathrm{E}(T) = \theta\), often involving linear combinations of sample observations.

5 questions · Standard +0.7

Topics: 5.03a Continuous random variables: pdf and cdf · 5.03b Solve problems: using pdf · 5.05b Unbiased estimates: of population mean and variance
OCR S4 2018 June Q7
15 marks Challenging +1.2
7 Two independent observations \(X _ { 1 }\) and \(X _ { 2 }\) are made of a continuous random variable with probability density function $$f ( x ) = \begin{cases} \frac { 1 } { \theta } & 0 \leqslant x \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$ where \(\theta\) is a parameter whose value is to be estimated.
  1. Find \(\mathrm { E } ( X )\).
  2. Show that \(S _ { 1 } = X _ { 1 } + X _ { 2 }\) is an unbiased estimator of \(\theta\).
  \(L\) is the larger of \(X _ { 1 }\) and \(X _ { 2 }\), or their common value if they are equal.
  3. Show that the probability density function of \(L\) is \(\frac { 2 l } { \theta ^ { 2 } }\) for \(0 \leqslant l \leqslant \theta\).
  4. Find \(\mathrm { E } ( L )\).
  5. Find an unbiased estimator \(S _ { 2 }\) of \(\theta\), based on \(L\).
  6. Determine which of the two estimators \(S _ { 1 }\) and \(S _ { 2 }\) is the more efficient.
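The claims in parts 2, 5 and 6 can be checked numerically. The sketch below (a minimal Python Monte Carlo illustration, not part of the question; names and sample sizes are my own) draws pairs from \(U(0, \theta)\) and confirms that both \(S_1 = X_1 + X_2\) and \(S_2 = \tfrac{3}{2}L\) average to \(\theta\), with \(S_2\) having the smaller variance:

```python
import random

# Monte Carlo check: X1, X2 ~ U(0, theta).
# E(X) = theta/2, so S1 = X1 + X2 has E(S1) = theta.
# E(L) = 2*theta/3 (pdf 2l/theta^2), so S2 = (3/2)*L is also unbiased.
random.seed(0)
theta = 4.0
n = 200_000

s1, s2 = [], []
for _ in range(n):
    x1 = random.uniform(0, theta)
    x2 = random.uniform(0, theta)
    s1.append(x1 + x2)            # S1 = X1 + X2
    s2.append(1.5 * max(x1, x2))  # S2 = (3/2) * L

mean_s1 = sum(s1) / n
mean_s2 = sum(s2) / n
var_s1 = sum((v - mean_s1) ** 2 for v in s1) / n  # theory: theta^2/6
var_s2 = sum((v - mean_s2) ** 2 for v in s2) / n  # theory: theta^2/8
```

Both sample means land near \(\theta = 4\), and `var_s2` comes out below `var_s1`, matching the theoretical variances \(\theta^2/6\) and \(\theta^2/8\) and hence part 6's conclusion that \(S_2\) is the more efficient estimator.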
Edexcel S4 2018 June Q6
19 marks Challenging +1.2
  1. The continuous random variable \(X\) has probability density function \(\mathrm { f } ( x )\)
$$f ( x ) = \begin{cases} \frac { x } { 2 \theta ^ { 2 } } & 0 \leqslant x \leqslant 2 \theta \\ 0 & \text { otherwise } \end{cases}$$ where \(\theta\) is a constant.
  1. Use integration to show that \(\mathrm { E } \left( X ^ { N } \right) = \frac { 2 ^ { N + 1 } } { N + 2 } \theta ^ { N }\)
  2. Hence
    1. write down an expression for \(\mathrm { E } ( X )\) in terms of \(\theta\)
    2. find \(\operatorname { Var } ( X )\) in terms of \(\theta\)
  A random sample \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\), where \(n \geqslant 2\), is taken to estimate the value of \(\theta\). The random variable \(S _ { 1 } = q \bar { X }\) is an unbiased estimator of \(\theta\).
  3. Write down the value of \(q\) and show that \(S _ { 1 }\) is a consistent estimator of \(\theta\).
  The continuous random variable \(Y\) is independent of \(X\) and is uniformly distributed over the interval \(\left[ 0 , \frac { 2 \theta } { 3 } \right]\), where \(\theta\) is the same unknown constant as in \(\mathrm { f } ( x )\). The random variable \(S _ { 2 } = a X + b Y\) is an unbiased estimator of \(\theta\) and is based on one observation of \(X\) and one observation of \(Y\).
  4. Find the value of \(a\) and the value of \(b\) for which \(S _ { 2 }\) has minimum variance.
  5. Show that the minimum variance of \(S _ { 2 }\) is \(\frac { \theta ^ { 2 } } { 11 }\)
  6. Explain which of \(S _ { 1 }\) or \(S _ { 2 }\) is the better estimator for \(\theta\)
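Parts 4 and 5 can be verified by simulation. The sketch below (an illustrative Python check, not part of the question; the coefficients \(a = 6/11\), \(b = 9/11\) are the minimum-variance solution of \(4a + b = 3\)) samples \(X\) by inverse CDF, since \(F(x) = x^2/(4\theta^2)\) gives \(x = 2\theta\sqrt{u}\), and confirms \(\mathrm{E}(S_2) = \theta\) and \(\operatorname{Var}(S_2) = \theta^2/11\):

```python
import math
import random

# E(X) = 4*theta/3 and E(Y) = theta/3, so unbiasedness requires 4a + b = 3;
# minimising Var(S2) = a^2 * 2*theta^2/9 + b^2 * theta^2/27 subject to that
# constraint gives a = 6/11, b = 9/11.
random.seed(1)
theta = 3.0
n = 100_000
a, b = 6 / 11, 9 / 11

vals = []
for _ in range(n):
    x = 2 * theta * math.sqrt(random.random())  # inverse CDF: F(x) = x^2/(4 theta^2)
    y = random.uniform(0, 2 * theta / 3)        # Y ~ U[0, 2*theta/3]
    vals.append(a * x + b * y)

mean = sum(vals) / n
var = sum((v - mean) ** 2 for v in vals) / n    # theory: theta^2/11
```

The simulated mean sits near \(\theta = 3\) and the simulated variance near \(\theta^2/11 \approx 0.818\), as part 5 requires.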
Edexcel S4 Q2
11 marks Moderate -0.3
The value of orders, in £, made to a firm over the internet has distribution N(\(\mu, \sigma^2\)). A random sample of \(n\) orders is taken and \(\bar{X}\) denotes the sample mean.
  1. Write down the mean and variance of \(\bar{X}\) in terms of \(\mu\) and \(\sigma^2\). [2]
A second sample of \(m\) orders is taken and \(\bar{Y}\) denotes the mean of this sample. An estimator of the population mean is given by $$U = \frac{n\bar{X} + m\bar{Y}}{n + m}$$
  2. Show that \(U\) is an unbiased estimator for \(\mu\). [3]
  3. Show that the variance of \(U\) is \(\frac{\sigma^2}{n + m}\). [4]
  4. State which of \(\bar{X}\) or \(U\) is a better estimator for \(\mu\). Give a reason for your answer. [2]
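The pooled estimator's properties are easy to confirm numerically. This short Python sketch (illustrative only; \(\mu\), \(\sigma\), \(n\), \(m\) are arbitrary values I have chosen) simulates both samples repeatedly and checks that \(U\) averages to \(\mu\) with variance \(\sigma^2/(n+m)\):

```python
import random

# U = (n*Xbar + m*Ybar)/(n + m) pools the two samples; since every
# observation has mean mu, E(U) = mu, and independence gives
# Var(U) = (n^2 * s^2/n + m^2 * s^2/m) / (n + m)^2 = s^2 / (n + m).
random.seed(2)
mu, sigma = 50.0, 10.0
n, m = 8, 12
trials = 40_000

u_vals = []
for _ in range(trials):
    xbar = sum(random.gauss(mu, sigma) for _ in range(n)) / n
    ybar = sum(random.gauss(mu, sigma) for _ in range(m)) / m
    u_vals.append((n * xbar + m * ybar) / (n + m))

mean = sum(u_vals) / trials
var = sum((u - mean) ** 2 for u in u_vals) / trials  # theory: sigma^2/(n+m) = 5
```

Since \(\sigma^2/(n+m) = 100/20 = 5\) is smaller than \(\operatorname{Var}(\bar{X}) = \sigma^2/n = 12.5\), the simulation also illustrates why \(U\) is the better estimator in the final part.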
WJEC Further Unit 5 2019 June Q2
6 marks Standard +0.3
The continuous random variable \(X\) is uniformly distributed over the interval \((\theta - 1, \theta + 5)\), where \(\theta\) is an unknown constant.
  1. Find the mean and the variance of \(X\). [2]
  2. Let \(\overline{X}\) denote the mean of a random sample of 9 observations of \(X\). Find, in terms of \(\overline{X}\), an unbiased estimator for \(\theta\) and determine its standard error. [4]
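Since \(X \sim U(\theta - 1, \theta + 5)\) has \(\mathrm{E}(X) = \theta + 2\) and \(\operatorname{Var}(X) = 6^2/12 = 3\), the estimator is \(\overline{X} - 2\) with standard error \(\sqrt{3/9} = 1/\sqrt{3}\). A quick Python simulation (illustrative only; \(\theta = 7\) is an arbitrary test value) bears this out:

```python
import math
import random

# X ~ U(theta - 1, theta + 5): E(X) = theta + 2, Var(X) = 36/12 = 3.
# With a sample of 9 observations, Xbar - 2 is unbiased for theta and
# has standard error sqrt(3/9) = 1/sqrt(3) ~ 0.577.
random.seed(3)
theta = 7.0
trials = 50_000

est = []
for _ in range(trials):
    xbar = sum(random.uniform(theta - 1, theta + 5) for _ in range(9)) / 9
    est.append(xbar - 2)  # unbiased estimator of theta

mean = sum(est) / trials
se = math.sqrt(sum((e - mean) ** 2 for e in est) / trials)  # theory: 1/sqrt(3)
```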
WJEC Further Unit 5 2019 June Q8
18 marks Challenging +1.2
The random variable \(X\) has probability density function $$f(x) = 1 + \frac{3\lambda x}{2} \quad \text{for } -\frac{1}{2} \leqslant x \leqslant \frac{1}{2},$$ $$f(x) = 0 \quad \text{otherwise,}$$ where \(\lambda\) is an unknown parameter such that \(-1 \leqslant \lambda \leqslant 1\).
  1.
     1. Find \(\mathrm{E}(X)\) in terms of \(\lambda\).
     2. Show that \(\text{Var}(X) = \frac{16 - 3\lambda^2}{192}\). [6]
  2. Show that \(\mathrm{P}(X > 0) = \frac{8 + 3\lambda}{16}\). [2]
In order to estimate \(\lambda\), \(n\) independent observations of \(X\) are made. The number of positive observations obtained is denoted by \(Y\) and the sample mean is denoted by \(\overline{X}\).
  3.
     1. Identify the distribution of \(Y\).
     2. Show that \(T_1\) is an unbiased estimator for \(\lambda\), where $$T_1 = \frac{16Y}{3n} - \frac{8}{3}.$$ [4]
  4.
     1. Show that \(\text{Var}(T_1) = \frac{64 - 9\lambda^2}{9n}\).
     2. Given that \(T_2\) is also an unbiased estimator for \(\lambda\), where $$T_2 = 8\overline{X},$$ find an expression for \(\text{Var}(T_2)\) in terms of \(\lambda\) and \(n\).
     3. Hence, giving a reason, determine which is the better estimator, \(T_1\) or \(T_2\). [6]
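The comparison of \(T_1\) and \(T_2\) can be checked by simulation. The sketch below (an illustrative Python check, not part of the question; \(\lambda = 0.5\) and the sample sizes are my own choices) draws from \(f\) by rejection sampling and confirms that both estimators average to \(\lambda\), with \(T_2 = 8\overline{X}\) the lower-variance one — consistent with \(\operatorname{Var}(T_1) - \operatorname{Var}(T_2) = \frac{16}{9n} > 0\):

```python
import random

# f(x) = 1 + 3*lam*x/2 on [-1/2, 1/2]; E(X) = lam/8, so T2 = 8*Xbar is
# unbiased. P(X > 0) = (8 + 3*lam)/16, so Y ~ Bin(n, (8+3*lam)/16) and
# T1 = 16Y/(3n) - 8/3 is also unbiased.
random.seed(4)
lam = 0.5
n = 100
trials = 5_000
M = 1 + 3 * abs(lam) / 4  # upper bound on f over [-1/2, 1/2]

def sample_x():
    # Rejection sampling: propose uniformly, accept with probability f(x)/M.
    while True:
        x = random.uniform(-0.5, 0.5)
        if random.random() * M <= 1 + 1.5 * lam * x:
            return x

t1_vals, t2_vals = [], []
for _ in range(trials):
    xs = [sample_x() for _ in range(n)]
    y = sum(1 for x in xs if x > 0)          # number of positive observations
    t1_vals.append(16 * y / (3 * n) - 8 / 3)
    t2_vals.append(8 * sum(xs) / n)

m1 = sum(t1_vals) / trials
m2 = sum(t2_vals) / trials
v1 = sum((t - m1) ** 2 for t in t1_vals) / trials  # theory: (64 - 9*lam^2)/(9n)
v2 = sum((t - m2) ** 2 for t in t2_vals) / trials  # theory: (16 - 3*lam^2)/(3n)
```

With \(\lambda = 0.5\) and \(n = 100\), the theoretical variances are \(\approx 0.0686\) for \(T_1\) and \(\approx 0.0508\) for \(T_2\), so the simulation reproduces the final part's conclusion that \(T_2\) is the better estimator.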