# Edexcel FS2 2020 June — Question 4 (7 marks)

- Exam board: Edexcel
- Module: FS2 (Further Statistics 2)
- Year: 2020
- Session: June
- Marks: 7
- Topic: Probability Generating Functions
- Type: Moment generating function problems
- Difficulty: Challenging (+1.2). This is a standard unbiased estimator question requiring straightforward application of E(X) and Var(X) for binomial distributions. Part (a) involves simple linearity of expectation, and part (b) requires comparing variances algebraically, a routine technique in FS2. The algebra is mechanical rather than requiring insight, though students must recognize that 'better' means lower variance.
- Spec: 5.05b Unbiased estimates of population mean and variance

4 A biased coin has a probability \(p\) of landing on heads, where \(0 < p < 1\). Simon spins the coin \(n\) times and the random variable \(X\) represents the number of heads. Taruni spins the coin \(m\) times, where \(m \neq n\), and the random variable \(Y\) represents the number of heads. Simon and Taruni want to combine their results to find unbiased estimators of \(p\).
Simon proposes the estimator \(S = \frac{X + Y}{m + n}\) and Taruni proposes \(T = \frac{1}{2}\left[\frac{X}{n} + \frac{Y}{m}\right]\).
  1. Show that both \(S\) and \(T\) are unbiased estimators of \(p\).
  2. Prove that, for all values of \(m\) and \(n\), \(S\) is the better estimator.

# Question 4:

## Part (a):
| Answer/Working | Marks | Guidance |
|---|---|---|
| $X \sim B(n,p)$ so $E(X)=np$ and $Y \sim B(m,p)$ so $E(Y)=mp$ | M1 | For selecting correct models for $X$ and $Y$ |
| $E(S) = \frac{E(X+Y)}{n+m} = \frac{np+mp}{n+m} = p$ so $S$ is unbiased | M1 | For using these models to show that either $S$ or $T$ is unbiased |
| $E(T) = \frac{1}{2}\left[\frac{E(X)}{n}+\frac{E(Y)}{m}\right] = \frac{1}{2}\left[\frac{np}{n}+\frac{mp}{m}\right] = \frac{1}{2}\times 2p = p$ so $T$ is unbiased | A1cso | For correctly showing that both are unbiased |
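The unbiasedness shown above can be sanity-checked with a short Monte Carlo simulation. This sketch is illustrative only (not part of the mark scheme); the values $n = 10$, $m = 25$, $p = 0.3$ are arbitrary choices.

```python
import numpy as np

def estimate_means(n, m, p, trials=200_000, seed=0):
    """Simulate both estimators and return their sample means."""
    rng = np.random.default_rng(seed)
    X = rng.binomial(n, p, trials)  # Simon's head counts: X ~ B(n, p)
    Y = rng.binomial(m, p, trials)  # Taruni's head counts: Y ~ B(m, p)
    S = (X + Y) / (n + m)           # Simon's estimator
    T = 0.5 * (X / n + Y / m)       # Taruni's estimator
    return S.mean(), T.mean()

s_bar, t_bar = estimate_means(n=10, m=25, p=0.3)
print(s_bar, t_bar)  # both sample means settle near p = 0.3
```

Both sample means land close to 0.3, consistent with $E(S) = E(T) = p$.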

## Part (b):
| Answer/Working | Marks | Guidance |
|---|---|---|
| $\text{Var}(S) = \frac{np(1-p)+mp(1-p)}{(n+m)^2} = \frac{p(1-p)}{n+m}$ | M1 | For a correct attempt at $\text{Var}(S)$ or $\text{Var}(T)$ (need not be simplified) |
| $\text{Var}(T) = \frac{1}{4}\left[\frac{np(1-p)}{n^2}+\frac{mp(1-p)}{m^2}\right] = \frac{p(1-p)(m+n)}{4nm}$ | A1 | For both correct variances |
| $\text{Var}(S) < \text{Var}(T) \Rightarrow \frac{p(1-p)}{n+m} < \frac{p(1-p)(m+n)}{4mn} \Leftrightarrow 4mn < (n+m)^2$ | M1 | For a correct inequality in $m$ and $n$ and a first step to clear denominators |
| $\Leftrightarrow 0 < m^2+2mn+n^2-4mn \Leftrightarrow 0 < (m-n)^2$, which holds since $m \neq n$ | A1cso | For a correct proof and conclusion |
| So $S$ always has the smaller variance and is the better estimator | | |
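The closed forms derived above can also be compared numerically. A minimal sketch (illustrative values, not part of the mark scheme) that evaluates $\text{Var}(S)$ and $\text{Var}(T)$ for a few pairs $(n, m)$:

```python
def variances(n, m, p):
    """Closed-form variances from part (b):
    Var(S) = p(1-p)/(n+m), Var(T) = p(1-p)(m+n)/(4mn)."""
    var_s = p * (1 - p) / (n + m)
    var_t = p * (1 - p) * (m + n) / (4 * m * n)
    return var_s, var_t

# The ratio Var(T)/Var(S) = (m+n)^2 / (4mn) >= 1, with equality only when m = n,
# so Var(S) < Var(T) whenever m != n.
for n, m in [(10, 25), (5, 50), (99, 100)]:
    var_s, var_t = variances(n, m, p=0.3)
    print(n, m, var_s < var_t, (m + n) ** 2 / (4 * m * n))
```

Note how the ratio approaches 1 as $m$ and $n$ get close (e.g. $n = 99$, $m = 100$), matching the fact that equality would require $m = n$.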

---
*Edexcel FS2 2020 Q4 [7]*