WJEC Further Unit 5 2023 June — Question 2 (19 marks)

Exam board: WJEC
Module: Further Unit 5
Year: 2023
Session: June
Marks: 19
Topic: Moment generating functions
Type: Show unbiased estimator
Difficulty: Standard (+0.3). This is a straightforward application of standard properties of expectation and variance for linear combinations of sample means. Parts (a)–(d) involve routine algebraic manipulation of E(T) and Var(T) formulas, while part (e) requires basic calculus (differentiation) to minimise the variance. All techniques are standard textbook exercises for Further Maths statistics, with no novel insight required.
Spec: 5.04a Linear combinations — E(aX+bY), Var(aX+bY); 5.05b Unbiased estimates of population mean and variance

2. The random variables \(X\) and \(Y\) are independent, with \(X\) having mean \(\mu\) and variance \(\sigma ^ { 2 }\), and \(Y\) having mean \(\mu\) and variance \(k \sigma ^ { 2 }\), where \(k\) is a positive constant. Let \(\bar { X }\) denote the mean of a random sample of 20 observations of \(X\), and let \(\bar { Y }\) denote the mean of a random sample of 25 observations of \(Y\).
  (a) Given that \(T_1 = \frac{3\bar{X} + 7\bar{Y}}{10}\), show that \(T_1\) is an unbiased estimator for \(\mu\).
  (b) Given that \(T_2 = \frac{\bar{X} + a^2\bar{Y}}{1 + a}\), \(a > 0\), and \(T_2\) is an unbiased estimator for \(\mu\), prove that \(a = 1\).
  (c) Find and simplify expressions for the variances of \(T_1\) and \(T_2\).
  (d) Show that the value of \(k\) for which \(T_1\) and \(T_2\) are equally good estimators is \(\frac{5}{6}\).
  (e) Given that \(T_3 = (1 - \lambda)\bar{X} + \lambda\bar{Y}\), find an expression for \(\lambda\), in terms of \(k\), for which \(T_3\) has the smallest possible variance.
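Every part rests on two standard results about sample means and linear combinations, which the mark scheme cites but does not state. For reference (this summary is an editorial addition, not printed on the WJEC paper):

```latex
% For the mean \bar{W} of a random sample of n observations of W:
\[
  \mathrm{E}(\bar{W}) = \mathrm{E}(W), \qquad
  \mathrm{Var}(\bar{W}) = \frac{\mathrm{Var}(W)}{n}.
\]
% For independent U, V and constants a, b:
\[
  \mathrm{E}(aU + bV) = a\,\mathrm{E}(U) + b\,\mathrm{E}(V), \qquad
  \mathrm{Var}(aU + bV) = a^2\,\mathrm{Var}(U) + b^2\,\mathrm{Var}(V).
\]
```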

| Part | Answer | Marks | Guidance |
|---|---|---|---|
| (a) | $E(T_1) = \frac{3E(\bar{X}) + 7E(\bar{Y})}{10}$ | M1 | |
| | $E(T_1) = \frac{3\mu + 7\mu}{10}$ | | |
| | $E(T_1) = \mu$, therefore $T_1$ is an unbiased estimator for $\mu$. | A1 | Convincing |
| (b) | $E(T_2) = \frac{E(\bar{X}) + a^2E(\bar{Y})}{1 + a}$ | | |
| | To be an unbiased estimator for $\mu$: $\frac{\mu + a^2\mu}{1 + a} = \mu$ | M1 | Forming an equation in $\mu$. si |
| | $1 + a^2 = 1 + a$ | A1 | oe |
| | $a = 0$ or $a = 1$. $a$ is positive $\therefore a = 1$ (so $T_2 = \frac{\bar{X}+\bar{Y}}{2}$) | A1 | Must reject $a = 0$. If M0, then SC1 for verification only |
| (c) | $\text{Var}(T_1) = \frac{3^2 \times \text{Var}(\bar{X}) + 7^2 \times \text{Var}(\bar{Y})}{10^2}$ | M1 | Use of $\text{Var}(cW) = c^2\,\text{Var}(W)$ |
| | $\text{Var}(T_1) = \frac{9 \times \frac{\sigma^2}{20} + 49 \times \frac{k\sigma^2}{25}}{100}$ | M1 | Use of $\text{Var}(\bar{W}) = \text{Var}(W)/n$ |
| | $\text{Var}(T_1) = \frac{45\sigma^2 + 196k\sigma^2}{10000} = \frac{\sigma^2}{10000}(45 + 196k)$ | A1 | oe, cao. $\text{Var}(T_1) = \frac{9\sigma^2}{2000} + \frac{49k\sigma^2}{2500}$ |
| | $\text{Var}(T_2) = \frac{1}{4}\left(\text{Var}(\bar{X}) + \text{Var}(\bar{Y})\right)$ | M1 | |
| | $\text{Var}(T_2) = \frac{1}{4}\left(\frac{\sigma^2}{20} + \frac{k\sigma^2}{25}\right)$ | M1 | |
| | $\text{Var}(T_2) = \frac{\sigma^2}{400}(5 + 4k)$ | A1 | oe $\text{Var}(T_2) = \frac{\sigma^2}{80} + \frac{k\sigma^2}{100}$. If left in terms of $a$: $\text{Var}(T_2) = \frac{\sigma^2(5 + 4a^4k)}{100(1 + a)^2}$ |
| (d) | $\frac{\sigma^2}{400}(5 + 4k) = \frac{45\sigma^2 + 196k\sigma^2}{10000}$ | M1 | M1 for setting their $\text{Var}(T_1) = \text{Var}(T_2)$ |
| | $\frac{10000}{400}(5 + 4k) = 45 + 196k$ or $25(5 + 4k) = 45 + 196k$ | m1 | Forming an equation in $k$ |
| | $125 + 100k = 45 + 196k$ | | |
| | $k = \frac{5}{6}$ | A1 | Convincing. *ag |
| (e) | $V = \text{Var}(T_3) = (1 - \lambda)^2 \times \text{Var}(\bar{X}) + \lambda^2 \times \text{Var}(\bar{Y})$ | B1 | cao |
| | $V = (1 - \lambda)^2 \times \frac{\sigma^2}{20} + \lambda^2 \times \frac{k\sigma^2}{25}$ | | |
| | $\frac{dV}{d\lambda} = \frac{-2(1-\lambda)\sigma^2}{20} + \frac{2\lambda k\sigma^2}{25}$ | M1 | M1 for expression for $\frac{dV}{d\lambda}$. At least 1 term correct |
| | Smallest variance is when $\frac{dV}{d\lambda} = 0$ | M1 | M1 for setting $\frac{dV}{d\lambda} = 0$ and attempting to solve |
| | $\frac{2\lambda k\sigma^2}{25} = \frac{2(1-\lambda)\sigma^2}{20}$ | | |
| | $\lambda k = \frac{5}{4}(1 - \lambda)$ | | |
| | $\lambda k + \frac{5\lambda}{4} = \frac{5}{4}$ | | |
| | $\lambda\left(\frac{4k + 5}{4}\right) = \frac{5}{4}$ | | |
| | $\lambda = \frac{5}{4k + 5}$ | A1 | cao |
| | $\frac{d^2V}{d\lambda^2} = \frac{\sigma^2}{10} + \frac{2k\sigma^2}{25} > 0$, therefore a minimum. | E1 | E1 for verifying minimum, oe method |
| | **Total [19]** | | |
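The closed forms in the scheme can be sanity-checked numerically. The sketch below is an editorial addition, not part of the WJEC scheme; the choices $\sigma^2 = 1$ and $k = 2$ (for part (e)) are arbitrary illustrations, and exact rational arithmetic avoids rounding noise.

```python
# Sanity check of the mark-scheme results using exact rational arithmetic.
# sigma^2 = 1 and k = 2 below are arbitrary illustrative choices.
from fractions import Fraction as F

sigma2 = F(1)  # every variance scales linearly in sigma^2, so 1 loses nothing

def var_T1(k):
    # T1 = (3*Xbar + 7*Ybar)/10, with Var(Xbar) = sigma^2/20, Var(Ybar) = k*sigma^2/25
    return (9 * sigma2 / 20 + 49 * k * sigma2 / 25) / 100

def var_T2(k):
    # T2 = (Xbar + Ybar)/2, after part (b) forces a = 1
    return (sigma2 / 20 + k * sigma2 / 25) / 4

def var_T3(lam, k):
    # T3 = (1 - lambda)*Xbar + lambda*Ybar
    return (1 - lam) ** 2 * sigma2 / 20 + lam ** 2 * k * sigma2 / 25

# Part (c): the simplified forms in the scheme match the definitions.
for k in (F(1, 2), F(5, 6), F(2)):
    assert var_T1(k) == sigma2 * (45 + 196 * k) / 10000
    assert var_T2(k) == sigma2 * (5 + 4 * k) / 400

# Part (d): the two variances coincide exactly at k = 5/6.
assert var_T1(F(5, 6)) == var_T2(F(5, 6))

# Part (e): lambda = 5/(4k + 5) beats nearby values of lambda.
k = F(2)
lam_star = F(5) / (4 * k + 5)
for lam in (lam_star - F(1, 100), lam_star + F(1, 100)):
    assert var_T3(lam_star, k) < var_T3(lam, k)

print("all checks passed")
```

Because $V(\lambda)$ is a positive quadratic in $\lambda$, the strict inequality at neighbouring points is consistent with the second-derivative check in the scheme.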

---
*WJEC Further Unit 5 2023 Q2 [19]*