| Exam Board | OCR |
|---|---|
| Module | S4 (Statistics 4) |
| Year | 2018 |
| Session | June |
| Marks | 15 |
| Paper | Download PDF ↗ |
| Mark scheme | Download PDF ↗ |
| Topic | Discrete Random Variables |
| Type | Showing estimator is unbiased |
| Difficulty | Challenging (+1.2). A multi-part S4 question on estimation theory requiring standard techniques: finding expectations from pdfs, showing unbiasedness, deriving the pdf of a maximum order statistic, and comparing efficiency via variance. It involves several steps and the order-statistic derivation requires care, but all techniques are standard for Further Maths S4 with no novel insights needed; moderately above average difficulty. |
| Spec | 5.03a Continuous random variables: pdf and cdf; 5.03b Solve problems using pdf; 5.05b Unbiased estimates of population mean and variance |
## Question 7(i):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $\frac{\theta}{2}$ | B1, [1] | |
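For reference, the B1 value follows from a one-line integral over the uniform pdf (a sketch of working, not part of the mark scheme):

```latex
\mathrm{E}(X) = \int_0^{\theta} x \cdot \frac{1}{\theta}\,dx
             = \left[\frac{x^2}{2\theta}\right]_0^{\theta}
             = \frac{\theta}{2}
```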
## Question 7(ii):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $E(S) = E(X_1 + X_2) = 2E(X) = 2\left(\frac{\theta}{2}\right) = \theta$ | B1, [1] | Must be sufficient working |
## Question 7(iii):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $F_L(l) = P(L \leq l) = P(X_1 \leq l) \times P(X_2 \leq l) = \frac{l^2}{\theta^2}$ | M1A1 | |
| $f_L(l) = \frac{2l}{\theta^2}$ | M1A1, [4] | Must be evidence of differentiation for M1 |
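The M1A1 pair rests on the uniform cdf; a sketch of the intermediate step the scheme assumes:

```latex
F_X(x) = \int_0^{x} \frac{1}{\theta}\,dt = \frac{x}{\theta}, \quad 0 \leqslant x \leqslant \theta,
\qquad\text{so by independence}\qquad
F_L(l) = F_X(l)^2 = \frac{l^2}{\theta^2},
\qquad
f_L(l) = \frac{\mathrm{d}}{\mathrm{d}l}\,F_L(l) = \frac{2l}{\theta^2}.
```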
## Question 7(iv):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $\dfrac{2\theta}{3}$ | B1 [1] | Must be from $\left[\dfrac{2l^3}{3\theta^2}\right]_0^{\theta}$ |
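The bracketed antiderivative in the guidance comes from integrating $l \cdot f_L(l)$; a sketch:

```latex
\mathrm{E}(L) = \int_0^{\theta} l \cdot \frac{2l}{\theta^2}\,dl
             = \left[\frac{2l^3}{3\theta^2}\right]_0^{\theta}
             = \frac{2\theta}{3}
```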
## Question 7(v):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $\dfrac{3}{2}\left(\dfrac{2\theta}{3}\right) = \theta$ so $\dfrac{3L}{2}$ | B1 [1] | |
## Question 7(vi):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $\text{Var}(X) = \int_0^{\theta} \dfrac{x^2}{\theta}\,dx - \left(\dfrac{\theta}{2}\right)^2 = \dfrac{\theta^2}{12}$ | M1A1 | |
| $\text{Var}(S_1) = \dfrac{\theta^2}{6}$ | A1 | |
| $\text{Var}(L) = \int_0^{\theta} \dfrac{2l^3}{\theta^2}\,dl - \left(\dfrac{2\theta}{3}\right)^2 = \dfrac{\theta^2}{18}$ | M1A1 | |
| $\text{Var}\!\left(\dfrac{3L}{2}\right) = \dfrac{\theta^2}{8}$ | B1ft | $\dfrac{9}{4}\text{Var}\,L$ |
| $\dfrac{3L}{2}$ is more efficient. | B1ft [7] | Must have answers from both variances. |
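The part (vi) comparison can be sanity-checked by simulation. The sketch below (not part of the mark scheme; $\theta = 1$ is chosen arbitrarily) estimates the mean and variance of both estimators and should show each mean near $\theta$ with $\text{Var}(S_2) \approx \theta^2/8$ below $\text{Var}(S_1) \approx \theta^2/6$:

```python
# Monte Carlo check that S1 = X1 + X2 and S2 = 3L/2 are both unbiased
# for theta, and that S2 has the smaller variance (a sketch, theta = 1).
import random

random.seed(0)
theta = 1.0
n = 200_000

s1_vals = []
s2_vals = []
for _ in range(n):
    x1 = random.uniform(0, theta)
    x2 = random.uniform(0, theta)
    s1_vals.append(x1 + x2)            # S1 = X1 + X2
    s2_vals.append(1.5 * max(x1, x2))  # S2 = 3L/2, L = max(X1, X2)

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

print(mean(s1_vals), mean(s2_vals))  # both close to theta = 1
print(var(s1_vals), var(s2_vals))    # close to 1/6 and 1/8 respectively
```

With these sample sizes the estimated variances separate cleanly, matching the B1ft conclusion that $\tfrac{3L}{2}$ is the more efficient estimator.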
7 Two independent observations $X_1$ and $X_2$ are made of a continuous random variable with probability density function

$$f(x) = \begin{cases} \frac{1}{\theta} & 0 \leqslant x \leqslant \theta \\ 0 & \text{otherwise} \end{cases}$$

where $\theta$ is a parameter whose value is to be estimated.

(i) Find $\mathrm{E}(X)$.

(ii) Show that $S_1 = X_1 + X_2$ is an unbiased estimator of $\theta$.

$L$ is the larger of $X_1$ and $X_2$, or their common value if they are equal.

(iii) Show that the probability density function of $L$ is $\frac{2l}{\theta^2}$ for $0 \leqslant l \leqslant \theta$.

(iv) Find $\mathrm{E}(L)$.

(v) Find an unbiased estimator $S_2$ of $\theta$, based on $L$.

(vi) Determine which of the two estimators $S_1$ and $S_2$ is the more efficient.

*OCR S4 2018 Q7 [15]*