# OCR S4 2007 June — Question 7 (15 marks)

Exam Board: OCR
Module: S4 (Statistics 4)
Year: 2007
Session: June
Marks: 15
Topic: Moment generating functions
Type: Show unbiased estimator
Difficulty: Standard (+0.3). This is a straightforward S4 question requiring standard techniques: showing unbiasedness by computing E(T₁) = E(2X̄) = 2·θ/2 = θ, finding E(U) and Var(U) from the given pdf by routine integration, and comparing efficiency via the ratio of variances. All steps are mechanical applications of definitions, with no novel insight required, making the question slightly easier than average for this module.
Spec: 5.05b Unbiased estimates of population mean and variance; 5.05c Hypothesis test: normal distribution for population mean

7 The continuous random variable \(X\) has a uniform distribution over the interval \([0, \theta]\), so that the probability density function is given by $$f ( x ) = \begin{cases} \frac { 1 } { \theta } & 0 \leqslant x \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$ where \(\theta\) is a positive constant. A sample of \(n\) independent observations of \(X\) is taken and the sample mean is denoted by \(\bar { X }\).

(i) The estimator \(T_1\) is defined by \(T_1 = 2\bar{X}\). Show that \(T_1\) is an unbiased estimator of \(\theta\).

It is given that the probability density function of the largest value, \(U\), in the sample is $$g ( u ) = \begin{cases} \frac { n u ^ { n - 1 } } { \theta ^ { n } } & 0 \leqslant u \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$

(ii) Find \(\mathrm{E}(U)\) and show that \(\operatorname{Var}(U) = \frac{n \theta^2}{(n+1)^2 (n+2)}\).

(iii) The estimator \(T_2\) is defined by \(T_2 = \frac{n+1}{n} U\). Given that \(T_2\) is also an unbiased estimator of \(\theta\), show that \(T_2\) is a more efficient estimator than \(T_1\) for \(n > 1\).
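A brief worked sketch of the three parts (standard calculations for the uniform distribution and the sample maximum; this is not the official mark scheme):

```latex
\begin{align*}
% (i) T_1 = 2\bar{X} is unbiased:
\mathrm{E}(X) &= \int_0^\theta \frac{x}{\theta}\,dx = \frac{\theta}{2},
  \qquad \mathrm{E}(T_1) = 2\,\mathrm{E}(\bar{X}) = 2\,\mathrm{E}(X) = \theta. \\[4pt]
% (ii) Moments of the sample maximum U, using the given pdf g(u):
\mathrm{E}(U) &= \int_0^\theta u \cdot \frac{n u^{n-1}}{\theta^n}\,du
  = \frac{n\theta}{n+1}, \qquad
\mathrm{E}(U^2) = \int_0^\theta u^2 \cdot \frac{n u^{n-1}}{\theta^n}\,du
  = \frac{n\theta^2}{n+2}, \\
\operatorname{Var}(U) &= \frac{n\theta^2}{n+2} - \frac{n^2\theta^2}{(n+1)^2}
  = \frac{n\theta^2\bigl[(n+1)^2 - n(n+2)\bigr]}{(n+1)^2(n+2)}
  = \frac{n\theta^2}{(n+1)^2(n+2)}. \\[4pt]
% (iii) Variances of the two unbiased estimators:
\operatorname{Var}(T_1) &= \frac{4}{n}\operatorname{Var}(X)
  = \frac{4}{n}\cdot\frac{\theta^2}{12} = \frac{\theta^2}{3n}, \qquad
\operatorname{Var}(T_2) = \Bigl(\frac{n+1}{n}\Bigr)^2 \operatorname{Var}(U)
  = \frac{\theta^2}{n(n+2)}.
\end{align*}
```

Since \(\frac{\theta^2}{n(n+2)} < \frac{\theta^2}{3n}\) exactly when \(n + 2 > 3\), i.e. \(n > 1\), the estimator \(T_2\) has the smaller variance and is therefore the more efficient of the two.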

## Question 7:

### Part (i):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $G(y) = P(Y \leq y)$ | M1 | May be implied by following line |
| $= P(X^2 \geq 1/y)$ [or $P(X > 1/\sqrt{y})$] | A1 | Accept strict inequalities |
| $= 1 - F(1/\sqrt{y})$ | A1 | |
| $= \begin{cases} 0 & y \leq 0 \\ y^2 & 0 \leq y \leq 1 \\ 1 & y > 1 \end{cases}$ | A1 **4** | Or $F(x)=P(X\leq x) = P(Y\geq 1/x^2)$ M1; $=1-P(Y<1/x^2)$ A1; $=1-G(y)$ etc A1 A1 |

### Part (ii):
| Answer | Marks | Guidance |
|--------|-------|----------|
| Differentiate their $G(y)$ to obtain $g(y) = 2y$ for $0 < y \leq 1$ | M1, A1 **2** | Only from $G$ correctly obtained |

### Part (iii):
| Answer | Marks | Guidance |
|--------|-------|----------|
| $\int_0^1 2y(\sqrt[3]{y})\,dy$ | M1 | Unsimplified, but with limits |
| $= [6y^{7/3}/7]$ | B1 | OR: Find $f(x)$, $\int_1^\infty x^{-2/3}f(x)\,dx$ M1 |
| $= \frac{6}{7}$ | A1 **3** | $=[4x^{-14/3}/(14/3)]$; $\frac{6}{7}$ B1 A1; OR: Find $H(z)$, $Z=Y^{1/3}$ |
