7 Two independent observations \(X _ { 1 }\) and \(X _ { 2 }\) are made of a continuous random variable with probability density function
$$f ( x ) = \begin{cases} \frac { 1 } { \theta } & 0 \leqslant x \leqslant \theta \\ 0 & \text { otherwise } \end{cases}$$
where \(\theta\) is a parameter whose value is to be estimated.
- Find \(\mathrm { E } ( X )\).
- Show that \(S _ { 1 } = X _ { 1 } + X _ { 2 }\) is an unbiased estimator of \(\theta\).
\(L\) is the larger of \(X _ { 1 }\) and \(X _ { 2 }\), or their common value if they are equal.
- Show that the probability density function of \(L\) is \(\frac { 2 l } { \theta ^ { 2 } }\) for \(0 \leqslant l \leqslant \theta\).
- Find \(\mathrm { E } ( L )\).
- Find an unbiased estimator \(S _ { 2 }\) of \(\theta\), based on \(L\).
- Determine which of the two estimators \(S _ { 1 }\) and \(S _ { 2 }\) is the more efficient.
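The comparison in the final part can be checked numerically. The sketch below simulates many pairs \((X_1, X_2)\) from a uniform distribution on \([0, \theta]\) with an assumed value \(\theta = 1\), and compares the two estimators; \(S_2 = \frac{3}{2}L\) is used, since \(\mathrm{E}(L) = \frac{2\theta}{3}\) makes that multiple of \(L\) unbiased. This is a verification aid, not part of the question.

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 1.0      # assumed true parameter value for this check
n = 200_000      # number of simulated sample pairs

# Two independent observations from U(0, theta)
x1 = rng.uniform(0, theta, n)
x2 = rng.uniform(0, theta, n)

s1 = x1 + x2                   # S1 = X1 + X2
s2 = 1.5 * np.maximum(x1, x2)  # S2 = (3/2) L, where L = max(X1, X2)

# Both sample means should be close to theta (unbiasedness),
# and the sample variances close to theta^2/6 and theta^2/8 respectively,
# so S2 is the more efficient estimator.
print(s1.mean(), s2.mean())
print(s1.var(), s2.var())
```

With a large number of simulated pairs, the printed variances settle near \(\theta^2/6 \approx 0.167\) for \(S_1\) and \(\theta^2/8 = 0.125\) for \(S_2\), agreeing with the exact calculation.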