6. The independent random variables \(X_1\) and \(X_2\) are each distributed \(\mathrm{B}(n, p)\), where \(n > 1\). An unbiased estimator for \(p\) is given by
$$\hat{p} = \frac{aX_1 + bX_2}{n}$$
where \(a\) and \(b\) are constants.
[You may assume that if \(X_1\) and \(X_2\) are independent then \(\mathrm{E}(X_1 X_2) = \mathrm{E}(X_1)\,\mathrm{E}(X_2)\).]
- Show that \(a + b = 1\).
- Show that \(\operatorname{Var}(\hat{p}) = \dfrac{(2a^2 - 2a + 1)\,p(1 - p)}{n}\).
- Hence, justifying your answer, determine the value of \(a\) and the value of \(b\) for which \(\hat{p}\) has minimum variance.
- Show that \(\hat{p}^2\) is a biased estimator for \(p^2\).
- Show that the bias \(\to 0\) as \(n \to \infty\).
- By considering \(\mathrm{E}\left[X_1(X_1 - 1)\right]\), find an unbiased estimator for \(p^2\).
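For reference, here is a sketch of the expectation and variance computations behind the first three parts; it relies only on the standard binomial facts \(\mathrm{E}(X_i) = np\) and \(\operatorname{Var}(X_i) = np(1 - p)\), and is an outline rather than a full solution.

$$\mathrm{E}(\hat{p}) = \frac{a\,\mathrm{E}(X_1) + b\,\mathrm{E}(X_2)}{n} = \frac{(a + b)np}{n} = (a + b)p,$$

so \(\hat{p}\) is unbiased exactly when \(a + b = 1\). By independence,

$$\operatorname{Var}(\hat{p}) = \frac{a^2 \operatorname{Var}(X_1) + b^2 \operatorname{Var}(X_2)}{n^2} = \frac{(a^2 + b^2)\,p(1 - p)}{n},$$

and substituting \(b = 1 - a\) gives \(a^2 + (1 - a)^2 = 2a^2 - 2a + 1\). Writing \(2a^2 - 2a + 1 = 2\left(a - \tfrac{1}{2}\right)^2 + \tfrac{1}{2}\) shows the variance is minimised at \(a = b = \tfrac{1}{2}\).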
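Similarly, an outline for the remaining parts: since \(\mathrm{E}(\hat{p}^2) = \operatorname{Var}(\hat{p}) + \left[\mathrm{E}(\hat{p})\right]^2\),

$$\mathrm{E}(\hat{p}^2) = \frac{(2a^2 - 2a + 1)\,p(1 - p)}{n} + p^2,$$

so the bias is \(\frac{(2a^2 - 2a + 1)\,p(1 - p)}{n}\), which is nonzero for \(0 < p < 1\) (as \(2a^2 - 2a + 1 \geq \tfrac{1}{2}\)) but tends to \(0\) as \(n \to \infty\). For the final part,

$$\mathrm{E}\left[X_1(X_1 - 1)\right] = \mathrm{E}(X_1^2) - \mathrm{E}(X_1) = np(1 - p) + n^2 p^2 - np = n(n - 1)p^2,$$

so \(\dfrac{X_1(X_1 - 1)}{n(n - 1)}\) is an unbiased estimator for \(p^2\); this is where the condition \(n > 1\) is needed.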