OCR S4 (Statistics 4) 2015 June

Question 1
1 For the events \(A\) and \(B\) it is given that $$\mathrm { P } ( A ) = 0.6 , \mathrm { P } ( B ) = 0.3 \text { and } \mathrm { P } ( A \text { or } B \text { but not both } ) = 0.4 \text {. }$$
  1. Find \(\mathrm { P } ( A \cap B )\).
  2. Find \(\mathrm { P } \left( A ^ { \prime } \cap B \right)\).
  3. State, giving a reason, whether \(A\) and \(B\) are independent.
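The three parts can be checked numerically from the identity \(\mathrm{P}(A \text{ or } B \text{ but not both}) = \mathrm{P}(A) + \mathrm{P}(B) - 2\,\mathrm{P}(A \cap B)\). A minimal sketch (a check on the arithmetic, not the required written working):

```python
# Check parts (1)-(3) numerically; not a substitute for the written solution.
pA, pB, p_sym_diff = 0.6, 0.3, 0.4

# P(A or B but not both) = P(A) + P(B) - 2 P(A n B)
p_intersect = (pA + pB - p_sym_diff) / 2          # part (1): 0.25
p_Aprime_and_B = pB - p_intersect                 # part (2): 0.05
independent = abs(pA * pB - p_intersect) < 1e-12  # part (3): compare P(A)P(B) with P(A n B)

print(p_intersect, p_Aprime_and_B, independent)
```

Since \(\mathrm{P}(A)\,\mathrm{P}(B) = 0.18 \neq 0.25\), the events are not independent.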
Question 2
2 The manufacturer of a painkiller, designed to relieve headaches, claims that people taking the painkiller feel relief in at most 30 minutes, on average. A random sample of eight users of the painkiller recorded the times it took for them to feel relief from their headaches. These times, in minutes, were as follows: $$\begin{array} { l l l l l l l l } 33 & 39 & 29 & 35 & 40 & 32 & 26 & 37 \end{array}$$ Use a Wilcoxon single-sample signed-rank test at the \(5 \%\) significance level to test the manufacturer's claim, stating a necessary assumption.
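Because \(n = 8\) is small and the absolute differences here have no ties, the signed-rank distribution can be enumerated exactly. A sketch of that calculation (the necessary assumption, that the population of relief times is symmetric, is part of the written answer, not the code):

```python
from itertools import combinations

# Exact Wilcoxon signed-rank calculation; assumes no tied |differences|
# (true for this data set).
times = [33, 39, 29, 35, 40, 32, 26, 37]
diffs = [t - 30 for t in times]          # H0: median relief time is 30 minutes

# Rank the absolute differences (1 = smallest)
order = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
ranks = [0] * len(diffs)
for r, i in enumerate(order, start=1):
    ranks[i] = r

w_minus = sum(r for d, r in zip(diffs, ranks) if d < 0)  # rank sum of negative diffs

# Exact one-tailed p-value: P(W- <= observed) under random +/- signs
n = len(diffs)
count = sum(1 for k in range(n + 1) for c in combinations(range(1, n + 1), k)
            if sum(c) <= w_minus)
p_value = count / 2 ** n
print(w_minus, p_value)   # reject the manufacturer's claim at 5% if p_value < 0.05
```

This gives \(W_- = 5\) and an exact one-tailed p-value of \(10/256 \approx 0.039 < 0.05\), so the claim is rejected.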
Question 3
3 A manufacturer of electronic components uses the following process to test the proportion of defective items produced. A random sample of 20 is taken from a large batch of components.
  • If no defective item is found, the batch is accepted.
  • If two or more defective items are found, the batch is rejected.
  • If one defective item is found, a second random sample of 20 is taken. If two or more defective items are found in this second sample, the batch is rejected, otherwise the batch is accepted.
The proportion of defective items in the batch is denoted by \(p\), and \(q = 1 - p\).
  1. Show that the probability that a batch is accepted is \(q ^ { 20 } + 20 p q ^ { 38 } ( q + 20 p )\).
For a particular component, \(p = 0.01\).
  2. Given that a batch is accepted, find the probability that it is accepted as a result of the first sample.
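The closed form in part 1 can be checked against the step-by-step acceptance probability, and part 2 follows from \(\mathrm{P}(\text{first sample} \mid \text{accepted}) = q^{20} / \mathrm{P}(\text{accepted})\). A numeric sketch for \(p = 0.01\) (a check, not the required algebraic "show that"):

```python
# Numeric check of the acceptance formula for p = 0.01.
p = 0.01
q = 1 - p

# Step-by-step: P(0 in first) + P(1 in first) * P(0 or 1 in second)
p0 = q ** 20             # no defectives in a sample of 20
p1 = 20 * p * q ** 19    # exactly one defective
p_accept = p0 + p1 * (p0 + p1)

# Closed form from part 1
p_formula = q ** 20 + 20 * p * q ** 38 * (q + 20 * p)
assert abs(p_accept - p_formula) < 1e-12

# Part 2: P(accepted as a result of the first sample | accepted)
p_first_given_accept = p0 / p_accept
print(round(p_accept, 4), round(p_first_given_accept, 4))
```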
Question 4
4 The discrete random variable \(Y\) has probability generating function $$\mathrm { G } _ { Y } ( t ) = 0.09 t ^ { 2 } + 0.24 t ^ { 3 } + 0.34 t ^ { 4 } + 0.24 t ^ { 5 } + 0.09 t ^ { 6 }$$
  1. Find the mean and variance of \(Y\).
\(Y\) is the sum of two independent observations of a random variable \(X\).
  2. Find the probability generating function of \(X\), expressing your answer as a cubic polynomial in \(t\).
  3. Write down the value of \(\mathrm { P } ( X = 2 )\).
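All three parts can be verified from the PGF coefficients, using \(\mathrm{E}(Y) = \mathrm{G}_Y'(1)\) and \(\operatorname{Var}(Y) = \mathrm{G}_Y''(1) + \mathrm{G}_Y'(1) - \mathrm{G}_Y'(1)^2\), and checking that the candidate cubic squares to \(\mathrm{G}_Y(t)\). A sketch:

```python
# Verify parts 1-3 numerically from the PGF coefficients.
gY = {2: 0.09, 3: 0.24, 4: 0.34, 5: 0.24, 6: 0.09}   # P(Y = k) = coeff of t^k

mean = sum(k * p for k, p in gY.items())                        # G'(1) = E[Y]
second_factorial = sum(k * (k - 1) * p for k, p in gY.items())  # G''(1)
variance = second_factorial + mean - mean ** 2

# G_X(t) must satisfy G_X(t)^2 = G_Y(t); the cubic 0.3t + 0.4t^2 + 0.3t^3 works
gX = {1: 0.3, 2: 0.4, 3: 0.3}
square = {}
for i, a in gX.items():
    for j, b in gX.items():
        square[i + j] = square.get(i + j, 0) + a * b
assert all(abs(square[k] - gY[k]) < 1e-12 for k in gY)

print(mean, variance, gX[2])   # P(X = 2) is the coefficient of t^2
```

This confirms \(\mathrm{E}(Y) = 4\), \(\operatorname{Var}(Y) = 1.2\) and \(\mathrm{P}(X = 2) = 0.4\).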
Question 5
5 The random variable \(X\) has a Poisson distribution with mean \(\lambda\). It is given that the moment generating function of \(X\) is \(e ^ { \lambda \left( e ^ { t } - 1 \right) }\).
  1. Use the moment generating function to verify that the mean of \(X\) is \(\lambda\), and to show that the variance of \(X\) is also \(\lambda\).
  2. Five independent observations of \(X\) are added to produce a new variable \(Y\). Find the moment generating function of \(Y\), simplifying your answer.
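The derivatives \(\mathrm{M}'(0) = \lambda\) and \(\mathrm{M}''(0) = \lambda + \lambda^2\) can be checked numerically by finite differences, and part 2 follows because the MGF of a sum of independent variables is the product of their MGFs, so \(\mathrm{M}_Y(t) = \mathrm{e}^{5\lambda(\mathrm{e}^t - 1)}\). A sketch, with an arbitrary \(\lambda = 2.5\) chosen for illustration:

```python
import math

# Numerical check of part 1: differentiate the Poisson MGF at t = 0.
# lam = 2.5 is an arbitrary illustrative value, not from the question.
lam = 2.5
M = lambda t: math.exp(lam * (math.exp(t) - 1))

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)             # M'(0)  = E[X]
m2 = (M(h) - 2 * M(0) + M(-h)) / h ** 2   # M''(0) = E[X^2]
variance = m2 - m1 ** 2

print(m1, variance)   # both should be close to lam

# Part 2: Y = X1 + ... + X5 independent, so M_Y(t) = M(t)^5 = exp(5*lam*(e^t - 1))
t = 0.3
assert abs(M(t) ** 5 - math.exp(5 * lam * (math.exp(t) - 1))) < 1e-9
```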
Question 6
6 In a two-tailed Wilcoxon rank-sum test, the sample sizes are 13 and 15. The sum of the ranks for the sample of size 13 is 135. Carry out the test at the \(5 \%\) level of significance.
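With samples of sizes 13 and 15, exact tables do not usually apply, so the standard approach is the normal approximation \(W \sim \mathrm{N}\!\left(\tfrac{m(m+n+1)}{2}, \tfrac{mn(m+n+1)}{12}\right)\). A sketch of that calculation, without a continuity correction:

```python
import math

# Normal approximation to the rank-sum statistic (no continuity correction).
m, n = 13, 15          # sample sizes; the given rank sum is for the sample of size m
W = 135

mean_W = m * (m + n + 1) / 2          # 188.5
var_W = m * n * (m + n + 1) / 12      # 471.25
z = (W - mean_W) / math.sqrt(var_W)

print(round(z, 3))     # compare |z| with 1.96 for a 5% two-tailed test
```

Since \(|z| \approx 2.46 > 1.96\), the result is significant at the 5% level.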
Question 7
7 The discrete random variable \(X\) can take the values 0, 1 and 2 with equal probabilities.
The random variables \(X _ { 1 }\) and \(X _ { 2 }\) are independent observations of \(X\), and the random variables \(Y\) and \(Z\) are defined as follows:
\(Y\) is the smaller of \(X _ { 1 }\) and \(X _ { 2 }\), or their common value if they are equal; \(Z = \left| X _ { 1 } - X _ { 2 } \right|\).
  1. Draw up a table giving the joint distribution of \(Y\) and \(Z\).
  2. Find \(\mathrm { P } ( Y = 0 \mid Z = 0 )\).
  3. Find \(\operatorname { Cov } ( Y , Z )\).
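Since \((X_1, X_2)\) has only nine equally likely outcomes, the joint table and both later parts can be checked by exact enumeration. A sketch (a check on the table, not the written solution):

```python
from fractions import Fraction
from itertools import product

# Exact enumeration of the joint distribution of (Y, Z).
third = Fraction(1, 3)
joint = {}
for x1, x2 in product(range(3), repeat=2):
    y, z = min(x1, x2), abs(x1 - x2)
    joint[(y, z)] = joint.get((y, z), Fraction(0)) + third * third

# Part 2: P(Y = 0 | Z = 0)
p_z0 = sum(p for (y, z), p in joint.items() if z == 0)
p_y0_given_z0 = joint[(0, 0)] / p_z0

# Part 3: Cov(Y, Z) = E[YZ] - E[Y]E[Z]
EY = sum(y * p for (y, z), p in joint.items())
EZ = sum(z * p for (y, z), p in joint.items())
EYZ = sum(y * z * p for (y, z), p in joint.items())
cov = EYZ - EY * EZ

print(p_y0_given_z0, cov)
```

This gives \(\mathrm{P}(Y = 0 \mid Z = 0) = \tfrac{1}{3}\) and \(\operatorname{Cov}(Y, Z) = -\tfrac{22}{81}\).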
Question 8
8 The independent random variables \(X _ { 1 }\) and \(X _ { 2 }\) have the distributions \(\mathrm { B } \left( n _ { 1 } , \theta \right)\) and \(\mathrm { B } \left( n _ { 2 } , \theta \right)\) respectively. Two possible estimators for \(\theta\) are $$T _ { 1 } = \frac { 1 } { 2 } \left( \frac { X _ { 1 } } { n _ { 1 } } + \frac { X _ { 2 } } { n _ { 2 } } \right) \text { and } T _ { 2 } = \frac { X _ { 1 } + X _ { 2 } } { n _ { 1 } + n _ { 2 } } .$$
  1. Show that \(T _ { 1 }\) and \(T _ { 2 }\) are both unbiased estimators, and calculate their variances.
  2. Find \(\frac { \operatorname { Var } \left( T _ { 1 } \right) } { \operatorname { Var } \left( T _ { 2 } \right) }\). Given that \(n _ { 1 } \neq n _ { 2 }\), use the inequality \(\left( n _ { 1 } - n _ { 2 } \right) ^ { 2 } > 0\) to find which of \(T _ { 1 }\) and \(T _ { 2 }\) is the more efficient estimator.
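Using \(\operatorname{Var}(X_i/n_i) = \theta(1-\theta)/n_i\), the ratio in part 2 reduces to \(\frac{\operatorname{Var}(T_1)}{\operatorname{Var}(T_2)} = \frac{(n_1+n_2)^2}{4 n_1 n_2}\), with \(\theta(1-\theta)\) cancelling. A sketch checking this for illustrative sample sizes (the values \(n_1 = 10\), \(n_2 = 30\) are assumptions, not from the question):

```python
from fractions import Fraction

# Var(T1)/Var(T2) = (n1 + n2)^2 / (4 n1 n2); theta*(1 - theta) cancels.
def var_ratio(n1, n2):
    return Fraction((n1 + n2) ** 2, 4 * n1 * n2)

print(var_ratio(10, 30))   # illustrative sizes: ratio > 1, so T2 is more efficient
print(var_ratio(5, 5))     # ratio = 1 exactly when n1 = n2
```

Since \((n_1 - n_2)^2 > 0\) gives \((n_1 + n_2)^2 > 4 n_1 n_2\), the ratio exceeds 1 whenever \(n_1 \neq n_2\), so \(T_2\) is the more efficient estimator.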