The time (in milliseconds) taken by my computer to perform a particular task is modelled by the random variable \(T\). The probability that it takes more than \(t\) milliseconds to perform this task is given by the expression \(\mathrm{P}(T > t) = \frac{k}{t^{2}}\) for \(t \geqslant 1\), where \(k\) is a constant.
Write down the cumulative distribution function of \(T\) and hence show that \(k = 1\).
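One possible line of working (a sketch, assuming the task always takes at least 1 millisecond, so that \(\mathrm{P}(T > 1) = 1\)):
$$F(t) = \mathrm{P}(T \leqslant t) = 1 - \mathrm{P}(T > t) = 1 - \frac{k}{t^{2}}, \qquad t \geqslant 1,$$
with \(F(t) = 0\) for \(t < 1\). Continuity of \(F\) at \(t = 1\) requires \(F(1) = 1 - k = 0\), and hence \(k = 1\).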
Find the probability density function of \(T\).
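A sketch of the differentiation step, using the cumulative distribution function above with \(k = 1\):
$$f(t) = \frac{\mathrm{d}F}{\mathrm{d}t} = \frac{\mathrm{d}}{\mathrm{d}t}\left(1 - \frac{1}{t^{2}}\right) = \frac{2}{t^{3}}, \qquad t \geqslant 1,$$
and \(f(t) = 0\) otherwise.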
Find the mean time for the task.
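A sketch of the integral, using the density found above:
$$\mathrm{E}(T) = \int_{1}^{\infty} t \cdot \frac{2}{t^{3}}\, \mathrm{d}t = \int_{1}^{\infty} \frac{2}{t^{2}}\, \mathrm{d}t = \left[-\frac{2}{t}\right]_{1}^{\infty} = 2 \text{ milliseconds}.$$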
For a different task, the times (in milliseconds) taken by my computer on 10 randomly chosen occasions were as follows.
$$\begin{array}{cccccccccc}
6.4 & 5.9 & 5.0 & 6.2 & 6.8 & 6.0 & 5.2 & 6.5 & 5.7 & 5.3
\end{array}$$
From past experience it is thought that the median time for this task is 5.4 milliseconds. Carry out a test at the \(5 \%\) level of significance to investigate this, stating your hypotheses carefully.
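The question leaves the choice of test open; a sign test is one standard option (a Wilcoxon signed-rank test on the differences from 5.4 would be another). The following is a minimal sketch of the sign-test arithmetic, assuming a two-sided alternative, with hypotheses \(\mathrm{H}_0\): the population median is 5.4 ms, \(\mathrm{H}_1\): the population median is not 5.4 ms; the variable names `times` and `median0` are illustrative.

```python
from math import comb

# Observed times (ms) for the second task and the hypothesised median.
times = [6.4, 5.9, 5.0, 6.2, 6.8, 6.0, 5.2, 6.5, 5.7, 5.3]
median0 = 5.4

# Sign test: under H0 the number of observations above the median is
# Binomial(n, 0.5); any observation equal to the median would be dropped.
above = sum(t > median0 for t in times)
below = sum(t < median0 for t in times)
n = above + below

# Two-sided p-value: twice the smaller tail probability, capped at 1.
tail = sum(comb(n, k) for k in range(max(above, below), n + 1)) / 2 ** n
p_value = min(1.0, 2 * tail)

print(f"above = {above}, below = {below}, p-value = {p_value:.4f}")
```

With these data the script gives 7 observations above 5.4 and 3 below, and a two-sided p-value of \(2 \times \frac{176}{1024} \approx 0.344 > 0.05\), so under this choice of test \(\mathrm{H}_0\) would not be rejected at the \(5\%\) level.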