Chi-squared distribution theory and properties

A question belongs to this type if and only if it involves theoretical properties of the chi-squared distribution, such as moment generating functions, derivation of expected values and variances, or verification of probability density functions.

3 questions · Standard +0.8

OCR S4 2010 June Q4
10 marks Standard +0.8
The moment generating function of a continuous random variable \(Y\), which has a \(\chi ^ { 2 }\) distribution with \(n\) degrees of freedom, is \(( 1 - 2 t ) ^ { - \frac { 1 } { 2 } n }\), where \(0 \leqslant t < \frac { 1 } { 2 }\).
  1. Find \(\mathrm { E } ( Y )\) and \(\operatorname { Var } ( Y )\).

  For the case \(n = 1\), the sum of 60 independent observations of \(Y\) is denoted by \(S\).
  2. Write down the moment generating function of \(S\) and hence identify the distribution of \(S\).
  3. Use a normal approximation to estimate \(\mathrm { P } ( S \geqslant 70 )\).
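The moment and approximation steps in this question can be sanity-checked mechanically. The sketch below (variable names such as `M` and `p_upper` are our own, not from the paper) differentiates the stated MGF with sympy, and then applies the normal approximation \(S \approx \mathrm{N}(60, 120)\) for part 3.

```python
# Sketch: verify E(Y) = n and Var(Y) = 2n from the MGF, then approximate
# P(S >= 70) with S ~ chi-squared(60), i.e. N(60, 120) approximately.
import math
import sympy as sp

t, n = sp.symbols("t n", positive=True)
M = (1 - 2 * t) ** (-n / 2)          # MGF of chi-squared with n degrees of freedom

EY = sp.diff(M, t).subs(t, 0)        # first moment: M'(0)
EY2 = sp.diff(M, t, 2).subs(t, 0)    # second moment: M''(0)
VarY = sp.simplify(EY2 - EY**2)
print(EY, VarY)                      # n 2*n

# Part 3: for n = 1, S is a sum of 60 independent chi-squared(1) variables,
# so S ~ chi-squared(60) with mean 60 and variance 120.  S is continuous,
# so no continuity correction is needed.
z = (70 - 60) / math.sqrt(120)
p_upper = 0.5 * math.erfc(z / math.sqrt(2))
print(round(p_upper, 2))             # about 0.18
```

The upper-tail probability is computed with the stdlib `math.erfc` rather than a normal-table lookup, but it reproduces the same \(1 - \Phi(z)\) value.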
OCR S4 2018 June Q4
10 marks Standard +0.8
The random variable \(X\) has a \(\chi ^ { 2 }\) distribution with \(v\) degrees of freedom. The moment generating function of \(X\) is $$\mathrm { M } _ { X } ( t ) = ( 1 - 2 t ) ^ { - \frac { 1 } { 2 } v }$$
  1. Show that \(\mathrm { E } ( X ) = v\).
  2. Find \(\operatorname { Var } ( X )\).
  3. Obtain the moment generating function of the sum \(Y\) of two independent \(\chi ^ { 2 }\) random variables, one with 6 degrees of freedom and the other with 8 degrees of freedom.
  4. Identify the distribution of \(Y\).
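Since MGFs of independent variables multiply, parts 3 and 4 give \(\mathrm{M}_Y(t) = (1-2t)^{-3}(1-2t)^{-4} = (1-2t)^{-7}\), i.e. \(Y \sim \chi^2_{14}\). A quick Monte Carlo check of that conclusion (the sample size and seed are arbitrary choices of ours):

```python
# Sketch: empirical check that chi2(6) + chi2(8) matches chi2(14),
# whose mean is 14 and variance 2 * 14 = 28.
import numpy as np

rng = np.random.default_rng(0)   # fixed seed, arbitrary
N = 200_000
y = rng.chisquare(6, N) + rng.chisquare(8, N)

print(round(y.mean(), 2), round(y.var(), 2))   # close to 14 and 28
```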
OCR MEI S4 2011 June Q2
24 marks Standard +0.8
The random variable \(X\) has the \(\chi _ { n } ^ { 2 }\) distribution. This distribution has moment generating function \(\mathrm { M } ( \theta ) = ( 1 - 2 \theta ) ^ { - \frac { 1 } { 2 } n }\), where \(\theta < \frac { 1 } { 2 }\).
  1. Verify the expression for \(\mathrm { M } ( \theta )\) quoted above for the cases \(n = 2\) and \(n = 4\), given that the probability density functions of \(X\) in these cases are as follows. $$\begin{array} { l l } n = 2 : & \mathrm { f } ( x ) = \frac { 1 } { 2 } \mathrm { e } ^ { - \frac { 1 } { 2 } x } \quad ( x > 0 ) \\ n = 4 : & \mathrm { f } ( x ) = \frac { 1 } { 4 } x \mathrm { e } ^ { - \frac { 1 } { 2 } x } \quad ( x > 0 ) \end{array}$$
  2. For the general case, use \(\mathrm { M } ( \theta )\) to find the mean and variance of \(X\) in terms of \(n\).
  3. \(Y _ { 1 } , Y _ { 2 } , \ldots , Y _ { k }\) are independent random variables, each with the \(\chi _ { 1 } ^ { 2 }\) distribution. Show that \(W = \sum _ { i = 1 } ^ { k } Y _ { i }\) has the \(\chi _ { k } ^ { 2 }\) distribution.
  4. Use the Central Limit Theorem to find an approximation for \(\mathrm { P } ( W < 118.5 )\) for the case \(k = 100\).
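Parts 1 and 4 can be checked mechanically. The sketch below integrates the two given densities directly with sympy (restricting to \(\theta < 0\), which lies within the stated domain \(\theta < \frac{1}{2}\), purely to keep the improper integrals convergent for sympy) and then applies the CLT approximation \(W \approx \mathrm{N}(100, 200)\).

```python
# Sketch: verify M(theta) = (1 - 2*theta)^(-n/2) for n = 2 and n = 4 by
# integrating the given densities, then the CLT estimate of P(W < 118.5).
import math
import sympy as sp

x = sp.symbols("x", positive=True)
th = sp.symbols("theta", negative=True)   # theta < 0 keeps the integrals convergent

M2 = sp.integrate(sp.exp(th * x) * sp.exp(-x / 2) / 2, (x, 0, sp.oo))      # n = 2 density
M4 = sp.integrate(sp.exp(th * x) * x * sp.exp(-x / 2) / 4, (x, 0, sp.oo))  # n = 4 density
print(sp.simplify(M2 - (1 - 2 * th) ** -1))   # 0
print(sp.simplify(M4 - (1 - 2 * th) ** -2))   # 0

# Part 4: W ~ chi-squared(100) has mean 100 and variance 2 * 100 = 200,
# so by the CLT, P(W < 118.5) is approximately Phi((118.5 - 100) / sqrt(200)).
z = (118.5 - 100) / math.sqrt(200)
p = 0.5 * (1 + math.erf(z / math.sqrt(2)))
print(round(p, 3))                            # about 0.905
```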