MGF of transformed variable

A question is of this type if and only if it asks the student to find or explain the MGF of a transformed random variable (e.g. \(-X\), \(aX+b\), or \(X^2\)).

4 questions · Standard +0.6

5.03a Continuous random variables: pdf and cdf · 5.03c Calculate mean/variance: by integration
OCR MEI S4 2009 June Q2
24 marks Standard +0.3
2
  1. The random variable \(Z\) has the standard Normal distribution with probability density function $$\mathrm { f } ( z ) = \frac { 1 } { \sqrt { 2 \pi } } \mathrm { e } ^ { - z ^ { 2 } / 2 } , \quad - \infty < z < \infty$$ Obtain the moment generating function of \(Z\).
  2. Let \(\mathrm { M } _ { Y } ( t )\) denote the moment generating function of the random variable \(Y\). Show that the moment generating function of the random variable \(a Y + b\), where \(a\) and \(b\) are constants, is \(\mathrm { e } ^ { b t } \mathrm { M } _ { Y } ( a t )\).
  3. Use the results in parts (i) and (ii) to obtain the moment generating function \(\mathrm { M } _ { X } ( t )\) of the random variable \(X\) having the Normal distribution with parameters \(\mu\) and \(\sigma ^ { 2 }\).
  4. If \(W = \mathrm { e } ^ { X }\) where \(X\) is as in part (iii), \(W\) is said to have a lognormal distribution. Show that, for any positive integer \(k\), the expected value of \(W ^ { k }\) is \(\mathrm { M } _ { X } ( k )\). Use this result to find the expected value and variance of the lognormal distribution.
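Parts (iii) and (iv) can be checked numerically. The sketch below is a Monte Carlo check in Python (the values \(\mu = 0.3\), \(\sigma = 0.5\) are illustrative, not taken from the question): it compares sample estimates against \(\mathrm{M}_X(t) = \mathrm{e}^{\mu t + \sigma^2 t^2/2}\) and against the lognormal moments \(\mathrm{E}(W) = \mathrm{M}_X(1)\) and \(\operatorname{Var}(W) = \mathrm{M}_X(2) - \mathrm{M}_X(1)^2\), which follow from \(\mathrm{E}(W^k) = \mathrm{M}_X(k)\).

```python
import math
import random

# Illustrative parameters (not from the question)
mu, sigma = 0.3, 0.5
random.seed(1)
N = 200_000

def M_X(t):
    # MGF of N(mu, sigma^2): exp(mu*t + sigma^2 * t^2 / 2)
    return math.exp(mu * t + sigma**2 * t**2 / 2)

xs = [random.gauss(mu, sigma) for _ in range(N)]

# Check M_X(t) at a sample point t = 0.7 against the Monte Carlo estimate
t = 0.7
mc_mgf = sum(math.exp(t * x) for x in xs) / N
print(mc_mgf, M_X(t))

# Lognormal W = e^X: E(W^k) = M_X(k), so
# E(W) = M_X(1) and Var(W) = M_X(2) - M_X(1)^2
ws = [math.exp(x) for x in xs]
mean_w = sum(ws) / N
var_w = sum((w - mean_w)**2 for w in ws) / N
print(mean_w, M_X(1))
print(var_w, M_X(2) - M_X(1)**2)
```

Writing \(\mathrm{M}_X(k) = \mathrm{e}^{\mu k + \sigma^2 k^2/2}\) explicitly gives the closed forms \(\mathrm{E}(W) = \mathrm{e}^{\mu + \sigma^2/2}\) and \(\operatorname{Var}(W) = \left(\mathrm{e}^{\sigma^2} - 1\right)\mathrm{e}^{2\mu + \sigma^2}\).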
Pre-U Pre-U 9795/2 2016 Specimen Q5
9 marks Standard +0.8
5 The random variable \(X\) has probability density function \(\mathrm { f } ( x )\), where $$\mathrm { f } ( x ) = \begin{cases} k \mathrm { e } ^ { - k x } & x \geqslant 0 \\ 0 & x < 0 \end{cases}$$ and \(k\) is a positive constant.
  1. Show that the moment generating function of \(X\) is \(\mathrm { M } _ { X } ( t ) = k ( k - t ) ^ { - 1 } , t < k\).
  2. Use the moment generating function to find \(\mathrm { E } ( X )\) and \(\operatorname { Var } ( X )\).
  3. Show that the moment generating function of \(- X\) is \(k ( k + t ) ^ { - 1 }\).
  4. \(X _ { 1 }\) and \(X _ { 2 }\) are two independent observations of \(X\). Use the moment generating function of \(X _ { 1 } - X _ { 2 }\) to find the value of \(\mathrm { E } \left[ \left( X _ { 1 } - X _ { 2 } \right) ^ { 2 } \right]\).
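The results in parts (i)–(iv) can be sanity-checked by simulation. The Python sketch below (the rate \(k = 2\) is illustrative, not from the question) uses the facts that the MGF gives \(\mathrm{E}(X) = 1/k\) and \(\operatorname{Var}(X) = 1/k^2\), and that by independence \(\mathrm{M}_{X_1 - X_2}(t) = \mathrm{M}_X(t)\,\mathrm{M}_{-X}(t) = k^2/(k^2 - t^2)\), whose second derivative at \(t = 0\) gives \(\mathrm{E}\left[(X_1 - X_2)^2\right] = 2/k^2\).

```python
import math
import random

# Illustrative rate (not from the question)
k = 2.0
random.seed(1)
N = 200_000

def M_X(t):
    # MGF of the exponential distribution with rate k: k / (k - t), valid for t < k
    return k / (k - t)

xs = [random.expovariate(k) for _ in range(N)]

# Part (i): Monte Carlo estimate of E(e^{tX}) vs k/(k - t) at t = 0.5
t = 0.5
mc_mgf = sum(math.exp(t * x) for x in xs) / N
print(mc_mgf, M_X(t))

# Part (ii): E(X) = 1/k and Var(X) = 1/k^2, from M'(0) and M''(0) - M'(0)^2
mean_x = sum(xs) / N
var_x = sum((x - mean_x)**2 for x in xs) / N
print(mean_x, 1 / k)
print(var_x, 1 / k**2)

# Part (iv): M_{X1-X2}(t) = k^2/(k^2 - t^2), so E[(X1 - X2)^2] = M''(0) = 2/k^2
ys = [random.expovariate(k) for _ in range(N)]
mc_sq = sum((x - y)**2 for x, y in zip(xs, ys)) / N
print(mc_sq, 2 / k**2)
```

The value \(2/k^2\) also follows without MGFs, since \(\mathrm{E}(X_1 - X_2) = 0\) makes \(\mathrm{E}\left[(X_1 - X_2)^2\right] = \operatorname{Var}(X_1) + \operatorname{Var}(X_2) = 2/k^2\).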
Pre-U Pre-U 9795/2 2019 Specimen Q5
3 marks Standard +0.3
5 The random variable \(X\) has probability density function \(\mathrm { f } ( x )\), where $$\mathrm { f } ( x ) = \begin{cases} k \mathrm { e } ^ { - k x } & x \geqslant 0 \\ 0 & x < 0 \end{cases}$$ and \(k\) is a positive constant.
  1. Show that the moment generating function of \(X\) is \(\mathrm { M } _ { X } ( t ) = k ( k - t ) ^ { - 1 } , t < k\).
  2. Use the moment generating function to find \(\mathrm { E } ( X )\) and \(\operatorname { Var } ( X )\).
  3. Show that the moment generating function of \(- X\) is \(k ( k + t ) ^ { - 1 }\).
  4. \(X _ { 1 }\) and \(X _ { 2 }\) are two independent observations of \(X\). Use the moment generating function of \(X _ { 1 } - X _ { 2 }\) to find the value of \(\mathrm { E } \left[ \left( X _ { 1 } - X _ { 2 } \right) ^ { 2 } \right]\).
Pre-U Pre-U 9795/2 2020 Specimen Q5
3 marks Standard +0.8
5 The random variable \(X\) has probability density function \(\mathrm { f } ( x )\), where $$\mathrm { f } ( x ) = \begin{cases} k \mathrm { e } ^ { - k x } & x \geqslant 0 \\ 0 & x < 0 \end{cases}$$ and \(k\) is a positive constant.
  1. Show that the moment generating function of \(X\) is \(\mathrm { M } _ { X } ( t ) = k ( k - t ) ^ { - 1 } , t < k\).
  2. Use the moment generating function to find \(\mathrm { E } ( X )\) and \(\operatorname { Var } ( X )\).
  3. Show that the moment generating function of \(- X\) is \(k ( k + t ) ^ { - 1 }\).
  4. \(X _ { 1 }\) and \(X _ { 2 }\) are two independent observations of \(X\). Use the moment generating function of \(X _ { 1 } - X _ { 2 }\) to find the value of \(\mathrm { E } \left[ \left( X _ { 1 } - X _ { 2 } \right) ^ { 2 } \right]\).