Moment generating functions

36 questions · 15 question types identified

Derive MGF from PDF

A question is this type if and only if it asks to derive or show the moment generating function by integrating e^(tx) times the PDF.

9 questions · Standard (+0.9) · 25.0% of questions

Example:
3 The continuous random variable \(X\) has probability density function given by $$\mathrm { f } ( x ) = \begin{cases} \mathrm { e } ^ { 2 x } & x < 0 \\ \mathrm { e } ^ { - 2 x } & x \geqslant 0 \end{cases}$$
  1. Show that the moment generating function of \(X\) is \(\frac { 4 } { 4 - t ^ { 2 } }\), where \(| t | < 2\), and explain why the condition \(| t | < 2\) is necessary.
  2. Find \(\operatorname { Var } ( X )\).
Easiest question · Standard (+0.3):
2 The continuous random variable \(X\) has probability density function given by $$f ( x ) = \begin{cases} 4 x e ^ { - 2 x } & x \geqslant 0 \\ 0 & \text { otherwise } \end{cases}$$
  1. Show that the moment generating function ( mgf ) of \(X\) is $$\frac { 4 } { ( 2 - t ) ^ { 2 } } , \text { where } | t | < 2$$
  2. Explain why the \(\operatorname { mgf }\) of \(- X\) is \(\frac { 4 } { ( 2 + t ) ^ { 2 } }\).
  3. Two random observations of \(X\) are denoted by \(X _ { 1 }\) and \(X _ { 2 }\). What is the \(\operatorname { mgf }\) of \(X _ { 1 } - X _ { 2 }\) ?
Hardest question · Challenging (+1.8):
2 The random variable \(Z\) has the standard Normal distribution. The random variable \(Y\) is defined by \(Y = Z ^ { 2 }\).
You are given that \(Y\) has the following probability density function. $$\mathrm { f } ( y ) = \frac { 1 } { \sqrt { 2 \pi y } } \mathrm { e } ^ { - \frac { 1 } { 2 } y } , \quad y > 0$$
  1. Show that the moment generating function (mgf) of \(Y\) is given by $$\mathrm { M } _ { Y } ( \theta ) = ( 1 - 2 \theta ) ^ { - \frac { 1 } { 2 } }$$
  2. Use the mgf to obtain \(\mathrm { E } ( Y )\) and \(\operatorname { Var } ( Y )\). The random variable \(U\) is defined by $$U = Z _ { 1 } ^ { 2 } + Z _ { 2 } ^ { 2 } + \ldots + Z _ { n } ^ { 2 } ,$$ where \(Z _ { 1 } , Z _ { 2 } , \ldots , Z _ { n }\) are independent standard Normal random variables.
  3. State an appropriate general theorem for mgfs and hence write down the mgf of \(U\). State the values of \(\mathrm { E } ( U )\) and \(\operatorname { Var } ( U )\). The random variable \(W\) is defined by $$W = \frac { U - n } { \sqrt { 2 n } }$$
  4. Show that the logarithm of the \(\operatorname { mgf }\) of \(W\) is $$- \sqrt { \frac { n } { 2 } } \theta - \frac { n } { 2 } \ln \left( 1 - \sqrt { \frac { 2 } { n } } \theta \right) .$$ Use the series expansion of \(\ln ( 1 - t )\) to show that, as \(n \rightarrow \infty\), this expression tends to \(\frac { 1 } { 2 } \theta ^ { 2 }\).
    State what this implies about the distribution of \(W\) for large \(n\).
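The first example's derivation (splitting the integral of e^(tx) f(x) at 0 to match the two branches of the pdf) can be checked symbolically. This is an illustrative sketch using SymPy, not part of the question bank; `conds="none"` assumes convergence, which is exactly the condition |t| < 2 the question asks about.

```python
import sympy as sp

x, t = sp.symbols("x t", real=True)

# Split the MGF integral at 0 to match the two branches of the pdf.
# conds="none" returns the value assuming convergence, i.e. |t| < 2.
left = sp.integrate(sp.exp(t * x) * sp.exp(2 * x), (x, -sp.oo, 0), conds="none")
right = sp.integrate(sp.exp(t * x) * sp.exp(-2 * x), (x, 0, sp.oo), conds="none")
M = sp.simplify(left + right)                  # 4/(4 - t^2)

# Moments come from differentiating the mgf at t = 0
mean = sp.diff(M, t).subs(t, 0)                # 0 (the density is symmetric)
var = sp.diff(M, t, 2).subs(t, 0) - mean**2    # 1/2
```

Each one-sided integral converges only when the exponent has the right sign, which is why both t + 2 > 0 and 2 - t > 0, i.e. |t| < 2, are needed.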
Show unbiased estimator

A question is this type if and only if it asks to prove that a given estimator is unbiased by showing E(estimator) equals the parameter.

6 questions · Standard (+0.7) · 16.7% of questions

Example:
  1. The random variable \(X\) is such that \(\text{E}(X) = a\theta + b\), where \(a\) and \(b\) are constants and \(\theta\) is a parameter. Show that \(\frac{X - b}{a}\) is an unbiased estimator of \(\theta\). [2]
  2. The continuous random variable \(X\) has probability density function given by $$f(x) = \begin{cases} \frac{1}{8}(\theta + 4 - x) & \theta \leq x \leq \theta + 4, \\ 0 & \text{otherwise}. \end{cases}$$ Find \(\text{E}(X)\) and hence find an unbiased estimator of \(\theta\). [7]
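Part 2 of the example above can be sketched symbolically: compute E(X) by direct integration, then read off the unbiased estimator from the bias. A SymPy check (illustrative, not from the mark scheme):

```python
import sympy as sp

x, theta = sp.symbols("x theta", real=True)

# E(X) for f(x) = (1/8)(theta + 4 - x) on [theta, theta + 4]
EX = sp.integrate(x * sp.Rational(1, 8) * (theta + 4 - x), (x, theta, theta + 4))
bias = sp.simplify(EX - theta)   # 4/3, so X - 4/3 is an unbiased estimator of theta
```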
MGF of transformed variable

A question is this type if and only if it asks to find or explain the MGF of a transformed random variable (e.g., -X, aX+b, or X²).

4 questions · Standard (+0.6) · 11.1% of questions

Example:
5 The random variable \(X\) has probability density function \(\mathrm { f } ( x )\), where $$\mathrm { f } ( x ) = \begin{cases} k \mathrm { e } ^ { - k x } & x \geqslant 0 \\ 0 & x < 0 \end{cases}$$ and \(k\) is a positive constant.
  1. Show that the moment generating function of \(X\) is \(\mathrm { M } _ { X } ( t ) = k ( k - t ) ^ { - 1 } , t < k\).
  2. Use the moment generating function to find \(\mathrm { E } ( X )\) and \(\operatorname { Var } ( X )\).
  3. Show that the moment generating function of \(- X\) is \(k ( k + t ) ^ { - 1 }\).
  4. \(X _ { 1 }\) and \(X _ { 2 }\) are two independent observations of \(X\). Use the moment generating function of \(X _ { 1 } - X _ { 2 }\) to find the value of \(\mathrm { E } \left[ \left( X _ { 1 } - X _ { 2 } \right) ^ { 2 } \right]\).
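Parts 3 and 4 of the example rest on two standard transformation facts: M₋X(t) = MX(−t), and for independent variables the mgf of a sum (here X₁ + (−X₂)) is the product of the mgfs. A SymPy sketch of that route (illustrative only):

```python
import sympy as sp

t = sp.symbols("t", real=True)
k = sp.symbols("k", positive=True)

M_X = k / (k - t)                    # mgf of X from part 1
M_negX = M_X.subs(t, -t)             # M_{-X}(t) = M_X(-t) = k/(k + t)
M_diff = sp.simplify(M_X * M_negX)   # independence: mgf of X1 - X2 is the product
# second derivative at 0 gives E[(X1 - X2)^2]
second_moment = sp.simplify(sp.diff(M_diff, t, 2).subs(t, 0))   # 2/k^2
```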
Compare estimator properties

A question is this type if and only if it asks to compare two or more estimators based on bias, variance, or efficiency criteria.

4 questions · Standard (+0.6) · 11.1% of questions

Example:
Faults occur in a roll of material at a rate of \(\lambda\) per m\(^2\). To estimate \(\lambda\), three pieces of material of sizes 3 m\(^2\), 7 m\(^2\) and 10 m\(^2\) are selected and the number of faults \(X_1\), \(X_2\) and \(X_3\) respectively are recorded. The estimator \(\hat{\lambda}\), where $$\hat{\lambda} = k(X_1 + X_2 + X_3)$$ is an unbiased estimator of \(\lambda\).
  1. Write down the distributions of \(X_1\), \(X_2\) and \(X_3\) and find the value of \(k\). [4]
  2. Find Var(\(\hat{\lambda}\)). [3]
A random sample of \(n\) pieces of this material, each of size 4 m\(^2\), was taken. The number of faults on each piece, \(Y\), was recorded.
  1. Show that \(\frac{1}{4}\bar{Y}\) is an unbiased estimator of \(\lambda\). [2]
  2. Find Var(\(\frac{1}{4}\bar{Y}\)). [3]
  3. Find the minimum value of \(n\) for which \(\frac{1}{4}\bar{Y}\) becomes a better estimator of \(\lambda\) than \(\hat{\lambda}\). [2]
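The comparison in the example comes down to two variances: since the fault counts are Poisson, E(X₁+X₂+X₃) = 20λ forces k = 1/20, and the two estimator variances are λ/20 and λ/(4n). A SymPy sketch of the comparison (illustrative, assuming the Poisson model stated in the question):

```python
import sympy as sp

lam = sp.symbols("lambda", positive=True)
n = sp.symbols("n", positive=True, integer=True)

# X1, X2, X3 ~ Poisson(3 lam), Poisson(7 lam), Poisson(10 lam)
# E(X1 + X2 + X3) = 20 lam, so k = 1/20 makes lam_hat unbiased
k = sp.Rational(1, 20)
var_hat = sp.simplify(k**2 * 20 * lam)         # Var(sum) = 20 lam  ->  lam/20
# each Y ~ Poisson(4 lam), so Var(Ybar/4) = (1/16)(4 lam / n)
var_bar = sp.Rational(1, 16) * 4 * lam / n     # lam/(4 n)
# lam/(4n) < lam/20  iff  n > 5, so the minimum n is 6
```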
Use MGF to find moments

A question is this type if and only if it asks to use the MGF (by differentiation) to find mean, variance, or higher moments of a distribution.

2 questions · Standard (+0.3) · 5.6% of questions

Example:
3 The moment generating function of a random variable \(X\) is \(( 1 - 2 t ) ^ { - 3 }\).
  1. Find the mean and variance of \(X\).
  2. \(X _ { 1 }\) and \(X _ { 2 }\) are two independent observations of \(X\). Find \(\mathrm { E } \left[ \left( X _ { 1 } + X _ { 2 } \right) ^ { 3 } \right]\).
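The differentiation technique this type names is mechanical: E(Xʳ) is the rth derivative of the mgf at t = 0, and for part 2 the mgf of X₁ + X₂ is the square of the mgf. A SymPy sketch for the example above (illustrative only):

```python
import sympy as sp

t = sp.symbols("t")

M = (1 - 2 * t) ** -3
mean = sp.diff(M, t).subs(t, 0)               # E(X) = 6
var = sp.diff(M, t, 2).subs(t, 0) - mean**2   # E(X^2) - E(X)^2 = 48 - 36 = 12
# X1, X2 independent: mgf of X1 + X2 is M^2; third derivative at 0 gives the third moment
third = sp.diff(M**2, t, 3).subs(t, 0)        # E[(X1 + X2)^3] = 2688
```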
Maximum likelihood estimation

A question is this type if and only if it asks to find the maximum likelihood estimator by maximizing the likelihood or log-likelihood function.

2 questions · Challenging (+1.0) · 5.6% of questions

Example:
1 The random variable \(X\) has the Normal distribution with mean 0 and variance \(\theta\), so that its probability density function is $$\mathrm { f } ( x ) = \frac { 1 } { \sqrt { 2 \pi \theta } } \mathrm { e } ^ { - x ^ { 2 } / 2 \theta } , \quad - \infty < x < \infty$$ where \(\theta ( \theta > 0 )\) is unknown. A random sample of \(n\) observations from \(X\) is denoted by \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\).
  1. Find \(\hat { \theta }\), the maximum likelihood estimator of \(\theta\).
  2. Show that \(\hat { \theta }\) is an unbiased estimator of \(\theta\).
  3. In large samples, the variance of \(\hat { \theta }\) may be estimated by \(\frac { 2 \hat { \theta } ^ { 2 } } { n }\). Use this and the results of parts (i) and (ii) to find an approximate \(95 \%\) confidence interval for \(\theta\) in the case when \(n = 100\) and \(\Sigma X _ { i } ^ { 2 } = 1000\).
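For the example above, the log-likelihood of the sample is, up to an additive constant, −(n/2)ln θ − ΣXᵢ²/(2θ); setting its derivative to zero gives θ̂ = ΣXᵢ²/n. A SymPy sketch, writing S for ΣXᵢ² (my notation, not the question's):

```python
import sympy as sp

theta = sp.symbols("theta", positive=True)
n, S = sp.symbols("n S", positive=True)   # S stands for sum of X_i^2

# log-likelihood up to an additive constant
loglik = -(n / 2) * sp.log(theta) - S / (2 * theta)
theta_hat = sp.solve(sp.diff(loglik, theta), theta)[0]   # S/n

# part (iii): n = 100, sum of squares = 1000 gives the point estimate 10
est = theta_hat.subs({S: 1000, n: 100})
```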
Construct combined estimator

A question is this type if and only if it asks to find coefficients for a weighted combination of estimators that is unbiased or has minimum variance.

2 questions · Standard (+0.8) · 5.6% of questions

Example:
A random sample of three independent variables \(X_1\), \(X_2\) and \(X_3\) is taken from a distribution with mean \(\mu\) and variance \(\sigma^2\).
  1. Show that \(\frac{2}{5}X_1 - \frac{1}{5}X_2 + \frac{4}{5}X_3\) is an unbiased estimator for \(\mu\). [3]
An unbiased estimator for \(\mu\) is given by \(\hat{\mu} = aX_1 + bX_2\) where \(a\) and \(b\) are constants.
  1. Show that Var(\(\hat{\mu}\)) = \((2a^2 - 2a + 1)\sigma^2\). [6]
  2. Hence determine the value of \(a\) and the value of \(b\) for which \(\hat{\mu}\) has minimum variance. [5]
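In the second half of the example, unbiasedness forces a + b = 1, so Var(μ̂) = (a² + (1−a)²)σ² by independence, and minimising over a is a one-line calculus step. A SymPy sketch (illustrative only):

```python
import sympy as sp

a, sigma = sp.symbols("a sigma", positive=True)

# unbiasedness forces b = 1 - a; by independence of X1 and X2
var = sp.expand((a**2 + (1 - a) ** 2) * sigma**2)   # (2a^2 - 2a + 1) sigma^2
a_best = sp.solve(sp.diff(var, a), a)[0]            # a = 1/2, hence b = 1/2 as well
```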
MGF series expansion

A question is this type if and only if it asks to use the series expansion of e^x to extract moments from the MGF expression.

2 questions · Challenging (+1.0) · 5.6% of questions

Example:
6 The random variable \(X\) has a uniform distribution on the interval \([ - 1,1 ]\), so that its probability density function is given by $$f ( x ) = \begin{cases} \frac { 1 } { 2 } & - 1 \leqslant x \leqslant 1 \\ 0 & \text { otherwise } \end{cases}$$
  1. Show from the definition of the moment generating function that the moment generating function of \(X\) is \(\frac { \sinh t } { t }\).
  2. By using the series expansion of \(\sinh t\), find the variance of \(X\) and the value of \(\mathrm { E } \left( X ^ { 4 } \right)\).
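The extraction step in part 2 uses the fact that the coefficient of tʳ in the mgf's series is E(Xʳ)/r!. A SymPy sketch for sinh t / t (illustrative only; since E(X) = 0 here, Var(X) = E(X²)):

```python
import sympy as sp

t = sp.symbols("t")

M = sp.sinh(t) / t
s = sp.series(M, t, 0, 6).removeO()       # 1 + t^2/6 + t^4/120
EX2 = sp.factorial(2) * s.coeff(t, 2)     # E(X^2) = 1/3, so Var(X) = 1/3
EX4 = sp.factorial(4) * s.coeff(t, 4)     # E(X^4) = 1/5
```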
Verify MGF convergence condition

A question is this type if and only if it asks to state or explain why a condition like |t| < k is necessary for the MGF to exist.

1 question · Standard (+0.3) · 2.8% of questions

Example:
6 A continuous random variable \(X\) has probability density function $$\mathrm { f } ( x ) = \begin{cases} 4 x \mathrm { e } ^ { - 2 x } & x \geqslant 0 \\ 0 & \text { otherwise } . \end{cases}$$
  1. Show that the moment generating function \(\mathrm { M } _ { X } ( t )\) of \(X\) is \(\frac { 4 } { ( 2 - t ) ^ { 2 } }\). You may assume that \(x \mathrm { e } ^ { - k x } \rightarrow 0\) as \(x \rightarrow + \infty\).
  2. What condition on \(t\) is needed in finding \(\mathrm { M } _ { X } ( t )\) ?
  3. \(Y\) is the sum of three independent observations of \(X\). Find the moment generating function of \(Y\), and use your answer to find \(\operatorname { Var } ( Y )\).
Calculate moments from PDF

A question is this type if and only if it asks to find E(X), E(X²), or Var(X) by direct integration of the PDF (not using MGF).

1 question · Challenging (+1.2) · 2.8% of questions

Example:
7. [Diagram: a cyclic quadrilateral \(ABCD\)] The diagram shows a cyclic quadrilateral \(A B C D\), where \(\widehat { B A D } = \alpha , \widehat { B C D } = \beta\) and \(\alpha + \beta = 180 ^ { \circ }\). These angles are measured.
The random variables \(X\) and \(Y\) denote the measured values, in degrees, of \(\widehat { B A D }\) and \(\widehat { B C D }\) respectively. You are given that \(X\) and \(Y\) are independently normally distributed with standard deviation \(\sigma\) and means \(\alpha\) and \(\beta\) respectively.
  1. Calculate, correct to two decimal places, the probability that \(X + Y\) will differ from \(180 ^ { \circ }\) by less than \(\sigma\).
  2. Show that \(T _ { 1 } = 45 ^ { \circ } + \frac { 1 } { 4 } ( 3 X - Y )\) is an unbiased estimator for \(\alpha\) and verify that it is a better estimator than \(X\) for \(\alpha\).
  3. Now consider \(T _ { 2 } = \lambda X + ( 1 - \lambda ) \left( 180 ^ { \circ } - Y \right)\).
    1. Show that \(T _ { 2 }\) is an unbiased estimator for \(\alpha\) for all values of \(\lambda\).
    2. Find \(\operatorname { Var } \left( T _ { 2 } \right)\) in terms of \(\lambda\) and \(\sigma\).
    3. Hence determine the value of \(\lambda\) which gives the best unbiased estimator for \(\alpha\).
Verify PDF integrates to 1

A question is this type if and only if it asks to verify or show that the integral of a given PDF over its domain equals 1.

1 question · Standard (+0.3) · 2.8% of questions

Example:
1 The random variable \(X\) has the following probability density function, in which \(a\) is a (positive) parameter. $$\mathrm { f } ( x ) = \frac { 2 } { a } x \mathrm { e } ^ { - x ^ { 2 } / a } , \quad x \geqslant 0 .$$
  1. Verify that \(\int _ { 0 } ^ { \infty } \mathrm { f } ( x ) \mathrm { d } x = 1\).
  2. Show that \(\mathrm { E } \left( X ^ { 2 } \right) = a\) and \(\mathrm { E } \left( X ^ { 4 } \right) = 2 a ^ { 2 }\). The parameter \(a\) is to be estimated by maximum likelihood based on an independent random sample from the distribution, \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\).
  3. Show that the logarithm of the likelihood function is $$n \ln 2 - n \ln a + \sum _ { i = 1 } ^ { n } \ln X _ { i } - \frac { 1 } { a } \sum _ { i = 1 } ^ { n } X _ { i } ^ { 2 }$$ Hence obtain the maximum likelihood estimator, \(\hat { a }\), for \(a\).
    [You are not required to verify that any turning point you find is a maximum.]
  4. Using the results from part (ii), show that \(\hat { a }\) is unbiased for \(a\) and find the variance of \(\hat { a }\).
  5. In a particular random sample from this distribution, \(n = 100\) and \(\sum x _ { i } ^ { 2 } = 147.1\). Obtain an approximate 95\% confidence interval for \(a\). (You may assume that the Central Limit Theorem holds in this case.)
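Parts 1 and 2 of the example are direct integrations; substituting u = x² turns the density into an exponential with mean a, which explains the clean answers. A SymPy check (illustrative only):

```python
import sympy as sp

x, a = sp.symbols("x a", positive=True)

pdf = (2 / a) * x * sp.exp(-(x**2) / a)
total = sp.integrate(pdf, (x, 0, sp.oo))        # 1: f is a valid pdf
EX2 = sp.integrate(x**2 * pdf, (x, 0, sp.oo))   # a
EX4 = sp.integrate(x**4 * pdf, (x, 0, sp.oo))   # 2 a^2
```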
Find parameter from PDF

A question is this type if and only if it asks to find a constant k or parameter in a PDF by using the condition that the PDF integrates to 1.

1 question · Standard (+0.3) · 2.8% of questions

Example:
7 The continuous random variable \(X\) has probability density function $$f ( x ) = \left\{ \begin{array} { c l } \frac { k } { ( x + \theta ) ^ { 5 } } & \text { for } x \geqslant 0 \\ 0 & \text { otherwise } \end{array} \right.$$ where \(k\) is a positive constant and \(\theta\) is a parameter taking positive values.
  1. Find an expression for \(k\) in terms of \(\theta\).
  2. Show that \(\mathrm { E } ( X ) = \frac { 1 } { 3 } \theta\). You are given that \(\operatorname { Var } ( X ) = \frac { 2 } { 9 } \theta ^ { 2 }\). A random sample \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\) of \(n\) observations of \(X\) is obtained. The estimator \(T _ { 1 }\) is defined as \(T _ { 1 } = \frac { 3 } { n } \sum _ { i = 1 } ^ { n } X _ { i }\).
  3. Show that \(T _ { 1 }\) is an unbiased estimator of \(\theta\), and find the variance of \(T _ { 1 }\).
  4. A second unbiased estimator \(T _ { 2 }\) is defined by \(T _ { 2 } = \frac { 1 } { 3 } \left( X _ { 1 } + 3 X _ { 2 } + 5 X _ { 3 } \right)\). For the case \(n = 3\), which of \(T _ { 1 }\) and \(T _ { 2 }\) is more efficient?
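The normalisation step this type names is a single improper integral set equal to 1. A SymPy sketch for parts 1 and 2 of the example (illustrative only):

```python
import sympy as sp

x = sp.symbols("x", positive=True)
k, theta = sp.symbols("k theta", positive=True)

total = sp.integrate(k / (x + theta) ** 5, (x, 0, sp.oo))      # k/(4 theta^4)
k_val = sp.solve(sp.Eq(total, 1), k)[0]                        # k = 4 theta^4
EX = sp.integrate(x * k_val / (x + theta) ** 5, (x, 0, sp.oo)) # theta/3
```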
Derive MGF for discrete distribution

A question is this type if and only if it asks to obtain the MGF for a discrete random variable by summing e^(tx) times probabilities.

1 question · Challenging (+1.8) · 2.8% of questions

Example:
2 The random variable \(X\) takes the values \(- 2\), \(0\) and \(2\), each with probability \(\frac { 1 } { 3 }\).
  1. Write down the values of
    (A) \(\mu\), the mean of \(X\),
    (B) \(\mathrm { E } \left( X ^ { 2 } \right)\),
    (C) \(\sigma ^ { 2 }\), the variance of \(X\).
  2. Obtain the moment generating function (mgf) of \(X\). A random sample of \(n\) independent observations on \(X\) has sample mean \(\bar { X }\), and the standardised mean is denoted by \(Z\) where $$Z = \frac { \bar { X } - \mu } { \frac { \sigma } { \sqrt { n } } }$$
  3. Stating carefully the required general results for mgfs of sums and of linear transformations, show that the mgf of \(Z\) is $$M _ { Z } ( \theta ) = \left\{ \frac { 1 } { 3 } \left( 1 + e ^ { \frac { \theta \sqrt { 3 } } { \sqrt { 2 n } } } + e ^ { - \frac { \theta \sqrt { 3 } } { \sqrt { 2 n } } } \right) \right\} ^ { n } .$$
  4. By expanding the exponential functions in \(\mathrm { M } _ { Z } ( \theta )\), show that, for large \(n\), $$\mathrm { M } _ { Z } ( \theta ) \approx \left( 1 + \frac { \theta ^ { 2 } } { 2 n } \right) ^ { n }$$
  5. Use the result \(\mathrm { e } ^ { y } = \lim _ { n \rightarrow \infty } \left( 1 + \frac { y } { n } \right) ^ { n }\) to find the limit of \(\mathrm { M } _ { Z } ( \theta )\) as \(n \rightarrow \infty\), and deduce the approximate distribution of \(Z\) for large \(n\).
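For a discrete variable the mgf is the probability-weighted sum of e^(tx), and the early parts of the example follow by differentiating it at 0. A SymPy sketch (illustrative only):

```python
import sympy as sp

t = sp.symbols("t")

# X takes the values -2, 0, 2 with probability 1/3 each
M = sum(sp.exp(t * v) for v in (-2, 0, 2)) / 3   # (e^{-2t} + 1 + e^{2t})/3
mean = sp.diff(M, t).subs(t, 0)                  # mu = 0
EX2 = sp.diff(M, t, 2).subs(t, 0)                # E(X^2) = 8/3, so sigma^2 = 8/3
```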
MGF of sum of variables

A question is this type if and only if it asks to find the MGF of a sum of independent random variables by multiplying individual MGFs.

0 questions · 0.0% of questions
Find variance of estimator

A question is this type if and only if it asks to calculate the variance of a proposed estimator (not using MGF).

0 questions · 0.0% of questions