Moment generating functions

25 questions · 15 question types identified

Derive MGF from PDF

A question is of this type if and only if it asks to derive or show the moment generating function by integrating the PDF with e^(tx).

7 questions · 28.0% of questions · Example:
3 The continuous random variable \(X\) has probability density function given by $$\mathrm { f } ( x ) = \begin{cases} \mathrm { e } ^ { 2 x } & x < 0 \\ \mathrm { e } ^ { - 2 x } & x \geqslant 0 \end{cases}$$
  1. Show that the moment generating function of \(X\) is \(\frac { 4 } { 4 - t ^ { 2 } }\), where \(| t | < 2\), and explain why the condition \(| t | < 2\) is necessary.
  2. Find \(\operatorname { Var } ( X )\).
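A quick numerical cross-check of the closed form in part 1 (a Python sketch; the density is the two-sided exponential above, and the helper names are my own):

```python
import math

def mgf_numeric(t, half_width=60.0, steps=200_000):
    """Trapezoidal estimate of E[e^(tX)] for f(x) = e^(2x) (x < 0), e^(-2x) (x >= 0)."""
    h = 2.0 * half_width / steps
    total = 0.0
    for i in range(steps + 1):
        x = -half_width + i * h
        fx = math.exp(2.0 * x) if x < 0 else math.exp(-2.0 * x)
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * math.exp(t * x) * fx
    return total * h

def mgf_closed(t):
    # Claimed closed form, valid only for |t| < 2
    return 4.0 / (4.0 - t * t)
```

For \(|t| \geqslant 2\) the integrand \(e^{tx}\mathrm{f}(x)\) stops decaying in one tail and the integral diverges, which is why the condition \(|t| < 2\) is necessary.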
Show unbiased estimator

A question is of this type if and only if it asks to prove that a given estimator is unbiased by showing that E(estimator) equals the parameter.

6 questions · 24.0% of questions · Example:
6 The continuous random variable \(X\) has probability density function given by $$\mathrm { f } ( x ) = \begin{cases} 0 & x < a , \\ \mathrm { e } ^ { - ( x - a ) } & x \geqslant a , \end{cases}$$ where \(a\) is a constant. \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\) are \(n\) independent observations of \(X\), where \(n \geqslant 4\).
  1. Show that \(\mathrm { E } ( X ) = a + 1\).
    \(T _ { 1 }\) and \(T _ { 2 }\) are proposed estimators of \(a\), where $$T _ { 1 } = X _ { 1 } + 2 X _ { 2 } - X _ { 3 } - X _ { 4 } - 1 \quad \text { and } \quad T _ { 2 } = \frac { X _ { 1 } + X _ { 2 } } { 4 } + \frac { X _ { 3 } + X _ { 4 } + \ldots + X _ { n } } { 2 ( n - 2 ) } - 1 .$$
  2. Show that \(T _ { 1 }\) and \(T _ { 2 }\) are unbiased estimators of \(a\).
  3. Determine which is the more efficient estimator.
  4. Suggest another unbiased estimator of \(a\) using all of the \(n\) observations.
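The unbiasedness of \(T_1\) and \(T_2\) can be sanity-checked by simulation (a Python sketch; the values of \(a\), \(n\), the trial count and the seed are arbitrary choices of mine):

```python
import random

def simulate_estimators(a=2.5, n=8, trials=100_000, seed=1):
    """Monte Carlo means of T1 and T2 for observations X = a + Exp(1)."""
    rng = random.Random(seed)
    t1_sum = t2_sum = 0.0
    for _ in range(trials):
        xs = [a + rng.expovariate(1.0) for _ in range(n)]
        t1 = xs[0] + 2 * xs[1] - xs[2] - xs[3] - 1
        t2 = (xs[0] + xs[1]) / 4 + sum(xs[2:]) / (2 * (n - 2)) - 1
        t1_sum += t1
        t2_sum += t2
    return t1_sum / trials, t2_sum / trials
```

Both sample means settle close to \(a\); the mean of \(T_2\) hugs it much more tightly, which is exactly the efficiency comparison asked for in part 3.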
Calculate moments from PDF

A question is of this type if and only if it asks to find E(X), E(X²), or Var(X) by direct integration of the PDF (not using the MGF).

3 questions · 12.0% of questions · Example:
7. The discrete random variable \(X\) has the following probability distribution, where \(\theta\) is an unknown parameter belonging to the interval \(\left( 0 , \frac { 1 } { 3 } \right)\).
Value of \(X\): 1, 3, 5
Probability: \(\theta\), \(1 - 3 \theta\), \(2 \theta\)
  1. Obtain an expression for \(E ( X )\) in terms of \(\theta\) and show that $$\operatorname { Var } ( X ) = 4 \theta ( 3 - \theta ) .$$ In order to estimate the value of \(\theta\), a random sample of \(n\) observations on \(X\) was obtained and \(\bar { X }\) denotes the sample mean.
    1. Show that $$V = \frac { \bar { X } - 3 } { 2 }$$ is an unbiased estimator for \(\theta\).
    2. Find an expression for the variance of \(V\).
  2. Let \(Y\) denote the number of observations in the random sample that are equal to 1. Show that $$W = \frac { Y } { n }$$ is an unbiased estimator for \(\theta\) and find an expression for \(\operatorname { Var } ( W )\).
  3. Determine which of \(V\) and \(W\) is the better estimator, explaining your method clearly.
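Because the distribution is finite and discrete, the identities in part 1 can be checked exactly (a Python sketch; `moments` is my own helper name):

```python
def moments(theta):
    """Exact mean and variance of X with P(X=1)=theta, P(X=3)=1-3*theta, P(X=5)=2*theta."""
    pmf = {1: theta, 3: 1 - 3 * theta, 5: 2 * theta}
    mean = sum(x * p for x, p in pmf.items())
    second = sum(x * x * p for x, p in pmf.items())
    return mean, second - mean ** 2
```

Since \(E(X) = 3 + 2\theta\), the estimator \(V = (\bar{X} - 3)/2\) satisfies \(E(V) = \theta\), which is the unbiasedness claim of part 1.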
Maximum likelihood estimation

A question is of this type if and only if it asks to find the maximum likelihood estimator by maximizing the likelihood or log-likelihood function.

2 questions · 8.0% of questions · Example:
1 The random variable \(X\) has the Normal distribution with mean 0 and variance \(\theta\), so that its probability density function is $$\mathrm { f } ( x ) = \frac { 1 } { \sqrt { 2 \pi \theta } } \mathrm { e } ^ { - x ^ { 2 } / 2 \theta } , \quad - \infty < x < \infty$$ where \(\theta ( \theta > 0 )\) is unknown. A random sample of \(n\) observations from \(X\) is denoted by \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\).
  1. Find \(\hat { \theta }\), the maximum likelihood estimator of \(\theta\).
  2. Show that \(\hat { \theta }\) is an unbiased estimator of \(\theta\).
  3. In large samples, the variance of \(\hat { \theta }\) may be estimated by \(\frac { 2 \hat { \theta } ^ { 2 } } { n }\). Use this and the results of parts (i) and (ii) to find an approximate \(95 \%\) confidence interval for \(\theta\) in the case when \(n = 100\) and \(\Sigma X _ { i } ^ { 2 } = 1000\).
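The stationary point of the log-likelihood gives \(\hat{\theta} = \frac{1}{n}\sum X_i^2\). A Python sketch to confirm that this maximises the log-likelihood and to reproduce the interval in part (iii) (the function names are mine; 1.96 is the usual large-sample Normal quantile):

```python
import math

def log_likelihood(theta, xs):
    """Log-likelihood of N(0, theta) for data xs."""
    n = len(xs)
    return -0.5 * n * math.log(2.0 * math.pi * theta) - sum(x * x for x in xs) / (2.0 * theta)

def theta_mle(xs):
    # Setting d/dtheta log L = 0 gives theta_hat = (1/n) * sum(x_i^2)
    return sum(x * x for x in xs) / len(xs)

def approx_ci(theta_hat, n, z=1.96):
    # Large-sample interval using Var(theta_hat) ~ 2 * theta_hat^2 / n
    half = z * math.sqrt(2.0 * theta_hat ** 2 / n)
    return theta_hat - half, theta_hat + half
```

With \(n = 100\) and \(\Sigma X_i^2 = 1000\), this gives \(\hat{\theta} = 10\) and an interval of roughly \((7.23,\ 12.77)\).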
Use MGF to find moments

A question is of this type if and only if it asks to use the MGF (by differentiation) to find the mean, variance, or higher moments of a distribution.

1 question · 4.0% of questions · Example:
5 The random variable \(X\) has a Poisson distribution with mean \(\lambda\). It is given that the moment generating function of \(X\) is \(e ^ { \lambda \left( e ^ { t } - 1 \right) }\).
  1. Use the moment generating function to verify that the mean of \(X\) is \(\lambda\), and to show that the variance of \(X\) is also \(\lambda\).
  2. Five independent observations of \(X\) are added to produce a new variable \(Y\). Find the moment generating function of \(Y\), simplifying your answer.
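Differentiating the MGF at \(t = 0\) can be mimicked numerically with central differences (a Python sketch; the step size h and the value of λ are arbitrary):

```python
import math

def poisson_mgf(t, lam):
    # Given in the question: M_X(t) = exp(lam * (e^t - 1))
    return math.exp(lam * (math.exp(t) - 1.0))

def mean_var_from_mgf(M, h=1e-4):
    """Mean and variance via central-difference estimates of M'(0) and M''(0)."""
    m1 = (M(h) - M(-h)) / (2.0 * h)
    m2 = (M(h) - 2.0 * M(0.0) + M(-h)) / (h * h)
    return m1, m2 - m1 * m1
```

For part 2, \(Y = X_1 + \ldots + X_5\) has MGF \(M_X(t)^5 = e^{5\lambda(e^t - 1)}\), i.e. \(Y\) is Poisson with mean \(5\lambda\).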
MGF of transformed variable

A question is of this type if and only if it asks to find or explain the MGF of a transformed random variable (e.g., -X, aX+b, or X²).

1 question · 4.0% of questions · Example:
2
  1. The random variable \(Z\) has the standard Normal distribution with probability density function $$\mathrm { f } ( z ) = \frac { 1 } { \sqrt { 2 \pi } } \mathrm { e } ^ { - z ^ { 2 } / 2 } , \quad - \infty < z < \infty$$ Obtain the moment generating function of \(Z\).
  2. Let \(\mathrm { M } _ { Y } ( t )\) denote the moment generating function of the random variable \(Y\). Show that the moment generating function of the random variable \(a Y + b\), where \(a\) and \(b\) are constants, is \(\mathrm { e } ^ { b t } \mathrm { M } _ { Y } ( a t )\).
  3. Use the results in parts (i) and (ii) to obtain the moment generating function \(\mathrm { M } _ { X } ( t )\) of the random variable \(X\) having the Normal distribution with parameters \(\mu\) and \(\sigma ^ { 2 }\).
  4. If \(W = \mathrm { e } ^ { X }\) where \(X\) is as in part (iii), \(W\) is said to have a lognormal distribution. Show that, for any positive integer \(k\), the expected value of \(W ^ { k }\) is \(\mathrm { M } _ { X } ( k )\). Use this result to find the expected value and variance of the lognormal distribution.
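The linear-transformation rule of part (ii) can be checked against simulation in the Normal case (a Python sketch; the parameter values and the seed are arbitrary choices of mine):

```python
import math
import random

def normal_mgf(t, mu=0.0, sigma=1.0):
    # Part (iii): M_X(t) = exp(mu*t + sigma^2 * t^2 / 2)
    return math.exp(mu * t + 0.5 * (sigma * t) ** 2)

def linear_rule(a, b, t):
    # Part (ii): M_{aZ+b}(t) = e^{bt} * M_Z(at), with Z standard Normal
    return math.exp(b * t) * normal_mgf(a * t)

def mc_mgf(a, b, t, trials=200_000, seed=3):
    """Monte Carlo estimate of E[e^{t(aZ+b)}] for Z ~ N(0, 1)."""
    rng = random.Random(seed)
    return sum(math.exp(t * (a * rng.gauss(0.0, 1.0) + b)) for _ in range(trials)) / trials
```

The second check below is part (iii) itself: \(X = \sigma Z + \mu\), so \(M_X(t) = e^{\mu t} M_Z(\sigma t)\).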
Verify PDF integrates to 1

A question is of this type if and only if it asks to verify or show that the integral of a given PDF over its domain equals 1.

1 question · 4.0% of questions · Example:
1 The random variable \(X\) has the following probability density function, in which \(a\) is a (positive) parameter. $$\mathrm { f } ( x ) = \frac { 2 } { a } x \mathrm { e } ^ { - x ^ { 2 } / a } , \quad x \geqslant 0 .$$
  1. Verify that \(\int _ { 0 } ^ { \infty } \mathrm { f } ( x ) \mathrm { d } x = 1\).
  2. Show that \(\mathrm { E } \left( X ^ { 2 } \right) = a\) and \(\mathrm { E } \left( X ^ { 4 } \right) = 2 a ^ { 2 }\). The parameter \(a\) is to be estimated by maximum likelihood based on an independent random sample from the distribution, \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\).
  3. Show that the logarithm of the likelihood function is $$n \ln 2 - n \ln a + \sum _ { i = 1 } ^ { n } \ln X _ { i } - \frac { 1 } { a } \sum _ { i = 1 } ^ { n } X _ { i } ^ { 2 }$$ Hence obtain the maximum likelihood estimator, \(\hat { a }\), for \(a\).
    [You are not required to verify that any turning point you find is a maximum.]
  4. Using the results from part (ii), show that \(\hat { a }\) is unbiased for \(a\) and find the variance of \(\hat { a }\).
  5. In a particular random sample from this distribution, \(n = 100\) and \(\sum x _ { i } ^ { 2 } = 147.1\). Obtain an approximate 95% confidence interval for \(a\). (You may assume that the Central Limit Theorem holds in this case.)
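Parts 1 and 2 are definite integrals, so they can be verified by quadrature, and the MLE from part 3 has the simple closed form \(\hat{a} = \frac{1}{n}\sum x_i^2\). A Python sketch (the helper names and the test value of \(a\) are mine):

```python
import math

def density(x, a):
    # f(x) = (2/a) * x * e^(-x^2 / a), x >= 0
    return (2.0 / a) * x * math.exp(-x * x / a)

def trapezoid(g, lo, hi, steps=100_000):
    """Composite trapezoidal rule for a smooth, rapidly decaying integrand."""
    h = (hi - lo) / steps
    s = 0.5 * (g(lo) + g(hi))
    for i in range(1, steps):
        s += g(lo + i * h)
    return s * h

def a_mle(xs):
    # From the log-likelihood in part 3: a_hat = (1/n) * sum(x_i^2)
    return sum(x * x for x in xs) / len(xs)
```

With \(n = 100\) and \(\sum x_i^2 = 147.1\) this gives \(\hat{a} \approx 1.471\), the centre of the confidence interval in part 5.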
Find parameter from PDF

A question is of this type if and only if it asks to find a constant k or a parameter in a PDF by using the condition that the PDF integrates to 1.

1 question · 4.0% of questions · Example:
7 The continuous random variable \(X\) has probability density function $$f ( x ) = \left\{ \begin{array} { c l } \frac { k } { ( x + \theta ) ^ { 5 } } & \text { for } x \geqslant 0 \\ 0 & \text { otherwise } \end{array} \right.$$ where \(k\) is a positive constant and \(\theta\) is a parameter taking positive values.
  1. Find an expression for \(k\) in terms of \(\theta\).
  2. Show that \(\mathrm { E } ( X ) = \frac { 1 } { 3 } \theta\). You are given that \(\operatorname { Var } ( X ) = \frac { 2 } { 9 } \theta ^ { 2 }\). A random sample \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\) of \(n\) observations of \(X\) is obtained. The estimator \(T _ { 1 }\) is defined as \(T _ { 1 } = \frac { 3 } { n } \sum _ { i = 1 } ^ { n } X _ { i }\).
  3. Show that \(T _ { 1 }\) is an unbiased estimator of \(\theta\), and find the variance of \(T _ { 1 }\).
  4. A second unbiased estimator \(T _ { 2 }\) is defined by \(T _ { 2 } = \frac { 1 } { 3 } \left( X _ { 1 } + 3 X _ { 2 } + 5 X _ { 3 } \right)\). For the case \(n = 3\), which of \(T _ { 1 }\) and \(T _ { 2 }\) is more efficient?
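The normalisation condition forces \(k = 4\theta^4\), since \(\int_0^\infty (x+\theta)^{-5}\,\mathrm{d}x = \frac{1}{4\theta^4}\). A Python sketch that checks this and \(E(X) = \frac{1}{3}\theta\) numerically (the value of \(\theta\) and the truncation point are arbitrary choices of mine):

```python
def pdf(x, theta):
    # f(x) = k / (x + theta)^5 with k = 4*theta^4 from the normalisation condition
    k = 4.0 * theta ** 4
    return k / (x + theta) ** 5

def trapezoid(g, lo, hi, steps=200_000):
    """Composite trapezoidal rule; the x^-5 tail decays fast enough to truncate."""
    h = (hi - lo) / steps
    s = 0.5 * (g(lo) + g(hi))
    for i in range(1, steps):
        s += g(lo + i * h)
    return s * h
```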
Compare estimator properties

A question is of this type if and only if it asks to compare two or more estimators based on bias, variance, or efficiency criteria.

1 question · 4.0% of questions · Example:
3. A technician is trying to estimate the area \(\mu ^ { 2 }\) of a metal square. The independent random variables \(X _ { 1 }\) and \(X _ { 2 }\) are each distributed \(\mathrm { N } \left( \mu , \sigma ^ { 2 } \right)\) and represent two measurements of the sides of the square. Two estimators of the area, \(A _ { 1 }\) and \(A _ { 2 }\), are proposed where $$A _ { 1 } = X _ { 1 } X _ { 2 } \quad \text { and } \quad A _ { 2 } = \left( \frac { X _ { 1 } + X _ { 2 } } { 2 } \right) ^ { 2 } .$$ [You may assume that if \(X _ { 1 }\) and \(X _ { 2 }\) are independent random variables then $$\left. \mathrm { E } \left( X _ { 1 } X _ { 2 } \right) = \mathrm { E } \left( X _ { 1 } \right) \mathrm { E } \left( X _ { 2 } \right) \right]$$
  1. Find \(\mathrm { E } \left( A _ { 1 } \right)\) and show that \(\mathrm { E } \left( A _ { 2 } \right) = \mu ^ { 2 } + \frac { \sigma ^ { 2 } } { 2 }\).
  2. Find the bias of each of these estimators. The technician is told that \(\operatorname { Var } \left( A _ { 1 } \right) = \sigma ^ { 4 } + 2 \mu ^ { 2 } \sigma ^ { 2 }\) and \(\operatorname { Var } \left( A _ { 2 } \right) = \frac { 1 } { 2 } \sigma ^ { 4 } + 2 \mu ^ { 2 } \sigma ^ { 2 }\). The technician decided to use \(A _ { 1 }\) as the estimator for \(\mu ^ { 2 }\).
  3. Suggest a possible reason for this decision. A statistician suggests taking a random sample of \(n\) measurements of sides of the square and finding the mean \(\bar { X }\). He knows that \(\mathrm { E } \left( \bar { X } ^ { 2 } \right) = \mu ^ { 2 } + \frac { \sigma ^ { 2 } } { n }\) and \(\operatorname { Var } \left( \bar { X } ^ { 2 } \right) = \frac { 2 \sigma ^ { 4 } } { n ^ { 2 } } + \frac { 4 \sigma ^ { 2 } \mu ^ { 2 } } { n }\).
  4. Explain whether or not \(\bar { X } ^ { 2 }\) is a consistent estimator of \(\mu ^ { 2 }\).
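The expectations in part 1 can be illustrated by simulation (a Python sketch; μ, σ, the trial count and the seed are arbitrary):

```python
import random

def mc_area_estimators(mu=3.0, sigma=1.0, trials=200_000, seed=7):
    """Monte Carlo means of A1 = X1*X2 and A2 = ((X1 + X2)/2)^2."""
    rng = random.Random(seed)
    s1 = s2 = 0.0
    for _ in range(trials):
        x1 = rng.gauss(mu, sigma)
        x2 = rng.gauss(mu, sigma)
        s1 += x1 * x2
        s2 += ((x1 + x2) / 2.0) ** 2
    return s1 / trials, s2 / trials
```

The first mean settles near \(\mu^2 = 9\) (so \(A_1\) is unbiased) and the second near \(\mu^2 + \frac{\sigma^2}{2} = 9.5\), exhibiting the bias of \(A_2\) asked about in part 2.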
Construct combined estimator

A question is of this type if and only if it asks to find coefficients for a weighted combination of estimators that is unbiased or has minimum variance.

1 question · 4.0% of questions · Example:
  1. Two organisations are each asked to carry out a survey to find out the proportion, \(p\), of the population that would vote for a particular political party.
The first organisation finds that out of \(m\) people, \(X\) would vote for this particular political party. The second organisation finds that out of \(n\) people, \(Y\) would vote for this particular political party. An unbiased estimator, \(Q\), of \(p\) is proposed where $$Q = k \left( \frac { X } { m } + \frac { Y } { n } \right)$$
  1. Show that \(k = \frac { 1 } { 2 }\). A second unbiased estimator, \(R\), of \(p\) is proposed where $$R = \frac { a X } { m } + \frac { b Y } { n }$$
  2. Show that \(a + b = 1\). Given that \(m = 100\) and \(n = 200\), and that \(R\) is a better estimator of \(p\) than \(Q\),
  3. calculate the range of possible values of \(a\). Show your working clearly.
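With \(\operatorname{Var}(X) = mpq\) and \(\operatorname{Var}(Y) = npq\) for the two binomial counts, \(\operatorname{Var}(Q) = \frac{pq}{4}\left(\frac{1}{m} + \frac{1}{n}\right)\) and \(\operatorname{Var}(R) = \frac{a^2 pq}{m} + \frac{(1-a)^2 pq}{n}\); for \(m = 100\), \(n = 200\), the condition \(\operatorname{Var}(R) < \operatorname{Var}(Q)\) reduces to \(12a^2 - 8a + 1 < 0\), i.e. \(\frac{1}{6} < a < \frac{1}{2}\). A Python sketch of the comparison (my own function names; the value of \(p\) is an arbitrary test choice, and cancels from the inequality):

```python
def var_Q(p, m=100, n=200):
    # Var of Q = (X/m + Y/n)/2, using Var(X/m) = p(1-p)/m
    q = 1.0 - p
    return 0.25 * (p * q / m + p * q / n)

def var_R(a, p, m=100, n=200):
    # Var of R = aX/m + bY/n with b = 1 - a (the unbiasedness condition of part 2)
    b = 1.0 - a
    q = 1.0 - p
    return a * a * p * q / m + b * b * p * q / n
```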
Derive MGF for discrete distribution

A question is of this type if and only if it asks to obtain the MGF for a discrete random variable by summing e^(tx) times probabilities.

1 question · 4.0% of questions · Example:
2 The random variable \(X\) takes the values \(- 2\), \(0\) and \(2\), each with probability \(\frac { 1 } { 3 }\).
  1. Write down the values of
    (A) \(\mu\), the mean of \(X\),
    (B) \(\mathrm { E } \left( X ^ { 2 } \right)\),
    (C) \(\sigma ^ { 2 }\), the variance of \(X\).
  2. Obtain the moment generating function (mgf) of \(X\). A random sample of \(n\) independent observations on \(X\) has sample mean \(\bar { X }\), and the standardised mean is denoted by \(Z\) where $$Z = \frac { \bar { X } - \mu } { \frac { \sigma } { \sqrt { n } } }$$
  3. Stating carefully the required general results for mgfs of sums and of linear transformations, show that the mgf of \(Z\) is $$M _ { Z } ( \theta ) = \left\{ \frac { 1 } { 3 } \left( 1 + e ^ { \frac { \theta \sqrt { 3 } } { \sqrt { 2 n } } } + e ^ { - \frac { \theta \sqrt { 3 } } { \sqrt { 2 n } } } \right) \right\} ^ { n } .$$
  4. By expanding the exponential functions in \(\mathrm { M } _ { Z } ( \theta )\), show that, for large \(n\), $$\mathrm { M } _ { Z } ( \theta ) \approx \left( 1 + \frac { \theta ^ { 2 } } { 2 n } \right) ^ { n }$$
  5. Use the result \(\mathrm { e } ^ { y } = \lim _ { n \rightarrow \infty } \left( 1 + \frac { y } { n } \right) ^ { n }\) to find the limit of \(\mathrm { M } _ { Z } ( \theta )\) as \(n \rightarrow \infty\), and deduce the approximate distribution of \(Z\) for large \(n\).
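The limiting argument in parts 4 and 5 (\(\mathrm{M}_Z(\theta) \to e^{\theta^2/2}\), the standard Normal mgf, so \(Z\) is approximately \(\mathrm{N}(0, 1)\) for large \(n\)) can be watched numerically (a Python sketch; the values of \(\theta\) and \(n\) are arbitrary):

```python
import math

def mgf_Z(theta, n):
    # Part 3: M_Z(theta) = {(1/3)(1 + e^c + e^(-c))}^n with c = theta * sqrt(3/(2n))
    c = theta * math.sqrt(3.0 / (2.0 * n))
    return ((1.0 + math.exp(c) + math.exp(-c)) / 3.0) ** n

def standard_normal_mgf(theta):
    # The limit as n -> infinity
    return math.exp(0.5 * theta * theta)
```

The gap between the two functions shrinks roughly like \(1/n\), consistent with the expansion in part 4.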
MGF of sum of variables

A question is of this type if and only if it asks to find the MGF of a sum of independent random variables by multiplying individual MGFs.

0 questions · 0.0% of questions
Verify MGF convergence condition

A question is of this type if and only if it asks to state or explain why a condition like |t| < k is necessary for the MGF to exist.

0 questions · 0.0% of questions
Find variance of estimator

A question is of this type if and only if it asks to calculate the variance of a proposed estimator (not using the MGF).

0 questions · 0.0% of questions
MGF series expansion

A question is of this type if and only if it asks to use the series expansion of e^x to extract moments from the MGF expression.

0 questions · 0.0% of questions