The Gamma Distribution

6 questions · 11 question types identified

Deriving moment generating function

A question is of this type if and only if it asks to derive or show the moment generating function (mgf) of a gamma distribution or a related distribution, typically using integration and substitution.

2 questions · Challenging (+1.0) · 33.3% of questions

Example:
2 [In this question, you may use the result \(\int_0^\infty u^m \mathrm{e}^{-u}\,\mathrm{d}u = m!\) for any non-negative integer \(m\).]
The random variable \(X\) has probability density function $$\mathrm { f } ( x ) = \begin{cases} \frac { \lambda ^ { k + 1 } x ^ { k } \mathrm { e } ^ { - \lambda x } } { k ! } , & x > 0 \\ 0 , & \text { elsewhere } \end{cases}$$ where \(\lambda > 0\) and \(k\) is a non-negative integer.
  1. Show that the moment generating function of \(X\) is \(\left( \frac { \lambda } { \lambda - \theta } \right) ^ { k + 1 }\).
  2. The random variable \(Y\) is the sum of \(n\) independent random variables each distributed as \(X\). Find the moment generating function of \(Y\) and hence obtain the mean and variance of \(Y\). [8]
  3. State the probability density function of \(Y\).
  4. For the case \(\lambda = 1, k = 2\) and \(n = 5\), it may be shown that the definite integral of the probability density function of \(Y\) between limits 10 and \(\infty\) is 0.9165. Calculate the corresponding probability that would be given by a Normal approximation and comment briefly.
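The comparison in part 4 can be checked numerically. Below is a minimal stdlib-Python sketch (illustrative, not the intended exam working): for an integer shape parameter, the gamma tail probability equals a Poisson CDF, and the Normal tail comes from \(\operatorname{erfc}\).

```python
import math

# With lambda = 1, k = 2, n = 5, Y is a sum of 15 independent Exp(1)
# variables, i.e. Gamma(shape = 15, rate = 1), so E(Y) = Var(Y) = 15.
shape = (2 + 1) * 5                      # (k + 1) * n = 15
mean, var = float(shape), float(shape)

# Exact tail: for integer shape, P(Y > t) = P(Poisson(t) <= shape - 1).
t = 10.0
exact = sum(math.exp(-t) * t**j / math.factorial(j) for j in range(shape))

# Normal approximation with the same mean and variance.
z = (t - mean) / math.sqrt(var)
approx = 0.5 * math.erfc(z / math.sqrt(2))   # P(N(15, 15) > 10)

print(round(exact, 4))    # 0.9165, matching the quoted value
print(round(approx, 4))   # somewhat smaller: the approximation undershoots here
```

The gap between the two values is the "comment briefly" part: with shape 15 the distribution is still noticeably right-skewed, so the Normal approximation is only moderate.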
Verifying pdf normalization

A question is of this type if and only if it asks to explain or deduce that a probability density function integrates to 1, often involving gamma function integrals.

1 question · Standard (+0.3) · 16.7% of questions

Example:
5 The continuous random variable \(X\) has probability density function given by $$\mathrm { f } ( x ) = \begin{cases} \frac { 1 } { ( \alpha - 1 ) ! } x ^ { \alpha - 1 } \mathrm { e } ^ { - x } & x \geqslant 0 \\ 0 & x < 0 \end{cases}$$ where \(\alpha\) is a positive integer.
  1. Explain how you can deduce that \(\int_0^\infty x^{\alpha-1} \mathrm{e}^{-x}\,\mathrm{d}x = (\alpha-1)!\).
  2. Write down an integral for the moment generating function \(\mathrm { M } _ { X } ( t )\) of \(X\) and show, by using the substitution \(x = \frac { u } { 1 - t }\), that \(\mathrm { M } _ { X } ( t ) = ( 1 - t ) ^ { - \alpha }\).
  3. Use the moment generating function to find, in terms of \(\alpha\),
    1. \(\mathrm { E } ( X )\),
    2. \(\operatorname { Var } ( X )\).
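The closed form \(\mathrm{M}_X(t) = (1-t)^{-\alpha}\) (valid for \(t < 1\)) can be sanity-checked by numerical quadrature. The sketch below uses a plain trapezoidal rule rather than the substitution the question asks for; the cut-off `upper` and step count are arbitrary choices.

```python
import math

def mgf_numeric(alpha, t, upper=80.0, n=200_000):
    """Trapezoidal approximation of ∫_0^∞ e^{tx} x^{alpha-1} e^{-x} / (alpha-1)! dx."""
    h = upper / n
    c = math.factorial(alpha - 1)
    total = 0.0
    for i in range(n + 1):
        x = i * h
        w = 0.5 if i in (0, n) else 1.0          # trapezoid end-point weights
        total += w * x ** (alpha - 1) * math.exp((t - 1.0) * x) / c
    return total * h

alpha, t = 3, 0.4                                # arbitrary test values, t < 1
print(round(mgf_numeric(alpha, t), 4))           # numerical integral
print(round((1 - t) ** -alpha, 4))               # closed form: 4.6296
```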
Unbiased estimator verification

A question is of this type if and only if it asks to show that a given estimator (often involving the sample mean) is unbiased for a parameter of a gamma distribution.

1 question · Standard (+0.8) · 16.7% of questions

Example:
1 The random variable \(X\) has probability density function $$\mathrm{f}(x) = \frac{x \mathrm{e}^{-x/\lambda}}{\lambda^2} \quad (x > 0)$$ where \(\lambda\) is a parameter (\(\lambda > 0\)). \(X_1, X_2, \ldots, X_n\) are \(n\) independent observations on \(X\), and \(\bar{X} = \frac{1}{n} \sum_{i=1}^{n} X_i\) is their mean.
  1. Obtain \(\mathrm { E } ( X )\) and deduce that \(\hat { \lambda } = \frac { 1 } { 2 } \bar { X }\) is an unbiased estimator of \(\lambda\).
  2. Obtain \(\operatorname{Var}(\hat{\lambda})\).
  3. Explain why the results in parts (i) and (ii) indicate that \(\hat { \lambda }\) is a good estimator of \(\lambda\) in large samples.
  4. Suppose that \(n = 3\) and consider the alternative estimator $$\tilde { \lambda } = \frac { 1 } { 8 } X _ { 1 } + \frac { 1 } { 4 } X _ { 2 } + \frac { 1 } { 8 } X _ { 3 } .$$ Show that \(\tilde { \lambda }\) is an unbiased estimator of \(\lambda\). Find the relative efficiency of \(\tilde { \lambda }\) compared with \(\hat { \lambda }\). Which estimator do you prefer in this case?
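The claims about \(\hat{\lambda}\) and \(\tilde{\lambda}\) can be checked by simulation. The sketch below (illustrative, not the analytic working the question expects) uses the fact that \(X\) here is gamma with shape 2 and scale \(\lambda\); the value \(\lambda = 2\) is an arbitrary test choice.

```python
import random

# X ~ Gamma(shape = 2, scale = lam): f(x) = x e^{-x/lam} / lam^2, E(X) = 2*lam.
random.seed(1)
lam, reps = 2.0, 100_000   # lam = 2 is an arbitrary test value

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((x - m) ** 2 for x in v) / len(v)

hat_vals, tilde_vals = [], []
for _ in range(reps):
    x1, x2, x3 = (random.gammavariate(2, lam) for _ in range(3))
    hat_vals.append((x1 + x2 + x3) / 6)            # lambda-hat = X-bar / 2, n = 3
    tilde_vals.append(x1 / 8 + x2 / 4 + x3 / 8)    # lambda-tilde

# Both sample means should sit near lam = 2 (unbiasedness); the variance
# ratio should sit near Var(hat)/Var(tilde) = (lam^2/6) / (3*lam^2/16) = 8/9.
print(round(mean(hat_vals), 2), round(mean(tilde_vals), 2))
print(round(var(hat_vals) / var(tilde_vals), 2))
```

A relative efficiency of \(8/9 < 1\) means \(\tilde{\lambda}\) has the larger variance, so \(\hat{\lambda}\) is preferred.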
Method of moments estimation

A question is of this type if and only if it asks to find an estimate of a parameter by equating the theoretical expectation to the sample mean or, more generally, by using the method of moments.

1 question · Standard (+0.3) · 16.7% of questions

Example:
8 A random sample of 100 students were given a task and the time taken by each student to complete the task was recorded. The maximum time allowed to complete the task was one minute and all students completed the task within the maximum time. The times, \(T\) minutes, for the random sample of students are summarised as follows: \(n = 100, \quad \sum t = 61.88\).

A researcher proposes that \(T\) can be modelled by the continuous random variable with probability density function $$f(t) = \begin{cases} \alpha t^{\alpha-1} & 0 \leqslant t \leqslant 1, \\ 0 & \text{otherwise,} \end{cases}$$ where \(\alpha\) is a positive constant.

(a) In this question you must show detailed reasoning. By finding \(\mathrm{E}(T)\) according to the researcher's model, determine an approximation for the value of \(\alpha\). Give your answer correct to 3 significant figures.

Further information about the times taken for the sample of 100 students to complete the task is given in the table.
| Time \(t\) | \(0 \leqslant t < \frac{1}{3}\) | \(\frac{1}{3} \leqslant t < \frac{2}{3}\) | \(\frac{2}{3} \leqslant t \leqslant 1\) |
|---|---|---|---|
| Frequency | 18 | 37 | 45 |
(b) Using the value of \(\alpha\) found in part (a), determine the extent to which the proposed model is a good model. (Do not carry out a goodness of fit test.)
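Part (a) reduces to solving \(\mathrm{E}(T) = \frac{\alpha}{\alpha+1} = \bar{t}\), and part (b) can then compare expected and observed counts. A short sketch (an informal check, not a goodness-of-fit test):

```python
# Part (a): E(T) = ∫_0^1 t * alpha * t^(alpha-1) dt = alpha / (alpha + 1),
# equated to the sample mean 61.88 / 100.
tbar = 61.88 / 100
alpha = tbar / (1 - tbar)      # solves alpha / (alpha + 1) = tbar
print(round(alpha, 2))         # 1.62 (3 s.f.)

# Part (b), informally: under the model, F(t) = t^alpha on [0, 1], so the
# expected counts in the three tabulated intervals (out of 100) are:
cdf = lambda u: u ** alpha
bounds = [(0, 1 / 3), (1 / 3, 2 / 3), (2 / 3, 1)]
expected = [100 * (cdf(b) - cdf(a)) for a, b in bounds]
observed = [18, 37, 45]
print([round(e, 1) for e in expected])   # compare with the observed counts
```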
Computing expectation from pdf

A question is of this type if and only if it asks to calculate E(X) directly from the probability density function using integration, without using the mgf.

1 question · Standard (+0.3) · 16.7% of questions

Example:
11 The time, \(T\) years, before a particular type of washing machine breaks down may be taken to have probability density function f given by $$\mathrm { f } ( t ) = \begin{cases} a t \mathrm { e } ^ { - b t } & t > 0 \\ 0 & \text { otherwise } \end{cases}$$ where \(a\) and \(b\) are positive constants. It may be assumed that, if \(n\) is a positive integer, $$\int _ { 0 } ^ { \infty } t ^ { n } \mathrm { e } ^ { - b t } \mathrm {~d} t = \frac { n ! } { b ^ { n + 1 } }$$
  1. Records show that the mean of \(T\) is 1.5. Show that \(b = \frac{4}{3}\) and find the value of \(a\).
  2. Find \(\operatorname { Var } ( T )\).
  3. Calculate \(\mathrm { P } ( T < 1.5 )\). State, giving a reason, whether this value indicates that the median of \(T\) is smaller than the mean of \(T\) or greater than the mean of \(T\).
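Using the given integral result, parts (i)-(iii) reduce to a few arithmetic steps, sketched below (values only; the question expects the supporting working).

```python
import math

# Given: ∫_0^∞ t^n e^{-bt} dt = n! / b^(n+1).
# Normalisation: a * 1!/b^2 = 1, so a = b^2.
# Mean: a * 2!/b^3 = 2/b = 1.5 gives b = 4/3, hence a = 16/9.
b = 4 / 3
a = b ** 2
mean = a * 2 / b ** 3             # = 2 / b
var = a * 6 / b ** 4 - mean ** 2  # E(T^2) - [E(T)]^2

# P(T < 1.5): integrating a t e^{-bt} by parts gives 1 - e^{-bt} (1 + bt).
t = 1.5
p = 1 - math.exp(-b * t) * (1 + b * t)

print(round(mean, 3))   # 1.5
print(round(var, 3))    # 1.125
print(round(p, 3))      # 0.594 > 0.5: over half the mass lies below the mean,
                        # so the median is smaller than the mean
```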
Finding moments from mgf

A question is of this type if and only if it asks to use the moment generating function to find the expectation and/or variance of a distribution.

0 questions · 0.0% of questions
Sum of independent gamma variables

A question is of this type if and only if it asks to find the mgf or distribution of a sum of independent identically distributed gamma random variables.

0 questions · 0.0% of questions
Stating pdf of sum

A question is of this type if and only if it asks to write down or state the probability density function of a sum of gamma random variables.

0 questions · 0.0% of questions
Relating to chi-squared distribution

A question is of this type if and only if it asks to identify or deduce that a transformation of a gamma variable follows a chi-squared distribution by comparing moment generating functions.

0 questions · 0.0% of questions
Variance of estimator

A question is of this type if and only if it asks to calculate the variance of an estimator for a parameter of a gamma distribution.

0 questions · 0.0% of questions
Properties of estimators

A question is of this type if and only if it asks to explain or discuss why an estimator is good, consistent, or appropriate for large samples based on its properties.

0 questions · 0.0% of questions