Derive standard distribution PGF

Prove from first principles that a named distribution (Binomial, Geometric, Poisson) has a specific PGF formula.

7 questions

CAIE Further Paper 4 2020 November Q5
5 marks
5 The random variable \(X\) has the binomial distribution \(\mathrm { B } ( n , p )\).
  1. Write down an expression for \(\mathrm{P}(X = r)\) and hence show that the probability generating function of \(X\) is \((q + pt)^n\), where \(q = 1 - p\).
  2. Use the probability generating function of \(X\) to prove that \(\mathrm{E}(X) = np\) and \(\operatorname{Var}(X) = np(1 - p)\). [5]
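Not a model solution, but the two claimed results can be sanity-checked numerically: the PGF from the definition \(\sum_r \mathrm{P}(X=r)t^r\) should match \((q+pt)^n\), and the derivative identities \(G'(1)=\mathrm{E}(X)\), \(G''(1)+G'(1)-G'(1)^2=\operatorname{Var}(X)\) should recover \(np\) and \(npq\). The values of `n`, `p`, `t` below are arbitrary illustrative choices.

```python
from math import comb

# Numerical check (not a proof) of the stated B(n, p) results.
n, p, t = 8, 0.3, 0.7
q = 1 - p

# PGF from the definition: sum over the binomial pmf
G_def = sum(comb(n, r) * p**r * q**(n - r) * t**r for r in range(n + 1))
G_closed = (q + p * t)**n
print(abs(G_def - G_closed) < 1e-12)  # the two forms agree

# G'(t) = np(q + pt)^{n-1}, G''(t) = n(n-1)p^2 (q + pt)^{n-2};
# evaluate at t = 1, where q + p = 1.
G1 = n * p * (q + p)**(n - 1)
G2 = n * (n - 1) * p**2 * (q + p)**(n - 2)
print(abs(G1 - n * p) < 1e-12)                 # E(X) = np
print(abs(G2 + G1 - G1**2 - n * p * q) < 1e-12)  # Var(X) = npq
```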
CAIE Further Paper 4 2023 November Q5
5 The random variable \(X\) has the geometric distribution \(\operatorname { Geo } ( p )\).
  1. Show that the probability generating function of \(X\) is \(\frac{pt}{1 - qt}\), where \(q = 1 - p\).
  2. Use the probability generating function of \(X\) to show that \(\operatorname{Var}(X) = \frac{q}{p^2}\).
    Kenny throws an ordinary fair 6-sided dice repeatedly. The random variable \(X\) is the number of throws that Kenny takes in order to obtain a 6. The random variable \(Z\) denotes the sum of two independent values of \(X\).
  3. Find the probability generating function of \(Z\).
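A quick numerical check of the geometric results (illustrative values, not a solution): the truncated series \(\sum_{k\ge 1} q^{k-1}p\,t^k\) should match \(\frac{pt}{1-qt}\), and the derivatives of the closed form should give \(\operatorname{Var}(U)=q/p^2\). Here \(p=\frac16\) matches the dice setting; since the PGF of a sum of independent variables is the product of their PGFs, the PGF of \(Z\) is simply the square of \(G_X\).

```python
# Sanity check (not a proof) for Geo(p) on {1, 2, ...}.
p, t = 1/6, 0.5
q = 1 - p

G_def = sum(q**(k - 1) * p * t**k for k in range(1, 400))  # truncated series
G_closed = p * t / (1 - q * t)
print(abs(G_def - G_closed) < 1e-12)

# PGF of Z = X1 + X2 (independent): the product of the two identical PGFs
G_Z = G_closed**2

# G'(t) = p/(1 - qt)^2 and G''(t) = 2pq/(1 - qt)^3, evaluated at t = 1:
G1 = p / (1 - q)**2           # = 1/p
G2 = 2 * p * q / (1 - q)**3   # = 2q/p^2
print(abs(G2 + G1 - G1**2 - q / p**2) < 1e-9)  # Var(U) = q/p^2
```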
OCR S4 2007 June Q6
6 The discrete random variable \(X\) takes the values 0 and 1 with \(\mathrm { P } ( X = 0 ) = q\) and \(\mathrm { P } ( X = 1 ) = p\), where \(p + q = 1\).
  1. Write down the probability generating function of \(X\). The sum of \(n\) independent observations of \(X\) is denoted by \(S\).
  2. Write down the probability generating function of \(S\), and name the distribution of \(S\).
  3. Use the probability generating function of \(S\) to find \(\mathrm { E } ( S )\) and \(\operatorname { Var } ( S )\).
  4. The independent random variables \(Y\) and \(Z\) are such that \(Y\) has the distribution \(\mathrm{B}\left(10, \frac{1}{2}\right)\), and \(Z\) has probability generating function \(\mathrm{e}^{-(1-t)}\). Find the probability that the sum of one random observation of \(Y\) and one random observation of \(Z\) is equal to 2.
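One way to sanity-check the final part numerically: the PGF \(\mathrm{e}^{-(1-t)} = \mathrm{e}^{1(t-1)}\) is that of a Poisson variable with parameter 1, so \(\mathrm{P}(Y+Z=2)=\sum_{k=0}^{2}\mathrm{P}(Y=k)\,\mathrm{P}(Z=2-k)\). The sketch below just evaluates that convolution.

```python
from math import comb, exp, factorial

# Convolution check: Y ~ B(10, 1/2), Z ~ Po(1), want P(Y + Z = 2).
def p_binom(k, n=10, p=0.5):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def p_poisson(j, lam=1.0):
    return exp(-lam) * lam**j / factorial(j)

prob = sum(p_binom(k) * p_poisson(2 - k) for k in range(3))
print(round(prob, 4))  # 0.0199 to 4 d.p.
```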
OCR S4 2011 June Q1
1 The random variable \(X\) has the distribution \(\mathrm { B } ( n , p )\).
  1. Show, from the definition, that the probability generating function of \(X\) is \(( q + p t ) ^ { n }\), where \(q = 1 - p\).
  2. The independent random variable \(Y\) has the distribution \(\mathrm { B } ( 2 n , p )\) and \(T = X + Y\). Use probability generating functions to determine the distribution of \(T\), giving its parameters.
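For part 2, the key fact is that PGFs of independent variables multiply: \(G_T(t) = (q+pt)^n (q+pt)^{2n} = (q+pt)^{3n}\), the PGF of \(\mathrm{B}(3n, p)\). A quick cross-check (illustrative values of `n`, `p`) is to convolve the two pmfs directly and compare with the \(\mathrm{B}(3n, p)\) pmf.

```python
from math import comb

# Cross-check that X ~ B(n, p) plus independent Y ~ B(2n, p) is B(3n, p).
n, p = 4, 0.35
q = 1 - p

def binom_pmf(m):
    return [comb(m, k) * p**k * q**(m - k) for k in range(m + 1)]

pX, pY = binom_pmf(n), binom_pmf(2 * n)
# pmf of T = X + Y by direct convolution
pT = [sum(pX[i] * pY[r - i] for i in range(max(0, r - 2 * n), min(n, r) + 1))
      for r in range(3 * n + 1)]
pB3n = binom_pmf(3 * n)
print(max(abs(a - b) for a, b in zip(pT, pB3n)) < 1e-12)
```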
OCR S4 2012 June Q4
4 The random variable \(U\) has the distribution \(\operatorname { Geo } ( p )\).
  1. Show, from the definition, that the probability generating function (pgf) of \(U\) is given by $$G _ { U } ( t ) = \frac { p t } { 1 - q t } , \text { for } | t | < \frac { 1 } { q } ,$$ where \(q = 1 - p\).
  2. Explain why the condition \(| t | < \frac { 1 } { q }\) is necessary.
  3. Use the pgf to obtain \(\mathrm { E } ( U )\). Each packet of Corn Crisp cereal contains a voucher and \(20 \%\) of the vouchers have a gold star. When 4 gold stars have been collected a gift can be claimed. Let \(X\) denote the number of packets bought by a family up to and including the one from which the \(4 ^ { \text {th } }\) gold star is obtained.
  4. Obtain the pgf of \(X\).
  5. Find \(\mathrm { P } ( X = 6 )\).
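A quick numerical check of parts 4 and 5, under the standard reading: \(X\) is a sum of 4 independent \(\operatorname{Geo}(0.2)\) waiting times, so \(G_X(t) = \left(\frac{pt}{1-qt}\right)^4\), and \(\mathrm{P}(X=6)\) is the coefficient of \(t^6\). Expanding \((1-qt)^{-4} = \sum_k \binom{k+3}{3}(qt)^k\) gives \(\mathrm{P}(X=6) = \binom{5}{3} p^4 q^2\).

```python
from math import comb

# P(X = 6) for the 4th gold star with p = 0.2 (negative binomial form).
p = 0.2
q = 1 - p

# Coefficient of t^6 in (pt/(1-qt))^4: p^4 times the k = 2 term of the
# binomial series for (1 - qt)^{-4}, i.e. C(2+3, 3) q^2 = C(5, 3) q^2.
prob = comb(5, 3) * p**4 * q**2
print(prob)  # 10 * 0.2^4 * 0.8^2
```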
OCR MEI S4 2007 June Q2
2 The random variable \(X\) has the binomial distribution with parameters \(n\) and \(p\), i.e. \(X \sim \mathrm{B}(n, p)\).
  1. Show that the probability generating function of \(X\) is \(\mathrm { G } ( t ) = ( q + p t ) ^ { n }\), where \(q = 1 - p\).
  2. Hence obtain the mean \(\mu\) and variance \(\sigma ^ { 2 }\) of \(X\).
  3. Write down the mean and variance of the random variable \(Z = \frac { X - \mu } { \sigma }\).
  4. Write down the moment generating function of \(X\) and use the linear transformation result to show that the moment generating function of \(Z\) is $$\mathrm { M } _ { Z } ( \theta ) = \left( q \mathrm { e } ^ { - \frac { p \theta } { \sqrt { n p q } } } + p \mathrm { e } ^ { \frac { q \theta } { \sqrt { n p q } } } \right) ^ { n } .$$
  5. By expanding the exponential terms in \(\mathrm { M } _ { Z } ( \theta )\), show that the limit of \(\mathrm { M } _ { Z } ( \theta )\) as \(n \rightarrow \infty\) is \(\mathrm { e } ^ { \theta ^ { 2 } / 2 }\). You may use the result \(\lim _ { n \rightarrow \infty } \left( 1 + \frac { y + \mathrm { f } ( n ) } { n } \right) ^ { n } = \mathrm { e } ^ { y }\) provided \(\mathrm { f } ( n ) \rightarrow 0\) as \(n \rightarrow \infty\).
  6. What does the result in part (v) imply about the distribution of \(Z\) as \(n \rightarrow \infty\)? Explain your reasoning briefly.
  7. What does the result in part (vi) imply about the distribution of \(X\) as \(n \rightarrow \infty\)?
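The limit in part 5 can be illustrated numerically (this is not the required proof): for fixed \(\theta\), evaluate \(\mathrm{M}_Z(\theta)\) at increasing \(n\) and watch the error against \(\mathrm{e}^{\theta^2/2}\), the \(\mathrm{N}(0,1)\) MGF, shrink. The values of `p` and `theta` are illustrative.

```python
from math import exp, sqrt

# Numerical illustration of M_Z(theta) -> e^{theta^2/2} as n -> infinity.
p, theta = 0.3, 1.2
q = 1 - p
target = exp(theta**2 / 2)

def M_Z(n):
    s = sqrt(n * p * q)  # sigma = sqrt(npq)
    return (q * exp(-p * theta / s) + p * exp(q * theta / s))**n

errs = [abs(M_Z(n) - target) for n in (10**2, 10**4, 10**6)]
print(errs[0] > errs[1] > errs[2])  # error shrinks toward the N(0,1) MGF
```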
OCR MEI S4 2010 June Q2
2 The random variable \(X\) has the Poisson distribution with parameter \(\lambda\).
  1. Show that the probability generating function of \(X\) is \(\mathrm { G } ( t ) = \mathrm { e } ^ { \lambda ( t - 1 ) }\).
  2. Hence obtain the mean \(\mu\) and variance \(\sigma ^ { 2 }\) of \(X\).
  3. Write down the mean and variance of the random variable \(Z = \frac { X - \mu } { \sigma }\).
  4. Write down the moment generating function of \(X\). State the linear transformation result for moment generating functions and use it to show that the moment generating function of \(Z\) is $$\mathrm { M } _ { Z } ( \theta ) = \mathrm { e } ^ { \mathrm { f } ( \theta ) } \quad \text { where } \mathrm { f } ( \theta ) = \lambda \left( \mathrm { e } ^ { \theta / \sqrt { \lambda } } - \frac { \theta } { \sqrt { \lambda } } - 1 \right)$$
  5. Show that the limit of \(\mathrm { M } _ { Z } ( \theta )\) as \(\lambda \rightarrow \infty\) is \(\mathrm { e } ^ { \theta ^ { 2 } / 2 }\).
  6. Explain briefly why this implies that the distribution of \(Z\) tends to \(\mathrm{N}(0, 1)\) as \(\lambda \rightarrow \infty\). What does this imply about the distribution of \(X\) as \(\lambda \rightarrow \infty\)?
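As with the binomial question above, the limit in part 5 can be illustrated numerically (not a proof): the exponent \(\mathrm{f}(\theta) = \lambda\left(\mathrm{e}^{\theta/\sqrt{\lambda}} - \frac{\theta}{\sqrt{\lambda}} - 1\right)\) tends to \(\theta^2/2\) as \(\lambda \rightarrow \infty\), so \(\mathrm{M}_Z(\theta) \rightarrow \mathrm{e}^{\theta^2/2}\). The value of `theta` is illustrative.

```python
from math import exp, sqrt

# Numerical illustration: f(theta) -> theta^2 / 2 as lambda grows.
theta = 0.9

def f(lam):
    u = theta / sqrt(lam)
    return lam * (exp(u) - u - 1)  # the exponent of M_Z(theta)

errs = [abs(f(lam) - theta**2 / 2) for lam in (10.0, 1e3, 1e5)]
print(errs[0] > errs[1] > errs[2])  # error decreases with lambda
```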