Find PGF from probability distribution

Given a discrete probability distribution (table, formula, or scenario such as coin tosses or ball selection), construct the probability generating function as a polynomial or expression in \(t\).

9 questions

CAIE Further Paper 4 2021 June Q6
6 Tanji has a bag containing 4 red balls and 2 blue balls. He selects 3 balls at random from the bag, without replacement. The number of red balls selected by Tanji is denoted by \(X\).
  1. Find the probability generating function \(\mathrm { G } _ { \mathrm { X } } ( \mathrm { t } )\) of \(X\).
    Tanji also has two coins, each biased so that the probability of obtaining a head when it is thrown is \(\frac { 1 } { 4 }\). He throws the two coins at the same time. The number of heads obtained is denoted by \(Y\).
  2. Find the probability generating function \(\mathrm { G } _ { Y } ( \mathrm { t } )\) of \(Y\).
    The random variable \(Z\) is the sum of the number of red balls selected by Tanji and the number of heads obtained.
  3. Find the probability generating function of \(Z\), expressing your answer as a polynomial.
  4. Use the probability generating function of \(Z\) to find \(E ( Z )\) and \(\operatorname { Var } ( Z )\).
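A quick way to check parts 1 to 4 (not part of the paper) is to build the two generating functions symbolically and read \(\mathrm{E}(Z)\) and \(\operatorname{Var}(Z)\) from the derivatives of \(\mathrm{G}_Z\) at \(t = 1\). A minimal sketch in Python with sympy, assuming hypergeometric probabilities for \(X\) and the binomial form for \(Y\):

```python
# Sketch (not from the paper): hypergeometric pgf for X, binomial pgf for Y,
# then E(Z) and Var(Z) from the derivatives of G_Z at t = 1.
from sympy import symbols, binomial, Rational, expand, diff

t = symbols('t')

# X: reds in 3 picks without replacement from 4 red and 2 blue
G_X = sum(binomial(4, r) * binomial(2, 3 - r) / binomial(6, 3) * t**r for r in range(4))

# Y: heads in 2 tosses of a coin with P(head) = 1/4
G_Y = (Rational(3, 4) + Rational(1, 4)*t)**2

G_Z = expand(G_X * G_Y)                 # X and Y independent, so G_Z = G_X * G_Y
mean = diff(G_Z, t).subs(t, 1)          # E(Z) = G'_Z(1)
var = diff(G_Z, t, 2).subs(t, 1) + mean - mean**2
print(G_Z, mean, var)                   # expect E(Z) = 5/2, Var(Z) = 31/40
```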
CAIE Further Paper 4 2020 June Q6
6 A bag contains 4 red balls and 6 blue balls. Rassa selects two balls at random, without replacement, from the bag. The number of red balls selected by Rassa is denoted by \(X\).
  1. Find the probability generating function, \(\mathrm { G } _ { \mathrm { X } } ( \mathrm { t } )\), of \(X\).
    Rassa also tosses two coins. One coin is biased so that the probability of a head is \(\frac { 2 } { 3 }\). The other coin is biased so that the probability of a head is \(p\). The probability generating function of \(Y\), the number of heads obtained by Rassa, is \(\mathrm { G } _ { Y } ( \mathrm { t } )\). The coefficient of \(t\) in \(\mathrm { G } _ { Y } ( \mathrm { t } )\) is \(\frac { 7 } { 12 }\).
  2. Find \(\mathrm { G } _ { Y } ( \mathrm { t } )\).
    The random variable \(Z\) is the sum of the number of red balls selected and the number of heads obtained by Rassa.
  3. Find the probability generating function of \(Z\), expressing your answer as a polynomial.
  4. Use the probability generating function of \(Z\) to find \(\mathrm { E } ( Z )\).
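The same sympy approach works here; the extra step (not part of the paper) is using the stated coefficient of \(t\) in \(\mathrm{G}_Y(t)\) to pin down \(p\) before forming \(\mathrm{G}_Z\). A sketch:

```python
# Sketch (not from the paper): impose the coefficient-of-t condition to find p,
# then form G_Z = G_X * G_Y and read E(Z) from G'_Z(1).
from sympy import symbols, binomial, Rational, expand, diff, solve

t, p = symbols('t p')

# X: reds in 2 picks without replacement from 4 red and 6 blue
G_X = sum(binomial(4, r) * binomial(6, 2 - r) / binomial(10, 2) * t**r for r in range(3))

# Y: two independent biased coins with P(head) = 2/3 and p
G_Y = (Rational(1, 3) + Rational(2, 3)*t) * (1 - p + p*t)

# the coefficient of t in G_Y is given as 7/12
p_val = solve(expand(G_Y).coeff(t, 1) - Rational(7, 12), p)[0]
G_Y = G_Y.subs(p, p_val)

G_Z = expand(G_X * G_Y)
print(p_val)                            # expect 1/4
print(G_Z)                              # the polynomial asked for in part 3
print(diff(G_Z, t).subs(t, 1))          # E(Z) = 103/60
```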
OCR S4 2013 June Q5
5 The discrete random variable \(U\) has probability distribution given by $$\mathrm{P}(U = r) = \begin{cases} \frac{1}{16}\binom{4}{r} & r = 0, 1, 2, 3, 4 \\ 0 & \text{otherwise} \end{cases}$$
  1. Find and simplify the probability generating function (pgf) of \(U\).
  2. Use the pgf to find \(\mathrm { E } ( U )\) and \(\operatorname { Var } ( U )\).
  3. Identify the distribution of \(U\), giving the values of any parameters.
  4. Obtain the pgf of \(Y\), where \(Y = U ^ { 2 }\).
  5. State, giving a reason, whether you can obtain the pgf of \(U + Y\) by multiplying the pgf of \(U\) by the pgf of \(Y\).
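For a numerical check (not part of the question), the pgf of \(U\) can be assembled directly from the stated probabilities; factoring it exposes the binomial form, and attaching the same probabilities to \(t^{r^2}\) gives the pgf of \(Y = U^2\). A sympy sketch:

```python
# Sketch (not from the question): build the pgf of U from its pmf, identify the
# distribution by factoring, and form the pgf of Y = U**2 from the same pmf.
from sympy import symbols, binomial, Rational, diff, factor, expand

t = symbols('t')
pmf = {r: Rational(1, 16) * binomial(4, r) for r in range(5)}   # P(U = r)

G_U = sum(p * t**r for r, p in pmf.items())
print(factor(G_U))                     # (t + 1)**4 / 16, i.e. U ~ B(4, 1/2)

mean = diff(G_U, t).subs(t, 1)
var = diff(G_U, t, 2).subs(t, 1) + mean - mean**2
print(mean, var)                       # E(U) = 2, Var(U) = 1

# Y = U**2 takes the value r**2 with probability P(U = r)
G_Y = sum(p * t**(r**2) for r, p in pmf.items())
print(expand(G_Y))
```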
OCR MEI S4 2008 June Q1
1 The random variable \(X\) has the Poisson distribution with parameter \(\theta\) so that its probability function is $$\mathrm { P } ( X = x ) = \frac { \mathrm { e } ^ { - \theta } \theta ^ { x } } { x ! } , \quad x = 0,1,2 , \ldots$$ where \(\theta ( \theta > 0 )\) is unknown. A random sample of \(n\) observations from \(X\) is denoted by \(X _ { 1 } , X _ { 2 } , \ldots , X _ { n }\).
  1. Find \(\hat { \theta }\), the maximum likelihood estimator of \(\theta\). The value of \(\mathrm { P } ( X = 0 )\) is denoted by \(\lambda\).
  2. Write down an expression for \(\lambda\) in terms of \(\theta\).
  3. Let \(R\) denote the number of observations in the sample with value zero. By considering the binomial distribution with parameters \(n\) and \(\mathrm { e } ^ { - \theta }\), write down \(\mathrm { E } ( R )\) and \(\operatorname { Var } ( R )\). Deduce that the observed proportion of observations in the sample with value zero, denoted by \(\tilde { \lambda }\), is an unbiased estimator of \(\lambda\) with variance \(\frac { \mathrm { e } ^ { - \theta } \left( 1 - \mathrm { e } ^ { - \theta } \right) } { n }\).
  4. In large samples, the variance of the maximum likelihood estimator of \(\lambda\) may be taken as \(\frac { \theta \mathrm { e } ^ { - 2 \theta } } { n }\). Use this and the appropriate result from part (iii) to show that the relative efficiency of \(\tilde { \lambda }\) with respect to the maximum likelihood estimator is \(\frac { \theta } { \mathrm { e } ^ { \theta } - 1 }\). Show that this expression is always less than 1 . Show also that it is near 1 if \(\theta\) is small and near 0 if \(\theta\) is large.
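This question is about estimation rather than pgfs, but the claimed behaviour of the relative efficiency \(\frac{\theta}{\mathrm{e}^{\theta} - 1}\) is easy to check symbolically. A sketch (not part of the paper) in Python with sympy:

```python
# Sketch (not from the paper): derive the Poisson MLE from the log-likelihood
# and check the limiting behaviour of the relative efficiency theta/(e^theta - 1).
from sympy import symbols, exp, log, diff, solve, limit, series, oo

theta, n, S = symbols('theta n S', positive=True)

# log-likelihood for a Poisson sample with total S = x1 + ... + xn
# (the factorial terms do not involve theta, so they are dropped)
logL = -n * theta + S * log(theta)
print(solve(diff(logL, theta), theta))     # [S/n]: the MLE is the sample mean

rel_eff = theta / (exp(theta) - 1)          # relative efficiency from part (iv)
print(limit(rel_eff, theta, 0))             # 1: near 1 for small theta
print(limit(rel_eff, theta, oo))            # 0: near 0 for large theta
print(series(rel_eff, theta, 0, 3))         # 1 - theta/2 + theta**2/12 + O(theta**3)
```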
OCR MEI S4 2012 June Q2
2 The random variable \(X ( X = 1,2,3,4,5,6 )\) denotes the score when a fair six-sided die is rolled.
  1. Write down the mean of \(X\) and show that \(\operatorname { Var } ( X ) = \frac { 35 } { 12 }\).
  2. Show that \(\mathrm { G } ( t )\), the probability generating function (pgf) of \(X\), is given by $$\mathrm { G } ( t ) = \frac { t \left( 1 - t ^ { 6 } \right) } { 6 ( 1 - t ) }$$ The random variable \(N ( N = 0,1,2 , \ldots )\) denotes the number of heads obtained when an unbiased coin is tossed repeatedly until a tail is first obtained.
  3. Show that \(\mathrm { P } ( N = r ) = \left( \frac { 1 } { 2 } \right) ^ { r + 1 }\) for \(r = 0,1,2 , \ldots\).
  4. Hence show that \(\mathrm { H } ( t )\), the pgf of \(N\), is given by \(\mathrm { H } ( t ) = ( 2 - t ) ^ { - 1 }\).
  5. Use \(\mathrm { H } ( t )\) to find the mean and variance of \(N\). A game consists of tossing an unbiased coin repeatedly until a tail is first obtained and, each time a head is obtained in this sequence of tosses, rolling a fair six-sided die. The die is not rolled on the first occasion that a tail is obtained and the game ends at that point. The random variable \(Q ( Q = 0,1,2 , \ldots )\) denotes the total score on all the rolls of the die. Thus, in the notation above, \(Q = X _ { 1 } + X _ { 2 } + \ldots + X _ { N }\) where the \(X _ { i }\) are independent random variables each distributed as \(X\), with \(Q = 0\) if \(N = 0\). The pgf of \(Q\) is denoted by \(\mathrm { K } ( t )\). The familiar result that the pgf of a sum of independent random variables is the product of their pgfs does not apply to \(\mathrm { K } ( t )\) because \(N\) is a random variable and not a fixed number; you should instead use without proof the result that \(\mathrm { K } ( t ) = \mathrm { H } ( \mathrm { G } ( t ) )\).
  6. Show that \(\mathrm { K } ( t ) = 6 \left( 12 - t - t ^ { 2 } - \ldots - t ^ { 6 } \right) ^ { - 1 }\).
    [Hint. \(\left( 1 - t^6 \right) = (1 - t)\left( 1 + t + t^2 + \ldots + t^5 \right)\).]
  7. Use \(\mathrm { K } ( t )\) to find the mean and variance of \(Q\).
  8. Using your results from parts (i), (v) and (vii), verify the result that (in the usual notation for means and variances) $$\sigma _ { Q } { } ^ { 2 } = \sigma _ { N } { } ^ { 2 } \mu _ { X } { } ^ { 2 } + \mu _ { N } \sigma _ { X } { } ^ { 2 } .$$
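The compound pgf \(\mathrm{K}(t) = \mathrm{H}(\mathrm{G}(t))\) and the three mean/variance pairs can be checked mechanically. A sympy sketch (not part of the question); mean_var is just an illustrative helper name:

```python
# Sketch (not from the question): check K(t) = H(G(t)) and the three
# mean/variance pairs used in parts (i), (v), (vii) and (viii).
from sympy import symbols, Rational, diff, cancel

t = symbols('t')

G = sum(Rational(1, 6) * t**k for k in range(1, 7))   # pgf of one die score
H = 1 / (2 - t)                                        # pgf of N
K = cancel(H.subs(t, G))                               # pgf of the random sum Q

print(K)   # should match 6/(12 - t - t**2 - ... - t**6), up to rearrangement

def mean_var(pgf):
    m = diff(pgf, t).subs(t, 1)
    return m, diff(pgf, t, 2).subs(t, 1) + m - m**2

print(mean_var(G))   # (7/2, 35/12)
print(mean_var(H))   # (1, 2)
print(mean_var(K))   # (7/2, 329/12), consistent with part 8's identity
```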
OCR MEI S4 2016 June Q1
1 The random variable \(X\) has a Cauchy distribution centred on \(m\). Its probability density function ( pdf ) is \(\mathrm { f } ( x )\) where $$\mathrm { f } ( x ) = \frac { 1 } { \pi } \frac { 1 } { 1 + ( x - m ) ^ { 2 } } , \quad \text { for } - \infty < x < \infty$$
  1. Sketch the pdf. Show that the mode and median are at \(x = m\).
  2. A sample of size 1 , consisting of the observation \(x _ { 1 }\), is taken from this distribution. Show that the maximum likelihood estimate (MLE) of \(m\) is \(x _ { 1 }\).
  3. Now suppose that a sample of size 2 , consisting of observations \(x _ { 1 }\) and \(x _ { 2 }\), is taken from the distribution. By considering the logarithm of the likelihood function or otherwise, show that the MLE, \(\hat { m }\), satisfies the cubic equation $$\left( 2 \hat { m } - \left( x _ { 1 } + x _ { 2 } \right) \right) \left( \hat { m } ^ { 2 } - \left( x _ { 1 } + x _ { 2 } \right) \hat { m } + 1 + x _ { 1 } x _ { 2 } \right) = 0$$
  4. Obtain expressions for the three roots of this equation. Show that if \(\left| x _ { 1 } - x _ { 2 } \right| < 2\) then only one root is real. How do you know, without doing further calculations, that in this case the real root will be the MLE of \(m\) ?
  5. Obtain the three possible values of \(\hat { m }\) in the case \(x _ { 1 } = - 2\) and \(x _ { 2 } = 2\). Evaluate the likelihood function for each value of \(\hat { m }\) and comment on your answer.
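The cubic satisfied by \(\hat{m}\) and the likelihood comparison in the last part can be verified symbolically. A sketch (not part of the paper) using sympy; the factorised score function should reproduce the stated cubic up to an overall constant:

```python
# Sketch (not from the paper): verify the cubic satisfied by the MLE and
# compare the likelihood at the three roots when x1 = -2, x2 = 2.
from sympy import symbols, log, diff, solve, pi, factor, simplify, together, fraction

m, x1, x2 = symbols('m x1 x2', real=True)

# log-likelihood for two Cauchy observations centred on m (constants dropped)
logL = -log(1 + (x1 - m)**2) - log(1 + (x2 - m)**2)

num, _ = fraction(together(diff(logL, m)))
print(factor(num))   # (2m - (x1 + x2))(m**2 - (x1 + x2)m + 1 + x1*x2), up to sign

# the case x1 = -2, x2 = 2: three stationary points of the likelihood
roots = solve(diff(logL, m).subs({x1: -2, x2: 2}), m)
L = 1 / (pi**2 * (1 + (-2 - m)**2) * (1 + (2 - m)**2))
for r in roots:
    print(r, simplify(L.subs(m, r)))   # 0 -> 1/(25*pi**2), +/-sqrt(3) -> 1/(16*pi**2)
```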
CAIE Further Paper 4 2020 Specimen Q6
6 Aisha has a bag containing 3 red balls and 3 white balls. She selects a ball at random, notes its colour and returns it to the bag; this process is repeated twice more. The number of red balls selected by Aisha is denoted by \(X\).
  1. Find the probability generating function \(\mathrm{G}_X(t)\) of \(X\).
    Basan also has a bag containing 3 red balls and 3 white balls. He selects three balls at random, with replacement, from his bag. The number of red balls selected by Basan is denoted by \(Y\).
  2. Find the probability generating function \(\mathrm{G}_Y(t)\) of \(Y\).
    The random variable \(Z\) is the total number of red balls selected by Aisha and Basan.
  3. Find the probability generating function of \(Z\), expressing your answer as a polynomial.
  4. Use the probability generating function of \(Z\) to find \(\mathrm{E}(Z)\) and \(\operatorname{Var}(Z)\).
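Because every selection here is with replacement and \(\mathrm{P}(\text{red}) = \frac{1}{2}\), the \(2^6\) colour sequences are equally likely, so the pmf of \(Z\) can be found by brute-force enumeration before converting it to a pgf. A sketch (not part of the paper) in Python with sympy:

```python
# Sketch (not from the paper): enumerate the 2**6 equally likely colour
# sequences (1 = red), build the pmf of Z, then convert it to a pgf.
from itertools import product
from sympy import symbols, Rational, diff, expand

t = symbols('t')

counts = {}
for picks in product((0, 1), repeat=6):   # Aisha's 3 picks then Basan's 3 picks
    z = sum(picks)
    counts[z] = counts.get(z, 0) + 1

G_Z = sum(Rational(c, 2**6) * t**z for z, c in counts.items())

mean = diff(G_Z, t).subs(t, 1)
var = diff(G_Z, t, 2).subs(t, 1) + mean - mean**2
print(expand(G_Z), mean, var)             # expect E(Z) = 3, Var(Z) = 3/2
```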
OCR S4 2018 June Q6
6 In each round of a quiz a contestant can answer up to three questions. Each correct answer scores 1 point and allows the contestant to go on to the next question. A wrong answer scores 0 points and the contestant is allowed no further question in that round. If all 3 questions are answered correctly 1 bonus point is scored, making a total score of 4 for the round. For a certain contestant, \(A\), the probability of giving a correct answer is \(\frac { 3 } { 4 }\), independently of any other question. The random variable \(X _ { r }\) is the number of points scored by \(A\) during the \(r ^ { \text {th } }\) round.
  1. Find the probability generating function of \(X _ { r }\).
  2. Use the probability generating function found in part (i) to find the mean and variance of \(X _ { r }\).
  3. Write down an expression for the probability generating function of \(X _ { 1 } + X _ { 2 }\) and find the probability that \(A\) has a total score of 4 at the end of two rounds.
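A short check (not part of the question): list the four possible scores in a round with their probabilities, build the pgf, and square it for two independent rounds; the coefficient of \(t^4\) is then the required probability. A sympy sketch:

```python
# Sketch (not from the question): pgf of the score in one round, then the
# coefficient of t**4 in its square for the two-round total.
from sympy import symbols, Rational, expand, diff

t = symbols('t')
p = Rational(3, 4)                         # probability of a correct answer

# scores 0, 1, 2 (stopped by a wrong answer) or 4 (all three correct, plus bonus)
G = (1 - p) + p*(1 - p)*t + p**2*(1 - p)*t**2 + p**3*t**4

mean = diff(G, t).subs(t, 1)
var = diff(G, t, 2).subs(t, 1) + mean - mean**2
print(mean, var)                           # 69/32 and 2919/1024

G2 = expand(G**2)                          # pgf of X_1 + X_2
print(G2.coeff(t, 4))                      # P(total score = 4) = 945/4096
```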
Edexcel FS1 2022 June Q6
6 The discrete random variable \(V\) has probability distribution
| \(v\) | 2 | 3 | 4 |
| --- | --- | --- | --- |
| \(\mathrm{P}(V = v)\) | \(\frac{9}{25}\) | \(\frac{12}{25}\) | \(\frac{4}{25}\) |
  1. Show that the probability generating function of \(V\) is $$\mathrm { G } _ { V } ( t ) = t ^ { 2 } \left( \frac { 2 } { 5 } t + \frac { 3 } { 5 } \right) ^ { 2 }$$ The discrete random variable \(W\) has probability generating function $$\mathrm { G } _ { W } ( t ) = t \left( \frac { 2 } { 5 } t + \frac { 3 } { 5 } \right) ^ { 5 }$$
  2. Use calculus to find
    1. \(\mathrm { E } ( W )\)
    2. \(\operatorname { Var } ( W )\)
    Given that \(V\) and \(W\) are independent,
  3. find the probability generating function of \(X = V + W\) in its simplest form.
    The discrete random variable \(Y = 2 X + 3\).
  4. Find the probability generating function of \(Y\).
  5. Find \(\mathrm { P } ( Y = 15 )\).
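As a check on the whole question (not part of the paper), the pgfs of \(V\), \(W\), \(X\) and \(Y\) can be built with sympy; the substitution \(\mathrm{G}_Y(t) = t^3\,\mathrm{G}_X(t^2)\) handles \(Y = 2X + 3\), and \(\mathrm{P}(Y = 15)\) is the coefficient of \(t^{15}\):

```python
# Sketch (not from the paper): check the factorised form of G_V, get E(W) and
# Var(W) by differentiation, and use G_Y(t) = t**3 * G_X(t**2) for Y = 2X + 3.
from sympy import symbols, Rational, expand, diff, factor

t = symbols('t')

G_V = Rational(9, 25)*t**2 + Rational(12, 25)*t**3 + Rational(4, 25)*t**4
print(factor(G_V))                          # t**2*(2*t + 3)**2/25

G_W = t * (Rational(2, 5)*t + Rational(3, 5))**5
mean_W = diff(G_W, t).subs(t, 1)
var_W = diff(G_W, t, 2).subs(t, 1) + mean_W - mean_W**2
print(mean_W, var_W)                        # 3 and 6/5

G_X = G_V * G_W                             # V and W independent
G_Y = t**3 * G_X.subs(t, t**2)              # Y = 2X + 3
print(expand(G_Y).coeff(t, 15))             # P(Y = 15) = 4536/15625
```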