A scientist is investigating the concentration of antibodies in the bloodstream of a patient following a vaccination.
The concentration of antibodies, \(x\), measured in micrograms (\(\mu\mathrm{g}\)) per millilitre (ml) of blood, is modelled by the differential equation
$$100\frac{\mathrm{d}^{2}x}{\mathrm{d}t^{2}} + 60\frac{\mathrm{d}x}{\mathrm{d}t} + 13x = 26$$
where \(t\) is the number of weeks since the vaccination was given.
- Find a general solution of the differential equation.
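For reference, a sketch of one standard route: the auxiliary equation of the homogeneous part has complex roots, and a constant particular integral deals with the right-hand side.

$$100m^{2} + 60m + 13 = 0 \implies m = \frac{-60 \pm \sqrt{3600 - 5200}}{200} = -0.3 \pm 0.2\mathrm{i}$$

$$x = e^{-0.3t}\left(A\cos 0.2t + B\sin 0.2t\right) + 2$$

Here the particular integral \(x = 2\) follows from \(13x = 26\), and \(A\) and \(B\) are arbitrary constants.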
Initially,
- there are no antibodies in the bloodstream of the patient
- the concentration of antibodies is estimated to be increasing at \(10\,\mu\mathrm{g/ml}\) per week
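These two conditions fix the arbitrary constants in the general solution sketched above: \(x(0) = 0\) gives \(A + 2 = 0\), and \(\frac{\mathrm{d}x}{\mathrm{d}t}(0) = 10\) gives \(-0.3A + 0.2B = 10\), so

$$A = -2, \quad B = 47, \qquad x = 2 + e^{-0.3t}\left(47\sin 0.2t - 2\cos 0.2t\right)$$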
- Find, according to the model, the maximum concentration of antibodies in the bloodstream of the patient after the vaccination.
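As a check on this part, the maximum occurs where the derivative of the particular solution above vanishes:

$$\frac{\mathrm{d}x}{\mathrm{d}t} = e^{-0.3t}\left(10\cos 0.2t - 13.7\sin 0.2t\right) = 0 \implies \tan 0.2t = \frac{10}{13.7}$$

giving \(t = 5\arctan\frac{10}{13.7} \approx 3.15\) weeks and a maximum concentration of approximately \(12.1\,\mu\mathrm{g/ml}\).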
A second dose of the vaccine has to be given to try to ensure that it is fully effective. It is only safe to give the second dose if the concentration of antibodies in the bloodstream of the patient is less than \(5\,\mu\mathrm{g/ml}\).
- Determine whether, according to the model, it is safe to give the second dose of the vaccine to the patient exactly 10 weeks after the first dose.
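For reference, substituting \(t = 10\) into the particular solution sketched above:

$$x(10) = 2 + e^{-3}\left(47\sin 2 - 2\cos 2\right) \approx 4.17\,\mu\mathrm{g/ml}$$

Since this is below the \(5\,\mu\mathrm{g/ml}\) threshold, the model suggests it is safe to give the second dose exactly 10 weeks after the first.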