Poisson Random Variable : Exercises

Introduction

The problems below all involve the Poisson random variable. Remember that the probability mass function for this random variable is: $$ f_X(x) = P(X=x) = \frac{\lambda^x}{x!} e^{-\lambda} \qquad x = 0, 1, 2, \dots $$ where $\lambda > 0$ is a parameter.
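As a quick sanity check on this formula, the sketch below evaluates the mass function directly and compares it with scipy's built-in Poisson pmf. This is only an illustration: it assumes Python with scipy is available, and the value $\lambda = 3$ is an arbitrary choice rather than one taken from the exercises.

```python
# Minimal sketch: evaluate the Poisson pmf from the formula above and compare
# with scipy's implementation. lambda = 3 is purely an illustrative value.
import math
from scipy.stats import poisson

lam = 3.0  # the parameter lambda (illustrative choice)

for x in range(6):
    manual = lam**x / math.factorial(x) * math.exp(-lam)
    library = poisson.pmf(x, lam)
    print(f"x = {x}:  formula = {manual:.6f},  scipy = {library:.6f}")
```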

Example problems

Click on the problems below to reveal the solutions.

Problem 1

Show that the Poisson distribution is correctly normalised.

To prove the distribution is normalised we need to show that $\sum_{x=0}^\infty f_X(x) = 1$. For the Poisson random variable we have: $$ \sum_{x=0}^\infty f_X(x) = \sum_{x=0}^\infty \frac{\lambda^x}{x!} e^{-\lambda} = e^{-\lambda} \sum_{x=0}^\infty \frac{\lambda^x}{x!} = e^{-\lambda} e^{\lambda} = 1 $$ The penultimate equality here holds because $\sum_{x=0}^\infty \frac{\lambda^x}{x!}$ is the Maclaurin expansion of the exponential function $e^{\lambda}$.
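If you would like to see this convergence numerically, the following sketch sums successively more terms of the pmf and shows the partial sums approaching 1. Again, $\lambda = 3$ is just an illustrative value.

```python
# Numerical illustration of the normalisation argument: partial sums of the
# Poisson pmf approach 1 as more terms are included (lambda = 3 is arbitrary).
import math

lam = 3.0
partial_sum = 0.0
for x in range(31):
    partial_sum += lam**x / math.factorial(x) * math.exp(-lam)
    if x in (5, 10, 20, 30):
        print(f"sum of first {x + 1} terms = {partial_sum:.10f}")
```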

Problem 2

Derive the moment generating function for the Poisson random variable and use it to find the expectation and the variance of the distribution.

The moment generating function is given by $M_X(t) = \mathbb{E}(e^{tX})$. For the Poisson random variable this is: $$ \begin{aligned} M_X(t) &= \sum_{x=0}^\infty e^{tx} P(X=x) \\ & = \sum_{x=0}^\infty e^{tx} \frac{\lambda^x}{x!} e^{-\lambda} \\ & = e^{-\lambda} \sum_{x=0}^\infty \frac{(\lambda e^t)^x}{x!} \\ & = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)} \end{aligned} $$ We now recall that we can find the first moment of the distribution (the expectation) by evaluating the first derivative of the moment generating function at $t=0$, i.e. $$ \mathbb{E}(X) = \left. \frac{\textrm{d}M_X(t)}{\textrm{d}t} \right|_{t=0} $$ For the Poisson distribution this derivative is: $$ \frac{\textrm{d}M_X(t)}{\textrm{d}t} = \lambda e^t e^{\lambda(e^t - 1)} $$ which at $t=0$ gives: $\mathbb{E}(X) = \lambda e^0 e^{\lambda(e^0 - 1)} = \lambda$.
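This calculation can also be checked symbolically. The sketch below (assuming the sympy package is available) builds the moment generating function derived above, differentiates it once, and evaluates the result at $t=0$, recovering $\mathbb{E}(X)=\lambda$.

```python
# Symbolic check of the expectation: differentiate the Poisson MGF and
# evaluate the derivative at t = 0.
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))    # MGF of the Poisson distribution

first_moment = sp.diff(M, t).subs(t, 0)
print(sp.simplify(first_moment))     # prints lambda
```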

To calculate the variance we first need to calculate the second moment of the distribution. We can find this quantity from the moment generating function by evaluating the second derivative at $t=0$, i.e. $$ \mathbb{E}(X^2) = \left. \frac{\textrm{d}^2M_X(t)}{\textrm{d}t^2} \right|_{t=0} $$ The second derivative of the moment generating function for the Poisson random variable is given by: $$ \frac{\textrm{d}^2M_X(t)}{\textrm{d}t^2} = \lambda e^t e^{\lambda(e^t - 1)} + \lambda^2 e^{2t} e^{\lambda(e^t - 1)} $$ At $t=0$ this gives $\mathbb{E}(X^2) = \lambda e^0 e^{\lambda(e^0 - 1)} + \lambda^2 e^{0} e^{\lambda(e^0 - 1)} = \lambda + \lambda^2$. The variance is thus $\textrm{var}(X) = \mathbb{E}(X^2) - [\mathbb{E}(X)]^2 = \lambda + \lambda^2 - \lambda^2 = \lambda$.
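The variance calculation can be verified in the same way: differentiating the moment generating function twice and evaluating at $t=0$ gives the second moment, from which the variance follows. A minimal sympy sketch:

```python
# Symbolic check of the variance: the second derivative of the MGF at t = 0
# gives E(X^2); subtracting the squared mean recovers the variance lambda.
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))    # MGF of the Poisson distribution

EX = sp.diff(M, t).subs(t, 0)        # first moment, E(X)
EX2 = sp.diff(M, t, 2).subs(t, 0)    # second moment, E(X^2)
variance = sp.simplify(EX2 - EX**2)
print(variance)                      # prints lambda
```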

Contact Details

School of Mathematics and Physics,
Queen's University Belfast,
Belfast,
BT7 1NN

Email: g.tribello@qub.ac.uk
Website: mywebsite