Joint probability distribution
If you have a pair of random variables $X$ and $Y$, the joint probability distribution function, $F_{XY}(x,y)$, gives the following probability:

$$ F_{XY}(x,y) = P( X \le x \wedge Y \le y ) $$

This function does not appear that often; when the random variables are discrete it is more common to work with the joint probability mass function:

$$ f_{XY}(x,y) = P( X=x \wedge Y=y ) $$

You can calculate the marginal distribution, $f_X(x)$, for the random variable $X$ from the joint probability mass function by summing over all the values that $Y$ can take:

$$ f_X(x) = P( X=x ) = \sum_{i=0}^\infty f_{XY}(x,y_i) $$

Similarly, the marginal distribution, $f_Y(y)$, for the random variable $Y$ is found by summing over all the values that $X$ can take:

$$ f_Y(y) = P( Y=y ) = \sum_{i=0}^\infty f_{XY}(x_i,y) $$
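The marginalisation sums above can be illustrated numerically. In the sketch below the joint probability mass function is stored as a matrix whose entry $(i,j)$ holds $P(X=x_i \wedge Y=y_j)$; the particular numbers are an invented example, not taken from the notes. Summing over a row or column then gives the marginal distributions:

```python
import numpy as np

# Hypothetical joint pmf for X and Y, each taking the values 0, 1, 2.
# Entry [i, j] is P(X = x_i and Y = y_j); all entries sum to one.
f_XY = np.array([
    [0.10, 0.05, 0.05],
    [0.10, 0.20, 0.10],
    [0.05, 0.15, 0.20],
])

# Marginal of X: sum the joint pmf over every value of Y (axis 1),
# i.e. f_X(x) = sum_i f_XY(x, y_i)
f_X = f_XY.sum(axis=1)

# Marginal of Y: sum the joint pmf over every value of X (axis 0),
# i.e. f_Y(y) = sum_i f_XY(x_i, y)
f_Y = f_XY.sum(axis=0)

print("f_X =", f_X)  # [0.2 0.4 0.4]
print("f_Y =", f_Y)  # [0.25 0.4 0.35]
```

Note that each marginal is itself a valid probability mass function: its entries are non-negative and sum to one, because the joint pmf does.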
Syllabus Aims
- You should be able to explain the meanings of the joint probability mass function and the joint probability distribution function for a pair of random variables.
- You should be able to derive the marginal probability mass function for a discrete random variable from the joint probability mass function.
| Description and link | Module | Author |
| --- | --- | --- |
| A description of joint probability functions including a derivation of the Cauchy-Schwarz inequality. | SOR3012 | J. F. McCann |
| Exercises involving joint probability distributions | SOR3012 | G. Tribello |
Contact Details
School of Mathematics and Physics,
Queen's University Belfast,
Belfast,
BT7 1NN
Email: g.tribello@qub.ac.uk
Website: mywebsite