Joint distribution
In the study of probability, given two random variables X and Y, the joint distribution of X and Y is the probability distribution of the pair (X, Y); it assigns probabilities to events defined in terms of both X and Y simultaneously.
The discrete case
For discrete random variables, the joint probability mass function can be written as Pr(X = x and Y = y). This is

\[
\Pr(X = x \text{ and } Y = y) = \Pr(Y = y \mid X = x)\,\Pr(X = x) = \Pr(X = x \mid Y = y)\,\Pr(Y = y).
\]

Since these are probabilities, we have

\[
\sum_x \sum_y \Pr(X = x \text{ and } Y = y) = 1.
\]
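As a minimal sketch of these identities in Python, assuming a small hypothetical joint pmf stored as a dictionary, one can derive the marginals, verify normalization, and confirm the conditional factorization:

```python
from collections import defaultdict

# Hypothetical example pmf over two binary variables,
# stored as {(x, y): Pr(X = x and Y = y)}.
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

# Marginals: Pr(X = x) = sum over y of Pr(X = x and Y = y), and symmetrically for Y.
p_x = defaultdict(float)
p_y = defaultdict(float)
for (x, y), p in joint.items():
    p_x[x] += p
    p_y[y] += p

# Normalization: the probabilities must sum to 1 over all (x, y) pairs.
assert abs(sum(joint.values()) - 1.0) < 1e-12

# Factorization: Pr(X=x and Y=y) = Pr(Y=y | X=x) Pr(X=x) = Pr(X=x | Y=y) Pr(Y=y).
for (x, y), p in joint.items():
    pr_y_given_x = p / p_x[x]  # Pr(Y = y | X = x)
    pr_x_given_y = p / p_y[y]  # Pr(X = x | Y = y)
    assert abs(pr_y_given_x * p_x[x] - p) < 1e-12
    assert abs(pr_x_given_y * p_y[y] - p) < 1e-12
```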
The continuous case
Similarly for continuous random variables, the joint probability density function can be written as $f_{X,Y}(x, y)$, and this is

\[
f_{X,Y}(x,y) = f_{Y \mid X}(y \mid x)\,f_X(x) = f_{X \mid Y}(x \mid y)\,f_Y(y),
\]

where $f_{Y \mid X}(y \mid x)$ and $f_{X \mid Y}(x \mid y)$ give the conditional distributions of Y given X = x and of X given Y = y respectively, and $f_X(x)$ and $f_Y(y)$ give the marginal distributions for X and Y respectively.
Again, since these are probability distributions, one has

\[
\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy\,dx = 1.
\]
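These relations can be checked numerically; the following sketch assumes a hypothetical example density f(x, y) = x + y on the unit square and uses scipy.integrate to verify normalization, compute a marginal, and confirm the factorization:

```python
from scipy.integrate import dblquad, quad

# Hypothetical example density on the unit square: f(x, y) = x + y.
def f_xy(x, y):
    return x + y

# Normalization: the double integral over the support must equal 1.
# dblquad expects func(y, x) with the inner variable first, so swap arguments.
total, _ = dblquad(lambda y, x: f_xy(x, y), 0.0, 1.0, lambda x: 0.0, lambda x: 1.0)
assert abs(total - 1.0) < 1e-9

# Marginal of X: f_X(x) = integral of f(x, y) dy, which is x + 1/2 on [0, 1].
def f_x(x):
    val, _ = quad(lambda y: f_xy(x, y), 0.0, 1.0)
    return val

# Conditional of Y given X = x: f_{Y|X}(y | x) = f(x, y) / f_X(x),
# so the factorization f(x, y) = f_{Y|X}(y | x) f_X(x) holds.
x0, y0 = 0.3, 0.7
f_y_given_x = f_xy(x0, y0) / f_x(x0)
assert abs(f_y_given_x * f_x(x0) - f_xy(x0, y0)) < 1e-12
```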
Joint distribution of independent variables
If $\Pr(X = x \text{ and } Y = y) = \Pr(X = x)\,\Pr(Y = y)$ for all x and y for discrete random variables, or $f_{X,Y}(x,y) = f_X(x)\,f_Y(y)$ for all x and y for continuous random variables, then X and Y are said to be independent.
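A sketch of this independence test for the discrete case, assuming a hypothetical joint pmf built to factor into its marginals, checks the product condition at every pair of values:

```python
from itertools import product

# Hypothetical marginals; the joint is constructed to factor, so X and Y
# are independent by design.
p_x = {0: 0.4, 1: 0.6}
p_y = {0: 0.5, 1: 0.5}
joint = {(x, y): p_x[x] * p_y[y] for x, y in product(p_x, p_y)}

# X and Y are independent iff Pr(X=x and Y=y) = Pr(X=x) Pr(Y=y) for all x, y.
independent = all(
    abs(joint[(x, y)] - p_x[x] * p_y[y]) < 1e-12
    for x, y in product(p_x, p_y)
)
print(independent)  # True for this factored example
```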
Multidimensional distributions
The joint distribution of two random variables can be extended to many random variables X1, ..., Xn by adding them sequentially with the identity

\[
f_{X_1,\ldots,X_n}(x_1,\ldots,x_n) = f_{X_n \mid X_1,\ldots,X_{n-1}}(x_n \mid x_1,\ldots,x_{n-1})\; f_{X_1,\ldots,X_{n-1}}(x_1,\ldots,x_{n-1}).
\]
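One consequence of this identity is that a sample from the full joint distribution can be drawn by sampling each variable in turn from its conditional given the earlier ones. The sketch below assumes three hypothetical toy distributions purely for illustration:

```python
import random

# Chain rule used generatively: f(x1, x2, x3) = f(x3 | x1, x2) f(x2 | x1) f(x1),
# so sampling each variable from its conditional given its predecessors
# yields a draw from the joint distribution. All distributions here are
# assumed toy examples.

def sample_x1():
    return random.random()              # X1 ~ Uniform(0, 1)

def sample_x2_given(x1):
    return random.gauss(x1, 1.0)        # X2 | X1 ~ Normal(x1, 1)

def sample_x3_given(x1, x2):
    return random.gauss(x1 + x2, 1.0)   # X3 | X1, X2 ~ Normal(x1 + x2, 1)

x1 = sample_x1()
x2 = sample_x2_given(x1)
x3 = sample_x3_given(x1, x2)
print(x1, x2, x3)                       # one draw from the joint of (X1, X2, X3)
```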