Let $X = (X_1, \dots, X_n)$ be an *n*-dimensional vector of random variables.

For all $x = (x_1, \dots, x_n) \in \mathbb{R}^n$, the joint cumulative distribution function (CDF) of $X$ satisfies

$$F_X(x) = P(X_1 \le x_1, \dots, X_n \le x_n).$$

It is straightforward to generalise the previous definition to joint marginal distributions. For example, the joint marginal distribution of $X_i$ and $X_j$ satisfies

$$F_{X_i, X_j}(x_i, x_j) = P(X_i \le x_i, X_j \le x_j).$$

If $X = (X_1, X_2)$ is a partition of $X$, then the conditional CDF of $X_2$ given $X_1$ satisfies

$$F_{X_2 \mid X_1}(x_2 \mid x_1) = P(X_2 \le x_2 \mid X_1 = x_1).$$

If $X$ has a joint PDF $f_X(\bullet)$, then the conditional PDF of $X_2$ given $X_1$ satisfies

$$f_{X_2 \mid X_1}(x_2 \mid x_1) = \frac{f_X(x_1, x_2)}{f_{X_1}(x_1)}$$

and the conditional CDF is then given by

$$F_{X_2 \mid X_1}(x_2 \mid x_1) = \int_{-\infty}^{x_2} \frac{f_X(x_1, u)}{f_{X_1}(x_1)} \, du$$

where $f_{X_1}(\bullet)$ is the joint marginal PDF of $X_1$, which is given by

$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_X(x_1, x_2) \, dx_2.$$

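As a quick numerical illustration of these conditional-density formulas, here is a minimal sketch. The joint density $f(x_1, x_2) = x_1 + x_2$ on the unit square is a hypothetical choice made purely for this example; it is not from the text above.

```python
import numpy as np

# Hypothetical joint PDF on the unit square: f(x1, x2) = x1 + x2,
# which integrates to 1 over [0, 1]^2.
def f_joint(x1, x2):
    return x1 + x2

u = np.linspace(0.0, 1.0, 2001)
du = u[1] - u[0]

def integrate(vals):
    # Trapezoidal rule on the grid u.
    return float(np.sum((vals[:-1] + vals[1:]) * du / 2))

# Marginal PDF of X1, obtained by integrating out x2: here f_X1(x1) = x1 + 1/2.
def f_X1(x1):
    return integrate(f_joint(x1, u))

# Conditional PDF: f(x2 | x1) = f(x1, x2) / f_X1(x1).
def f_cond(x2, x1):
    return f_joint(x1, x2) / f_X1(x1)

# Sanity check: the conditional PDF integrates to 1 in x2.
total = integrate(f_cond(u, 0.3))
print(total)  # ≈ 1.0
```

The check at the end confirms that dividing the joint PDF by the marginal produces a genuine density in $x_2$ for the fixed value $x_1 = 0.3$.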
We next look at independence, a concept that H2 Mathematics students can easily relate to.

Here, we say the collection X is independent if the joint CDF can be factored into the product of the marginal CDFs, so that

$$F_X(x_1, \dots, x_n) = \prod_{i=1}^{n} F_{X_i}(x_i).$$

If X has a PDF, then independence implies that the PDF also factorises into the product of the marginal PDFs, so that

$$f_X(x_1, \dots, x_n) = \prod_{i=1}^{n} f_{X_i}(x_i).$$
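This factorisation can be checked empirically. The sketch below draws two independent samples (the standard-normal and uniform choices are assumptions made for the example) and compares the empirical joint CDF at a test point with the product of the empirical marginal CDFs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Two independent samples (hypothetical distributions chosen for illustration).
x = rng.standard_normal(n)
y = rng.uniform(size=n)

# Empirical joint CDF at a test point vs. the product of empirical marginals.
a, b = 0.5, 0.7
F_joint = np.mean((x <= a) & (y <= b))
F_prod = np.mean(x <= a) * np.mean(y <= b)
print(F_joint, F_prod)  # close, up to Monte Carlo error
```

Because X and Y were drawn independently, the two estimates agree up to sampling noise; for dependent variables they would not.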

Using the above results, we have that if $X_1$ and $X_2$ are independent then

$$F_{X_2 \mid X_1}(x_2 \mid x_1) = F_{X_2}(x_2) \quad \text{and} \quad f_{X_2 \mid X_1}(x_2 \mid x_1) = f_{X_2}(x_2).$$

The above results tell us that having information about $X_1$ tells us nothing about $X_2$.

Let’s look further at the implications of independence.

Let X and Y be independent random variables. Then for any events A and B,

$$P(X \in A, Y \in B) = P(X \in A) \, P(Y \in B).$$

We can check this with

$$P(X \in A, Y \in B) = \int_A \int_B f_{X,Y}(x, y) \, dy \, dx = \int_A f_X(x) \, dx \int_B f_Y(y) \, dy = P(X \in A) \, P(Y \in B).$$

In general, if $X_1, \dots, X_n$ are independent random variables then

$$E\left[\prod_{i=1}^{n} g_i(X_i)\right] = \prod_{i=1}^{n} E[g_i(X_i)]$$

for any functions $g_1, \dots, g_n$.

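The product-of-expectations identity is easy to verify by Monte Carlo. The distributions and the functions $g_i$ below are hypothetical choices for the sake of the check.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

# Independent X1, X2, X3 (hypothetical distributions).
x1 = rng.standard_normal(n)
x2 = rng.uniform(size=n)
x3 = rng.exponential(size=n)

# Hypothetical functions g1, g2, g3.
g1, g2, g3 = np.cos, np.square, np.sqrt

# E[g1(X1) g2(X2) g3(X3)] vs. E[g1(X1)] E[g2(X2)] E[g3(X3)].
lhs = np.mean(g1(x1) * g2(x2) * g3(x3))
rhs = np.mean(g1(x1)) * np.mean(g2(x2)) * np.mean(g3(x3))
print(lhs, rhs)  # agree up to Monte Carlo error
```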
Moreover, random variables can also be conditionally independent. For example, we say that X and Y are conditionally independent given Z if

$$E[g(X) h(Y) \mid Z] = E[g(X) \mid Z] \, E[h(Y) \mid Z]$$

for all functions $g$ and $h$.

The above will be used in the Gaussian copula model for the pricing of collateralised debt obligations (CDOs).

We let $A_i$ be the event that the $i^{\text{th}}$ bond in a portfolio defaults. It is not reasonable to assume that the $A_i$'s are independent, but it may be reasonable to assume that they are conditionally independent given some common factor $Z$, so that

$$P(A_1, \dots, A_n) = E[P(A_1, \dots, A_n \mid Z)] = E[P(A_1 \mid Z) \cdots P(A_n \mid Z)]$$

is often easy to compute.
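A minimal sketch of this idea in the one-factor Gaussian copula setting, assuming each bond has unconditional default probability `p`, pairwise asset correlation `rho`, and latent variables $X_i = \sqrt{\rho}\, Z + \sqrt{1-\rho}\, \varepsilon_i$ with default when $X_i \le \Phi^{-1}(p)$. All parameter values here are hypothetical.

```python
import numpy as np
from statistics import NormalDist

# Assumed parameters: n bonds, unconditional default probability p,
# common-factor correlation rho.
n, p, rho = 5, 0.05, 0.3
N = NormalDist()

# Conditional on the common factor Z, bond i defaults with probability
#   q(Z) = Phi((Phi^{-1}(p) - sqrt(rho) * Z) / sqrt(1 - rho)),
# and the default events are conditionally independent given Z.
def q(z):
    return N.cdf((N.inv_cdf(p) - np.sqrt(rho) * z) / np.sqrt(1 - rho))

# P(all n bonds default) = E[ q(Z)^n ], estimated by Monte Carlo over Z.
rng = np.random.default_rng(2)
z = rng.standard_normal(50_000)
prob_all_default = np.mean([q(zi) ** n for zi in z])
print(prob_all_default)
```

Note that the joint default probability lies strictly between $p^n$ (the independent case) and $p$: the common factor makes simultaneous defaults far more likely than independence would suggest.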

Lastly, we consider the mean and covariance.

The mean vector of X is given by

$$E[X] := (E[X_1], \dots, E[X_n])^\top$$

and the covariance matrix of X satisfies

$$\Sigma := \mathrm{Cov}(X) = E\left[(X - E[X])(X - E[X])^\top\right]$$

so the $(i, j)^{\text{th}}$ element of $\Sigma$ is simply the covariance of $X_i$ and $X_j$.

The covariance matrix is symmetric, its diagonal elements satisfy $\Sigma_{i,i} \ge 0$, and it is positive semi-definite, so that

$$x^\top \Sigma x \ge 0 \quad \text{for all } x \in \mathbb{R}^n.$$

Then the correlation matrix $\rho(X)$ has $(i, j)^{\text{th}}$ element $\rho_{ij} := \mathrm{Corr}(X_i, X_j) = \Sigma_{ij} / \sqrt{\Sigma_{ii} \Sigma_{jj}}$. This is also symmetric, positive semi-definite and has 1’s along the diagonal.
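These three properties can be observed directly on a sample correlation matrix. The mixing matrix used to create correlated data below is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical sample: 3 correlated variables built from independent noise
# via an arbitrary mixing matrix.
z = rng.standard_normal((10_000, 3))
x = z @ np.array([[1.0, 0.5, 0.0],
                  [0.0, 1.0, 0.3],
                  [0.0, 0.0, 1.0]])

sigma = np.cov(x, rowvar=False)   # covariance matrix Sigma
d = np.sqrt(np.diag(sigma))
corr = sigma / np.outer(d, d)     # rho_ij = Sigma_ij / sqrt(Sigma_ii Sigma_jj)

print(np.allclose(corr, corr.T))                    # symmetric
print(np.diag(corr))                                # 1's along the diagonal
print(np.all(np.linalg.eigvalsh(corr) >= -1e-10))   # positive semi-definite
```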

For any matrix $A \in \mathbb{R}^{k \times n}$ and vector $a \in \mathbb{R}^k$,

$$E[AX + a] = A E[X] + a \quad \text{(the expectation distributes linearly)}$$

$$\mathrm{Cov}(AX + a) = A \, \mathrm{Cov}(X) \, A^\top.$$

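Both identities hold exactly for sample means and sample covariances as well, which makes them easy to verify with NumPy. The matrices and vectors below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical correlated sample in R^3 (rows are observations).
x = rng.standard_normal((100_000, 3)) @ np.array([[1.0, 0.2, 0.0],
                                                  [0.0, 1.0, 0.4],
                                                  [0.0, 0.0, 1.0]])

A = np.array([[2.0, -1.0, 0.5],
              [0.0,  1.0, 1.0]])
a = np.array([1.0, -2.0])

y = x @ A.T + a  # Y = AX + a, applied row-wise to the samples

# E[AX + a] = A E[X] + a
mean_ok = np.allclose(y.mean(axis=0), A @ x.mean(axis=0) + a)
# Cov(AX + a) = A Cov(X) A^T
cov_ok = np.allclose(np.cov(y, rowvar=False),
                     A @ np.cov(x, rowvar=False) @ A.T)
print(mean_ok, cov_ok)
```

Note that the constant shift `a` moves the mean but drops out of the covariance entirely, since covariance is computed on centred data.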
Recall that if X and Y are independent, then $\mathrm{Cov}(X, Y) = 0$; the converse is not true in general.
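The standard counterexample for the converse takes $Y = X^2$ with $X$ standard normal: $\mathrm{Cov}(X, X^2) = E[X^3] = 0$ by symmetry, yet Y is a deterministic function of X. A quick numerical check:

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.standard_normal(1_000_000)
y = x ** 2  # Y is a deterministic function of X, so clearly dependent

# Cov(X, Y) = E[X^3] - E[X] E[X^2] = 0 for a symmetric X
c = np.cov(x, y)[0, 1]
print(c)  # ≈ 0, even though X and Y are not independent
```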
