# Generating functions for joint distributions

by taratuta

As would be expected, $X$ is uncorrelated with either $W$ or $Y$, colour and face-value being two independent characteristics. Positive correlations are to be expected between $W$ and $Y$ and between $X$ and $Z$; both correlations are fairly strong. Moderate anticorrelations exist between $Z$ and both $W$ and $Y$, reflecting the fact that it is impossible for $W$ and $Y$ to be positive if $Z$ is positive.

Finally, let us suppose that the random variables $X_i$, $i = 1, 2, \ldots, n$, are related to a second set of random variables $Y_k = Y_k(X_1, X_2, \ldots, X_n)$, $k = 1, 2, \ldots, m$. By expanding each $Y_k$ as a Taylor series as in (30.137) and inserting the resulting expressions into the definition of the covariance (30.133), we find that the elements of the covariance matrix for the $Y_k$ variables are given by

$$\mathrm{Cov}[Y_k, Y_l] \approx \sum_i \sum_j \frac{\partial Y_k}{\partial X_i}\frac{\partial Y_l}{\partial X_j}\,\mathrm{Cov}[X_i, X_j]. \qquad (30.140)$$
It is straightforward to show that this relation is exact if the $Y_k$ are linear combinations of the $X_i$. Equation (30.140) can then be written in matrix form as

$$\mathsf{V}_Y = \mathsf{S}\,\mathsf{V}_X\,\mathsf{S}^{\mathrm{T}}, \qquad (30.141)$$

where $\mathsf{V}_Y$ and $\mathsf{V}_X$ are the covariance matrices of the $Y_k$ and $X_i$ variables respectively and $\mathsf{S}$ is the rectangular $m \times n$ matrix with elements $S_{ki} = \partial Y_k/\partial X_i$.
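The matrix relation (30.141) can be checked numerically for an exactly linear map $Y = \mathsf{S}X$. The following sketch uses an arbitrarily chosen $\mathsf{S}$ and $\mathsf{V}_X$ (both invented for illustration, assuming NumPy is available) and compares the propagated covariance against the sample covariance of simulated data.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 3, 2
# Hypothetical m x n matrix of coefficients S_ki = dY_k/dX_i.
S = np.array([[1.0, 2.0, 0.5],
              [0.0, -1.0, 3.0]])

# Hypothetical (positive-definite) covariance matrix of the X_i.
VX = np.array([[2.0, 0.3, 0.0],
               [0.3, 1.0, 0.4],
               [0.0, 0.4, 1.5]])

# Propagate covariances through the linear map Y = S X, as in (30.141).
VY = S @ VX @ S.T

# Cross-check against the sample covariance of simulated Gaussian data.
X = rng.multivariate_normal(np.zeros(n), VX, size=200_000)
Y = X @ S.T
VY_sample = np.cov(Y, rowvar=False)

print(np.round(VY, 3))
print(np.round(VY_sample, 3))
```

Because the $Y_k$ here are linear in the $X_i$, the agreement is exact up to sampling error, consistent with the remark above that (30.140) holds exactly in the linear case.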
## 30.13 Generating functions for joint distributions
It is straightforward to generalise the discussion of generating functions in section 30.7 to joint distributions. For a multivariate distribution $f(X_1, X_2, \ldots, X_n)$ of non-negative integer random variables $X_i$, $i = 1, 2, \ldots, n$, we define the probability generating function to be

$$\Phi(t_1, t_2, \ldots, t_n) = E[t_1^{X_1} t_2^{X_2} \cdots t_n^{X_n}].$$
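As a concrete illustration, the joint PGF of a discrete distribution with finite support can be evaluated directly from its probability mass function. The pmf below is invented for the example; the checks use the standard facts that $\Phi(1, 1) = 1$ by normalisation and that $\partial\Phi/\partial t_1$ at $t_1 = t_2 = 1$ equals $E[X_1]$.

```python
# Hypothetical joint pmf f(x1, x2) on {0, 1, 2} x {0, 1, 2}.
pmf = {
    (0, 0): 0.20, (0, 1): 0.10, (0, 2): 0.05,
    (1, 0): 0.15, (1, 1): 0.20, (1, 2): 0.05,
    (2, 0): 0.05, (2, 1): 0.10, (2, 2): 0.10,
}
assert abs(sum(pmf.values()) - 1.0) < 1e-9  # normalisation check

def pgf(t1, t2):
    """Evaluate Phi(t1, t2) = sum over the support of f(x1, x2) t1^x1 t2^x2."""
    return sum(p * t1**x1 * t2**x2 for (x1, x2), p in pmf.items())

# Phi(1, 1) should equal 1 by normalisation.
print(pgf(1, 1))

# A central difference in t1 at (1, 1) approximates E[X1];
# for this pmf, E[X1] = 0.40 * 1 + 0.25 * 2 = 0.90.
h = 1e-6
EX1 = (pgf(1 + h, 1) - pgf(1 - h, 1)) / (2 * h)
print(round(EX1, 6))
```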
As in the single-variable case, we may also define the closely related moment generating function, which has wider applicability since it is not restricted to non-negative integer random variables but can be used with any set of discrete or continuous random variables $X_i$ ($i = 1, 2, \ldots, n$). The MGF of the multivariate distribution $f(X_1, X_2, \ldots, X_n)$ is defined as

$$M(t_1, t_2, \ldots, t_n) = E[e^{t_1 X_1} e^{t_2 X_2} \cdots e^{t_n X_n}] = E[e^{t_1 X_1 + t_2 X_2 + \cdots + t_n X_n}] \qquad (30.142)$$
and may be used to evaluate (joint) moments of $f(X_1, X_2, \ldots, X_n)$. By performing a derivation analogous to that presented for the single-variable case in subsection 30.7.2, it can be shown that

$$E[X_1^{m_1} X_2^{m_2} \cdots X_n^{m_n}] = \frac{\partial^{\,m_1 + m_2 + \cdots + m_n} M(0, 0, \ldots, 0)}{\partial t_1^{m_1}\,\partial t_2^{m_2} \cdots \partial t_n^{m_n}}. \qquad (30.143)$$
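Equation (30.143) can be verified by symbolic differentiation for a simple case. The example below (a sketch, assuming SymPy is available) takes two independent exponential variables with hypothetically chosen rates $\lambda_1 = 2$ and $\lambda_2 = 3$, so that the joint MGF factorises as $M(t_1, t_2) = \frac{2}{2 - t_1} \cdot \frac{3}{3 - t_2}$, and extracts the joint moment $E[X_1^2 X_2]$.

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')

# Joint MGF of two independent exponentials with rates 2 and 3
# (chosen arbitrarily for this illustration).
M = (2 / (2 - t1)) * (3 / (3 - t2))

# E[X1^2 X2] = d^3 M / dt1^2 dt2, evaluated at t1 = t2 = 0, as in (30.143).
# Independence gives E[X1^2] E[X2] = (2/lambda1^2)(1/lambda2) = (1/2)(1/3) = 1/6.
moment = sp.diff(M, t1, 2, t2, 1).subs({t1: 0, t2: 0})
print(moment)

# Lower-order checks: M(0, 0) = 1, and the first moments E[Xi] = 1/lambda_i.
print(M.subs({t1: 0, t2: 0}))
print(sp.diff(M, t1).subs({t1: 0, t2: 0}), sp.diff(M, t2).subs({t1: 0, t2: 0}))
```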