We will now consider more than one random variable at a time. As we shall see, developing the theory of multivariate distributions will allow us to consider situations that model the actual collection of data and form the foundation of inference based on those data.
We begin with a pair of discrete random variables X and Y and define the joint (probability) mass function fX,Y (x, y) = P {X = x, Y = y}.
Example 1. For X and Y each having finite range, we can display the mass function in a table.
         x
       0     1     2     3     4
   0  0.02  0.02  0     0.10  0
   1  0.02  0.04  0.10  0     0
y  2  0.02  0.06  0     0.10  0
   3  0.02  0.08  0.10  0     0.05
   4  0.02  0.10  0     0.10  0.05
As with univariate random variables, we compute probabilities by adding the appropriate entries in the table:

P{(X, Y) ∈ A} = Σ_{(x,y) ∈ A} fX,Y(x, y).
Exercise 2. Verify that Σ_{x,y} fX,Y(x, y) = 1 for the mass function above.

The distribution of an individual random variable is called the marginal distribution. The marginal mass function for X is found by summing over the appropriate column, and the marginal mass function for Y is found by summing over the appropriate row:
fX(x) = Σ_y fX,Y(x, y),    fY(y) = Σ_x fX,Y(x, y).
The marginal mass functions for the example above are
x     fX(x)        y     fY(y)
0     0.10         0     0.14
1     0.30         1     0.16
2     0.20         2     0.18
3     0.30         3     0.25
4     0.10         4     0.27
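These computations are easy to check numerically. Here is a short sketch (using numpy; storing the table with rows indexed by y and columns by x is an assumption of this illustration, as is the sample event A):

import numpy as np

# Joint mass function from Example 1; rows are y = 0,...,4, columns are x = 0,...,4.
f = np.array([
    [0.02, 0.02, 0.00, 0.10, 0.00],
    [0.02, 0.04, 0.10, 0.00, 0.00],
    [0.02, 0.06, 0.00, 0.10, 0.00],
    [0.02, 0.08, 0.10, 0.00, 0.05],
    [0.02, 0.10, 0.00, 0.10, 0.05],
])

print(f.sum())        # 1.0 up to rounding, as Exercise 2 asks you to verify
print(f.sum(axis=0))  # fX: [0.10 0.30 0.20 0.30 0.10], each column summed over y
print(f.sum(axis=1))  # fY: [0.14 0.16 0.18 0.25 0.27], each row summed over x

# P{(X, Y) in A} for the sample event A = {X = Y}: add the diagonal entries.
print(np.trace(f))    # 0.11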
Exercise 3. Give two pairs of random variables with different joint mass functions but the same marginal mass functions.
The definition of expectation in the case of a finite sample space S is a straightforward generalization of the univariate case:
Eg(X, Y) = Σ_{s ∈ S} g(X(s), Y(s)) P{s}.
From this formula, we see that expectation is again a positive linear functional. Using the distributive property, we have the formula
Eg(X, Y) = Σ_{x,y} g(x, y) fX,Y(x, y).
Exercise 4. Compute EXY in the example above.
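One way to check your answer numerically (a numpy sketch, reusing the table layout assumed above):

import numpy as np

# Joint mass function from Example 1; rows are y = 0,...,4, columns are x = 0,...,4.
f = np.array([
    [0.02, 0.02, 0.00, 0.10, 0.00],
    [0.02, 0.04, 0.10, 0.00, 0.00],
    [0.02, 0.06, 0.00, 0.10, 0.00],
    [0.02, 0.08, 0.10, 0.00, 0.05],
    [0.02, 0.10, 0.00, 0.10, 0.05],
])
v = np.arange(5)

# EXY = sum over x and y of x*y*fX,Y(x, y); np.outer(v, v)[y, x] = y*x.
print(np.sum(np.outer(v, v) * f))   # 4.8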
For continuous random variables, we have the notion of the joint (probability) density function
fX,Y (x, y)∆x∆y ≈ P {x < X ≤ x + ∆x, y < Y ≤ y + ∆y}.
We can write this in integral form as

P{(X, Y) ∈ A} = ∫∫_A fX,Y(x, y) dy dx.
The basic properties of the joint density function are

fX,Y(x, y) ≥ 0 for all x, y and ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dy dx = 1.

[Figure 1: Graph of the density fX,Y(x, y) = 4(xy + x + y)/5, 0 ≤ x, y ≤ 1.]
Example 5. Let (X, Y) have joint density

fX,Y(x, y) = c(xy + x + y) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Then

∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(x, y) dy dx = ∫_0^1 ∫_0^1 c(xy + x + y) dy dx
= c ∫_0^1 [ (1/2)xy^2 + xy + (1/2)y^2 ]_{y=0}^{y=1} dx = c ∫_0^1 ( (3/2)x + 1/2 ) dx
= c [ (3/4)x^2 + (1/2)x ]_0^1 = (5/4)c,

and c = 4/5.
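This normalization is easy to confirm symbolically (a sketch assuming the sympy library):

import sympy as sp

x, y, c = sp.symbols('x y c', positive=True)

# Integrate c*(xy + x + y) over the unit square and solve for the constant c.
total = sp.integrate(c * (x*y + x + y), (y, 0, 1), (x, 0, 1))
print(total)                          # 5*c/4
print(sp.solve(sp.Eq(total, 1), c))   # [4/5]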
Moreover, the region {0 ≤ y ≤ x ≤ 1} is the event {Y ≤ X}, and so

P{Y ≤ X} = (4/5) ∫_0^1 ∫_0^x (xy + x + y) dy dx = (4/5) ∫_0^1 [ (1/2)xy^2 + xy + (1/2)y^2 ]_{y=0}^{y=x} dx
= (4/5) ∫_0^1 ( (1/2)x^3 + (3/2)x^2 ) dx = (4/5) [ (1/8)x^4 + (1/2)x^3 ]_0^1 = (4/5)(5/8) = 1/2.
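The same computation in sympy (again a sketch; the triangle 0 ≤ y ≤ x ≤ 1 is the event {Y ≤ X}):

import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(4, 5) * (x*y + x + y)

# P{Y <= X}: integrate the joint density over the triangle 0 <= y <= x <= 1.
print(sp.integrate(f, (y, 0, x), (x, 0, 1)))   # 1/2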
The joint cumulative distribution function is defined as FX,Y (x, y) = P {X ≤ x, Y ≤ y}. For the case of continuous random variables, we have
FX,Y(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} fX,Y(s, t) ds dt.
By two applications of the fundamental theorem of calculus, we find that
(∂/∂y) FX,Y(x, y) = ∫_{−∞}^{x} fX,Y(s, y) ds and (∂^2/∂x∂y) FX,Y(x, y) = fX,Y(x, y).
Example 6. For the density introduced above,
FX,Y(x, y) = (4/5) ∫_0^x ∫_0^y (st + s + t) dt ds = (4/5) ∫_0^x [ (1/2)st^2 + st + (1/2)t^2 ]_{t=0}^{t=y} ds
= (4/5) ∫_0^x ( (1/2)sy^2 + sy + (1/2)y^2 ) ds = (4/5) [ (1/4)s^2y^2 + (1/2)s^2y + (1/2)sy^2 ]_{s=0}^{s=x}
= (1/5) ( x^2y^2 + 2x^2y + 2xy^2 ), 0 ≤ x, y ≤ 1.
Notice that FX,Y(1, 1) = 1. For example, P{X ≤ 1/2, Y ≤ 1/2} = FX,Y(1/2, 1/2) = 9/80.
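Both the antiderivative and these boundary values can be double-checked symbolically (a sympy sketch):

import sympy as sp

x, y, s, t = sp.symbols('x y s t')
f = sp.Rational(4, 5) * (s*t + s + t)

# Build F by integrating the density, then differentiate to recover the density.
F = sp.integrate(f, (t, 0, y), (s, 0, x))
print(sp.expand(F))                          # x**2*y**2/5 + 2*x**2*y/5 + 2*x*y**2/5
print(sp.simplify(sp.diff(F, x, y) - f.subs({s: x, t: y})))   # 0
print(F.subs({x: 1, y: 1}))                  # 1
print(F.subs({x: sp.Rational(1, 2), y: sp.Rational(1, 2)}))   # 9/80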
The joint cumulative distribution function is right continuous in each variable. It has limits at −∞ and +∞ similar to the univariate cumulative distribution function.
Thus,

FX(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fX,Y(s, t) dt ds and FY(y) = ∫_{−∞}^{y} ∫_{−∞}^{∞} fX,Y(s, t) ds dt.
Now use the fundamental theorem of calculus to obtain the marginal densities.
fX(x) = F′X(x) = ∫_{−∞}^{∞} fX,Y(x, t) dt and fY(y) = F′Y(y) = ∫_{−∞}^{∞} fX,Y(s, y) ds.
Example 7. For the example density above, the marginal densities are

fX(x) = (4/5) ∫_0^1 (xt + x + t) dt = (4/5) [ (1/2)xt^2 + xt + (1/2)t^2 ]_{t=0}^{t=1} = (4/5) ( (3/2)x + 1/2 ), 0 ≤ x ≤ 1,

and, by the symmetry of fX,Y in its two arguments,

fY(y) = (4/5) ( (3/2)y + 1/2 ), 0 ≤ y ≤ 1.
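The marginal computation is a one-line symbolic integral (sympy sketch):

import sympy as sp

x, t = sp.symbols('x t')

# fX(x): integrate the joint density over the other variable, t from 0 to 1.
fX = sp.integrate(sp.Rational(4, 5) * (x*t + x + t), (t, 0, 1))
print(fX)   # 6*x/5 + 2/5, which equals (4/5)*((3/2)*x + 1/2)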
The formula for expectation for jointly continuous random variables is derived by discretizing X and Y, creating a double Riemann sum, and taking a limit. This yields the identity
Eg(X, Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) fX,Y(x, y) dy dx.
Exercise 8. Compute EXY in the example above.
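As with Exercise 4, a symbolic check is straightforward to set up (a sympy sketch for the density of Example 5):

import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(4, 5) * (x*y + x + y)

# EXY: double integral of x*y*fX,Y(x, y) over the unit square.
print(sp.integrate(x * y * f, (y, 0, 1), (x, 0, 1)))   # 16/45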
As in the one-dimensional case, we can give a comprehensive formula for expectation using Riemann-Stieltjes integrals:
Eg(X, Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) dFX,Y(x, y).
These can be realized as the limit of Riemann-Stieltjes sums

S(g, F) = Σ_{i=1}^{m} Σ_{j=1}^{n} g(xi, yj) ∆FX,Y(xi, yj).

Here, ∆FX,Y(xi, yj) = P{xi < X ≤ xi + ∆x, yj < Y ≤ yj + ∆y}.
Exercise 9. Show that

P{xi < X ≤ xi + ∆x, yj < Y ≤ yj + ∆y} = FX,Y(xi + ∆x, yj + ∆y) − FX,Y(xi, yj + ∆y) − FX,Y(xi + ∆x, yj) + FX,Y(xi, yj).
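To see the Riemann-Stieltjes sum in action (a numpy sketch; the grid size n and the choice g(x, y) = xy are assumptions of this illustration), approximate EXY for the density of Example 5, computing ∆FX,Y from the identity of Exercise 9:

import numpy as np

# FX,Y for the density of Example 5, as computed in Example 6 (valid on the unit square).
def F(x, y):
    return (x**2 * y**2 + 2 * x**2 * y + 2 * x * y**2) / 5

n = 200
pts = np.linspace(0.0, 1.0, n + 1)
xi, dx = pts[:-1], 1.0 / n
X, Y = np.meshgrid(xi, xi, indexing="ij")

# Delta F over each grid cell, via the inclusion-exclusion identity of Exercise 9.
dF = F(X + dx, Y + dx) - F(X, Y + dx) - F(X + dx, Y) + F(X, Y)

# Riemann-Stieltjes sum S(g, F) with g(x, y) = x*y sampled at lower-left corners.
print(np.sum(X * Y * dF))   # ≈ 0.35, approaching EXY = 16/45 ≈ 0.3556 as n grows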