Random Variables - Probability - Solved Exam, Exams for Probability. Punjab Engineering College
Uploaded by davdas900, 27 February 2013


Probability: Problem Set 6 Fall 2009

Instructor: W. D. Gillam

Due Nov. 6, start of class

Instructions. Print your name in the upper right corner of the paper and write “Problem Set 6” on the first line on the left. Skip a few lines. When you finish this, indicate on the second line on the left the amount of time you spent on this assignment and rate its difficulty on a scale of 1 − 5 (1 = easy, 5 = hard).

(1) Let X, Y be Cauchy distributed independent random variables. Calculate the density function for X + Y .

Solution. The density function for the Cauchy distribution is

\[ f(x) = \frac{1}{\pi(1 + x^2)}, \]

so the density for X + Y is given by the convolution product, which can be integrated by partial fractions:

\[
(f * f)(x) = \int_{-\infty}^{\infty} f(y)\, f(x - y)\, dy
= \frac{1}{\pi^2} \int_{-\infty}^{\infty} \frac{1}{1 + y^2} \cdot \frac{1}{1 + (x - y)^2}\, dy
\]
\[
= \frac{1}{\pi^2} \left[ \frac{x\bigl(\arctan y - \arctan(x - y)\bigr) + \ln \frac{1 + y^2}{1 + (x - y)^2}}{4x + x^3} \right]_{-\infty}^{\infty}
= \frac{2}{\pi(4 + x^2)} = \frac{1}{2}\, f\!\left(\frac{x}{2}\right).
\]

Note that the terms in the logarithm approach 1 as y → ±∞, so they drop out, and we have used

\[ \lim_{y \to \pm\infty} \arctan y = \pm\frac{\pi}{2}. \]

Notice that the density for X + Y is the same as the density for 2X in this case. This is not generally true, as we know from the Central Limit Theorem; the Cauchy distribution shows the necessity of the assumptions of finite expected value and variance (or at least some assumptions along these lines...) in the Central Limit Theorem. For example, when X and Y are independent with the uniform distribution, X + Y has a Simpson (triangular) distribution, even though 2X again has a uniform distribution.
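As a sanity check on the closed form 2/(π(4 + x²)), a direct numerical integration of the convolution (a quick sketch, not part of the original problem set) agrees to several decimal places:

```python
import math

def cauchy(x):
    # standard Cauchy density f(x) = 1 / (pi * (1 + x^2))
    return 1.0 / (math.pi * (1.0 + x * x))

def conv_at(x, lim=1000.0, n=200000):
    # midpoint Riemann sum for (f * f)(x) over [-lim, lim];
    # the integrand decays like y^(-4), so the truncated tails are negligible
    h = 2.0 * lim / n
    total = 0.0
    for i in range(n):
        y = -lim + (i + 0.5) * h
        total += cauchy(y) * cauchy(x - y)
    return total * h

for x in (0.0, 1.0, 3.0):
    exact = 2.0 / (math.pi * (4.0 + x * x))  # the closed form derived above
    print(f"x = {x}: numeric = {conv_at(x):.6f}, exact = {exact:.6f}")
```

The integration window and step count are arbitrary choices; any window wide enough to capture the slowly decaying Cauchy tails gives the same agreement.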

(2) For n ∈ Z≥0, show that

\[ \Gamma\!\left(n + \frac{1}{2}\right) = \frac{(2n - 1)!!}{2^n} \sqrt{\pi}. \]

Hint: When n = 0, this is the Gaussian integral (note (−1)!! = 1 by convention). Induct on n and use the functional equation Γ(z + 1) = zΓ(z).


Solution. By induction, we compute, using the functional equation:

\[
\Gamma\!\left(n + 1 + \frac{1}{2}\right) = \left(n + \frac{1}{2}\right) \Gamma\!\left(n + \frac{1}{2}\right)
= \left(n + \frac{1}{2}\right) \frac{(2n - 1)!!}{2^n} \sqrt{\pi}
= \frac{(2n + 1)(2n - 1)!!}{2^{n+1}} \sqrt{\pi}
= \frac{(2n + 1)!!}{2^{n+1}} \sqrt{\pi}.
\]
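The identity can also be spot-checked numerically with the standard library's math.gamma (a quick sketch, not part of the assignment):

```python
import math

def double_factorial(k):
    # (2n - 1)!!, with the convention (-1)!! = 1
    result = 1
    while k > 1:
        result *= k
        k -= 2
    return result

for n in range(8):
    lhs = math.gamma(n + 0.5)
    rhs = double_factorial(2 * n - 1) / 2 ** n * math.sqrt(math.pi)
    print(f"n = {n}: Gamma(n + 1/2) = {lhs:.10f}, (2n-1)!!/2^n * sqrt(pi) = {rhs:.10f}")
```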

(3) Let X be a continuous random variable with cumulative distribution function F(x) and density function f(x). Assume

\[ \int_{-\infty}^{\infty} |x|\, f(x)\, dx \]

exists.¹ Show that

\[ \lim_{x \to +\infty} x(1 - F(x)) = 0, \qquad \lim_{x \to -\infty} x F(x) = 0. \]

Show that

\[ E(X) := \int_{-\infty}^{\infty} x f(x)\, dx = \int_0^{\infty} (1 - F(x))\, dx - \int_{-\infty}^{0} F(x)\, dx. \]

Draw a typical such F and shade the regions whose areas are given by these integrals. You may want to think about the analog of this when the expected value is replaced by the variance.

Solution. Since

\[ \int_{-\infty}^{\infty} |x|\, f(x)\, dx \]

exists, we must have:

\[ \lim_{x \to \infty} \int_x^{\infty} y f(y)\, dy = 0, \qquad \lim_{x \to -\infty} \int_{-\infty}^{x} -y f(y)\, dy = 0. \]

For positive x, we have

\[ 0 \le x(1 - F(x)) = x \int_x^{\infty} f(y)\, dy \le \int_x^{\infty} y f(y)\, dy. \]

¹ Technically speaking, one should assume this exists to even speak of the expected value, though I have not really emphasized this much in class except when this arose in connection with the Cauchy distribution.


Taking the limit x → ∞ and applying the Squeeze Theorem to the above inequality yields

\[ \lim_{x \to \infty} x(1 - F(x)) = 0. \]

Similarly, for x < 0, we have

\[ 0 \ge x F(x) = x \int_{-\infty}^{x} f(y)\, dy \ge \int_{-\infty}^{x} y f(y)\, dy. \]

Taking the limit x → −∞ and applying the Squeeze Theorem to this inequality yields

\[ \lim_{x \to -\infty} x F(x) = 0 \]

(the minus sign is no big deal since minus signs commute with limits and −0 = 0). For the second statement, use integration by parts to write:

\[
\int_0^x y f(y)\, dy = \int_0^x y\, dF(y)
= \bigl[ y F(y) \bigr]_0^x - \int_0^x F(y)\, dy
= x F(x) - \int_0^x F(y)\, dy
= -x(1 - F(x)) + \int_0^x (1 - F(y))\, dy,
\]

and similarly:

\[
\int_{-x}^0 y f(y)\, dy = \int_{-x}^0 y\, dF(y)
= \bigl[ y F(y) \bigr]_{-x}^0 - \int_{-x}^0 F(y)\, dy
= x F(-x) - \int_{-x}^0 F(y)\, dy.
\]

Now add these two and take the limit x → ∞. Two of the four terms drop out by the previous part.
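The tail formula for E(X) can be checked numerically against a concrete F; here a normal CDF with mean 0.7 and variance 1 (an arbitrary example of my choosing, using the standard library's math.erf), for which the formula should recover the mean:

```python
import math

MU = 0.7  # mean of the example distribution (arbitrary choice)

def F(x):
    # CDF of a normal with mean MU and variance 1, via the error function
    return 0.5 * (1.0 + math.erf((x - MU) / math.sqrt(2.0)))

def integral(g, a, b, n=100000):
    # midpoint Riemann sum of g over [a, b]
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

lim = 12.0  # normal tails beyond |x| = 12 are negligible
expectation = integral(lambda x: 1.0 - F(x), 0.0, lim) - integral(F, -lim, 0.0)
print(f"E(X) from the tail formula: {expectation:.6f} (mean is {MU})")
```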

(4) Let f(x) be the density function for... (a) the standard normal distribution; (b) the gamma distribution with parameters α, β > 0; (c) the beta distribution with parameters α, β > 0; (d) Student's distribution. Show that there are constants a, b, c, d ∈ R (depending on the parameters α, β for the beta and gamma distributions; you should calculate these constants in each case) such that

\[ \frac{d}{dx} \ln f(x) = \frac{x + a}{b + cx + dx^2}. \]


Solution. We’ll just do the standard normal \( f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2} \) to give the idea:

\[ \frac{d}{dx} \ln f(x) = \frac{f'(x)}{f(x)} = -x. \]

This is of the above form with a = c = d = 0 and b = −1.

(5) Let X, Y be independent random variables whose expected values and variances exist. Prove that

\[ V(XY) = V(X)V(Y) + E(X)^2 V(Y) + E(Y)^2 V(X). \]

Solution. This is a straightforward computation with moments: by independence, E((XY)²) = E(X²)E(Y²) and E(XY) = E(X)E(Y), so

\[ V(XY) = E(X^2)E(Y^2) - E(X)^2 E(Y)^2 = \bigl(V(X) + E(X)^2\bigr)\bigl(V(Y) + E(Y)^2\bigr) - E(X)^2 E(Y)^2, \]

which expands to V(X)V(Y) + E(X)²V(Y) + E(Y)²V(X).
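Since the full computation is left to the reader, here is a deterministic check of the identity on two small independent discrete distributions (example values of my choosing):

```python
from itertools import product

# independent discrete random variables, given as {value: probability}
X = {0: 0.2, 1: 0.5, 3: 0.3}
Y = {-1: 0.4, 2: 0.6}

def E(dist, g=lambda v: v):
    # expectation of g(V) for a discrete distribution
    return sum(p * g(v) for v, p in dist.items())

def V(dist):
    m = E(dist)
    return E(dist, lambda v: (v - m) ** 2)

# moments of the product XY from the joint (product) distribution
E_XY = sum(px * py * x * y for (x, px), (y, py) in product(X.items(), Y.items()))
E_X2Y2 = sum(px * py * (x * y) ** 2 for (x, px), (y, py) in product(X.items(), Y.items()))
V_XY = E_X2Y2 - E_XY ** 2

rhs = V(X) * V(Y) + E(X) ** 2 * V(Y) + E(Y) ** 2 * V(X)
print(f"V(XY) = {V_XY}, formula gives {rhs}")
```

Independence enters only through forming the joint distribution as the product of the marginals; with dependent variables the two sides would generally differ.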
