
Lecture Slides on Modeling and Simulation

Lecture: Probability Theory

Random Events

Elementary Events: Those events that we cannot further analyze into simpler events. For example, the results (HEAD or TAIL) of flipping a coin are elementary events.

Compound or Composite Events: Those events that are defined from a number of elementary events. For example, the results of flipping two coins.

Random events occur in nature. For example,

  • scattering of atoms from atoms,
  • loss of population,
  • percolation, rain,
  • random arrivals or departures in queues.

Probability

Probability is a measure of the likelihood that some event will occur. It always lies between 0 and 1.

A probability distribution of a random variable describes the probabilities associated with each possible value of the variable.

If an event Ei occurs in fi of n equally likely outcomes, then its probability is

p(Ei) = fi / n.

For a continuous variable, a probability density function is used instead.
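For instance, a short MATLAB sketch (the fair-die event below is an assumed example, not from the slides) estimates a probability as the relative frequency fi / n:

% Estimate p(E) = f_i / n for the event "roll a 6" with a fair die
n = 10000;                  % number of trials
rolls = randi(6, n, 1);     % simulated die rolls, each outcome 1..6
f = sum(rolls == 6);        % number of outcomes favourable to the event
p_est = f / n;              % relative-frequency estimate of the probability
fprintf('Estimated p = %.3f (exact value 1/6 = %.3f)\n', p_est, 1/6);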

Probability Axioms:

  • The probability of an event E must be between 0 and 1.
  • The sum of the probabilities over the sample space of an experiment is one.
  • For mutually exclusive events A, B, and C, P(A ∪ B ∪ C) = P(A) + P(B) + P(C).

Random Events

  • Given an experiment with a countable set of random outcomes E1, E2, E3, ..., En,
  • there is associated with each possible outcome Ek a number pk, called its probability, such that 0 ≤ pk ≤ 1.
  • If the kth outcome never occurs, then its probability is zero (pk = 0).
  • If the kth outcome is sure to occur, then its probability is one (pk = 1).

Another notation for the probability of event Ek is P(Ek) = pk.

A property: P{Ek and/or Ej} ≤ pk + pj.

Probability

Two events Ei and Ej are mutually exclusive if and only if the occurrence of Ei implies that Ej does not occur.

If two events Ek and Ej are mutually exclusive:

P{Ek and Ej} = 0,
P{Ek or Ej} = pk + pj.

  1. When a whole class of events is mutually exclusive for all i and j, then the set of all possible events (an exhaustive set) has the property Σ_j pj = 1.
  2. The probability of a compound event built from two elementary events is a function of the probabilities of both the first and the second event. It is called the joint probability.

Conditional Probability

A conditional probability is needed when we have partial information concerning the outcome of an experiment.

A conditional probability P(E | F) is the probability that event E occurs given that event F has occurred:

P(E | F) = P(E ∩ F) / P(F),   P(F) ≠ 0.

Here P(E ∩ F) represents the joint probability that both E and F occur together, and P(F) is the probability that event F occurs.

We can rearrange the equation as

P(E ∩ F) = P(F) P(E | F).
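As an illustrative check (the two-dice events E and F below are assumed for the example), the defining ratio can be estimated by simulation in MATLAB:

% Check P(E | F) = P(E and F) / P(F) with two fair dice
% E: the sum of the two dice is 7;  F: the first die shows 3
n  = 100000;
d1 = randi(6, n, 1);
d2 = randi(6, n, 1);
E  = (d1 + d2 == 7);
F  = (d1 == 3);
P_F  = mean(F);                % estimate of P(F)
P_EF = mean(E & F);            % estimate of the joint probability P(E and F)
P_E_given_F = P_EF / P_F;      % estimated conditional probability
fprintf('P(E|F) estimated = %.3f, exact = %.3f\n', P_E_given_F, 1/6);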

Independent Events

When the occurrence of one event does not affect whether or not some other event happens, the events are called independent.

If events are independent, then knowing that one event has occurred does not change our degree of belief that the other event occurs.

Two events E and F are said to be independent if and only if either of the following holds:

P(E ∩ F) = P(E) P(F),
P(E) = P(E | F).

This can be extended to k events as

P(E1 ∩ E2 ∩ E3 ∩ ... ∩ Ek) = ∏_(i=1)^(k) P(Ei).

Expected or Mean Value

The expectation of a random variable x is its stochastic mean value and is defined as

E(x) = Σ_j pj xj.

It is common to write this as < x >.

Consider a real-valued function g(xi) = gi, where the xi are countable elementary outcomes with probabilities pi.

If xi is a random variable, then g(xi) is also a random variable. The expected value of g(x) is

E(g(x)) = < g(x) > = Σ_j pj g(xj).

Estimating Average Values

Example:

A coin is flipped. Assign 1 to the event of getting a head and 0 to the event of getting a tail. Using the two functions g(x) = 1 + 3x and h(x) = 1 + x, let us find the expected values.

Events   pi    xi   g(x)   h(x)
TAIL     1/2   0    1      1
HEAD     1/2   1    4      2

< x > = 1/2,   < g(x) > = 5/2,   < h(x) > = 3/2.
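A minimal MATLAB sketch reproducing these averages from the table (variable names are chosen here; h(x) = 1 + x is inferred from the tabulated values):

% Expected values for the coin-flip example
p = [1/2 1/2];          % probabilities of TAIL and HEAD
x = [0 1];              % values assigned to TAIL and HEAD
g = 1 + 3*x;            % g(x) = 1 + 3x  ->  [1 4]
h = 1 + x;              % h(x) = 1 + x   ->  [1 2]
mean_x = sum(p .* x);   % <x>    = 1/2
mean_g = sum(p .* g);   % <g(x)> = 5/2
mean_h = sum(p .* h);   % <h(x)> = 3/2
fprintf('<x> = %.2f, <g> = %.2f, <h> = %.2f\n', mean_x, mean_g, mean_h);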

Average Value and nth Moment

From the definition of expected value of a function,

< constant > = constant.

For two functions g and h with two constants a and b,

< a g(x) + b h(x) > = a < g(x) > + b < h(x) >.

For a linear function g(x), < g(x) > = g(< x >).

The nth moment of x is defined as < x^n > = Σ_j pj xj^n.

The first moment of x is < x > = Σ_j pj xj.

The second moment of x is < x^2 > = Σ_j pj xj^2.

Moments & Variance

The Central Moments

The central moments of x are defined as

< (x - <x>)^n > = Σ_i pi (xi - <x>)^n.

The second central moment has a particular meaning:

< (x - <x>)^2 > = Σ_i pi (xi - <x>)^2 = <x^2> - <x>^2.

This is also called the variance of x:

var{x} = <x^2> - <x>^2.

The standard deviation of x is σ = sqrt(var{x}).
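The identity var{x} = <x^2> - <x>^2 can be checked directly; a small MATLAB sketch with an arbitrary (assumed) three-point distribution:

% Verify var{x} = <x^2> - <x>^2 for a discrete distribution
p = [0.2 0.5 0.3];                    % probabilities (sum to 1)
x = [1 2 4];                          % corresponding values
mean_x  = sum(p .* x);                % first moment <x>
mean_x2 = sum(p .* x.^2);             % second moment <x^2>
var_def = sum(p .* (x - mean_x).^2);  % second central moment
var_alt = mean_x2 - mean_x^2;         % <x^2> - <x>^2
fprintf('central-moment form: %.4f, moment form: %.4f\n', var_def, var_alt);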

Variance

For a general function g(x), the variance is

var{g(x)} = < [g(x) - <g(x)>]^2 > = Σ_j pj g(xj)^2 - ( Σ_j pj g(xj) )^2 = <g^2> - <g>^2.

Two random variables g(x) and h(x) are independent if they derive from independent events. When the two are independent,

< g(x) h(x) > = < g(x) > < h(x) >.

For independent g and h,

var{a g(x) + b h(x)} = a^2 var{g(x)} + b^2 var{h(x)}.
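A quick MATLAB sanity check of this rule, using independently generated samples for g and h (the constants and distributions are assumed):

% Check var{a*g + b*h} = a^2 var{g} + b^2 var{h} for independent g, h
n = 100000;
g = rand(n, 1);                      % independent samples for g(x)
h = randn(n, 1);                     % independent samples for h(x)
a = 2;  b = 3;
lhs = var(a*g + b*h);                % variance of the combination
rhs = a^2 * var(g) + b^2 * var(h);   % rule for independent variables
fprintf('lhs = %.3f, rhs = %.3f\n', lhs, rhs);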

Covariance & Correlation

The covariance is a measure of the dependence of two random variables x and y:

cov{x, y} = < xy > - < x > < y >.

Zero covariance does not imply independence of random variables. Also,

cov{x, x} = <x^2> - <x>^2 = var{x}.

Another quantity related to covariance is the correlation coefficient:

ρ(x, y) = cov{x, y} / sqrt( var{x} var{y} ).

It is equal to zero when x and y are independent, and its value lies between -1 and +1. Monte Carlo calculations try to take advantage of negative correlation as a means of reducing the variance.
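A short MATLAB sketch estimating covariance and the correlation coefficient from samples (the correlated test data below are assumed for illustration):

% Sample-based estimates of cov{x,y} and rho(x,y)
n = 10000;
x = rand(n, 1);                             % uniform random samples
y = 2*x + 0.5*randn(n, 1);                  % correlated with x, plus noise
cov_xy = mean(x.*y) - mean(x)*mean(y);      % <xy> - <x><y>
rho    = cov_xy / sqrt(var(x,1)*var(y,1));  % correlation coefficient
fprintf('cov = %.3f, rho = %.3f\n', cov_xy, rho);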

Binomial Probability Distribution Function

Consider two events E0 and E1 that are mutually exclusive and exhaustive:

P{E1} = p,   x = 1,
P{E0} = 1 - p,   x = 0.

The expected values of the real number x and of its square are

E(x) = 1·p + 0·(1 - p) = p,   E(x^2) = p.

The variance of x is

var{x} = <x^2> - <x>^2 = p - p^2 = p(1 - p).

Suppose there are N independent samples of these events, and each has either a 0 or a 1 outcome. Then the probability of x successes out of N (the binomial pdf) is

P{X = x} = C(N, x) p^x (1 - p)^(N - x).

The average or mean of x is <x> = Np.

The variance of x is var{x} = Np(1 - p).
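A brief MATLAB sketch evaluating this binomial pmf and checking its mean and variance (the values of N and p are assumed):

% Binomial pmf P{X = x} = C(N,x) p^x (1-p)^(N-x) and its moments
N = 10;  p = 0.3;
x = 0:N;
P = zeros(size(x));
for k = x
    P(k+1) = nchoosek(N, k) * p^k * (1-p)^(N-k);
end
mean_x = sum(x .* P);                  % should equal N*p = 3
var_x  = sum(x.^2 .* P) - mean_x^2;    % should equal N*p*(1-p) = 2.1
fprintf('mean = %.2f, variance = %.2f\n', mean_x, var_x);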

Geometric pdf

Suppose we have to carry out a certain experiment repeatedly and independently, where there are only two outcomes, failure or success. If the outcome is failure, the experiment is repeated; otherwise we stop.

Now x is the number of experiments until success appears. Then

P{X = x} = q^(x-1) p,   x = 1, 2, 3, ...,

where q is the probability of failure in one experiment and p is the probability of success in one experiment.

The mean of x is

< x > = Σ_x x q^(x-1) p = p / (1 - q)^2 = 1/p.

The variance of x is var{x} = (1 - p)/p^2.
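A rough MATLAB check of the geometric mean and variance by simulating repeated trials (the value of p is assumed):

% Simulate the number of independent trials until the first success
p = 0.25;  q = 1 - p;
nruns = 20000;
counts = zeros(nruns, 1);
for i = 1:nruns
    x = 1;
    while rand > p          % repeat the experiment while it fails
        x = x + 1;
    end
    counts(i) = x;
end
fprintf('mean: %.2f (theory 1/p = %.2f)\n', mean(counts), 1/p);
fprintf('variance: %.2f (theory (1-p)/p^2 = %.2f)\n', var(counts), q/p^2);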

The Poisson Distribution

A random variable x is said to follow the Poisson distribution when

P{X = n} = e^(-λt) (λt)^n / n!,   n = 0, 1, 2, 3, ...,

where λ is a parameter of this distribution. It is easy to find that

< x > = λt   and   var{x} = λt.

This distribution is fundamental in the theory of probability and stochastic processes. It is of great use in applications such as radioactive decay, queuing service systems and similar systems.
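A small MATLAB sketch evaluating the Poisson probabilities and their sum, mean and variance for an assumed value of λt:

% Poisson probabilities P{X = n} = exp(-lt) (lt)^n / n!
lt = 4;                                % assumed value of lambda*t
n  = 0:50;                             % enough terms for the tail to be negligible
P  = exp(-lt) .* lt.^n ./ factorial(n);
fprintf('sum of P = %.4f\n', sum(P));                           % ~1
fprintf('mean <x> = %.4f\n', sum(n .* P));                      % ~lt
fprintf('var{x}   = %.4f\n', sum(n.^2 .* P) - sum(n .* P)^2);   % ~lt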

Continuous Random Variables

Consider the scattering of a photon by an atom. The angle at which the photon is scattered takes continuous values between 0 and 180°.

Let x be a real continuous random variable that varies from minus infinity to plus infinity. Then the cumulative distribution is defined as

F(x) = P{ a random selection of X gives a value less than x }.

The probability density function (pdf) may be defined as

f(x) = dF(x)/dx ≥ 0.

As an example (the uniform distribution on [0, 1]):

  • F(x) = 0 for all x < 0; no sampling is done there, so pdf = 0.
  • F(x) = x for all 0 < x < 1; pdf = 1 in this interval.
  • F(x) = 1 for all x > 1; pdf = 0.

[Figure: the cumulative distribution F(x) plotted against x, with the three regions marked.]

MATLAB Program to Generate Uniform pdf & cdf

% How to generate a uniform distribution
% The domain is generated
x = -1:0.1:11;
% now the pdf and cdf for the x values (Statistics Toolbox functions)
pdf = unifpdf(x, 0, 10);
cdf = unifcdf(x, 0, 10);
subplot(1,2,1), plot(x, pdf)
title('pdf')
xlabel('X'), ylabel('f(x)')
axis([-1 11 0 0.2])
axis square
subplot(1,2,2), plot(x, cdf)
title('cdf')
xlabel('X'), ylabel('F(x)')
axis([-1 11 0 1.1])
axis square

Expectations of Continuous Random Variables

The mean value of a continuous random variable in an interval [a, b] is

E(x) = ∫_a^b x dF(x) = ∫_a^b x f(x) dx,

where f(x) is the probability density function (pdf) for x. The normalization condition is

∫_(-∞)^(+∞) f(x) dx = F(∞) = 1.

The expected value and variance of any function g(x) with this pdf are

E(g) = ∫_a^b g(x) f(x) dx,

var{g} = E(g^2) - E(g)^2.
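As a numerical illustration, these integrals can be approximated with trapz; the sketch below assumes a uniform pdf on [a, b]:

% Numerical check of E(x) and var{x} for a uniform pdf on [a, b]
a = 0;  b = 10;
x = linspace(a, b, 1001);
f = ones(size(x)) / (b - a);     % uniform pdf, integrates to 1
norm_check = trapz(x, f);        % normalization, ~1
E_x   = trapz(x, x .* f);        % mean, ~(a+b)/2 = 5
E_x2  = trapz(x, x.^2 .* f);     % second moment
var_x = E_x2 - E_x^2;            % variance, ~(b-a)^2/12 = 8.33
fprintf('norm = %.3f, E(x) = %.3f, var = %.3f\n', norm_check, E_x, var_x);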