Lecture: Probability Theory
Elementary Events: Those events that we cannot further analyze into simpler events. For example, the results (HEAD or TAIL) of flipping a coin are elementary events.
Compound or Composite Events: Those events that are defined from a number of elementary events. For example, the results of flipping two coins.
Random events occur in nature; for example, radioactive decay happens at random times.
Probability is a measure of the likelihood that some event will occur. It always lies in the range 0 to 1.
A probability distribution of a random variable describes the probabilities associated with each possible value of the variable.
If an event $E_i$ occurs in $f_i$ of $n$ equally likely outcomes, then its probability is
$$p(E_i) = f_i / n$$
When the variable is continuous, a probability density function is used instead.
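A minimal MATLAB sketch of the frequency definition above (MATLAB is used later in these slides); the die experiment and the sample size n are illustrative choices, not part of the original slides:

% Estimate p(E) = f_i / n for the event "die shows 6"
n = 100000;               % number of trials (arbitrary choice)
rolls = randi(6, n, 1);   % n fair die rolls, values 1..6
f = sum(rolls == 6);      % f_i: number of outcomes in the event
p_est = f / n;            % relative frequency, approaches 1/6
fprintf('Estimated p = %.4f (exact 1/6 = %.4f)\n', p_est, 1/6);

As n grows, the relative frequency converges to the exact probability 1/6.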
Probability Axioms:
Another notation for the probability of event $E_k$ is $p_k = P\{E_k\}$. For a set of events $E_1, E_2, E_3, \ldots, E_n$:
$$p_k \ge 0, \qquad \sum_k p_k = 1$$
A property:
$$P\{E_k \text{ and/or } E_j\} \le p_k + p_j$$
Two events $E_k$ and $E_j$ are mutually exclusive if and only if the occurrence of $E_k$ implies that $E_j$ does not occur. If two events $E_k$ and $E_j$ are mutually exclusive:
$$P\{E_k \text{ and } E_j\} = 0$$
$$P\{E_k \text{ or } E_j\} = p_k + p_j$$
A conditional probability is defined for dependent events. It is needed when we have partial information concerning the experiment.
A conditional probability $P(E \mid F)$ is the probability that event $E$ occurs given that the event $F$ has occurred:
$$P(E \mid F) = \frac{P(E \cap F)}{P(F)}$$
Here $P(E \cap F)$ represents the joint probability that both $E$ and $F$ occur together, and $P(F)$ is the probability that event $F$ occurs. We rearrange the equation as
$$P(E \cap F) = P(E \mid F)\, P(F)$$
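The defining relation can be checked by simulation; in this sketch the events E and F on a single die roll are illustrative choices:

% Verify P(E|F) = P(E and F) / P(F) by simulation
n = 100000;
r = randi(6, n, 1);          % fair die rolls
E = (r >= 5);                % event E: roll is 5 or 6
F = (mod(r, 2) == 0);        % event F: roll is even
p_joint = sum(E & F) / n;    % estimate of P(E and F): here P{6} = 1/6
p_F = sum(F) / n;            % estimate of P(F) = 1/2
fprintf('P(E|F) ~ %.3f (exact 1/3)\n', p_joint / p_F);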
When the occurrence of one event does not affect whether or not some other event happens, the events are called independent.
If events are independent, then knowing that one event has occurred does not change our degree of belief that the other event occurs.
Two events are said to be independent if and only if any of the following is true:
$$P(E \mid F) = P(E), \qquad P(F \mid E) = P(F), \qquad P(E \cap F) = P(E)\,P(F)$$
This can be extended to $k$ events as
$$P(E_1 E_2 E_3 \cdots E_k) = \prod_{i=1}^{k} P(E_i)$$
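A quick numerical check of the product rule; the two simulated coin flips below are an illustrative setup:

% Independence: P{E and F} = P{E} P{F} for two fair coin flips
n = 100000;
c1 = rand(n, 1) < 0.5;        % first coin:  1 = HEAD
c2 = rand(n, 1) < 0.5;        % second coin: 1 = HEAD
p_joint = sum(c1 & c2) / n;   % ~ 1/4
p_prod  = (sum(c1)/n) * (sum(c2)/n);
% p_joint and p_prod agree within sampling error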
The expectation of a random variable $x$ is the stochastic mean value and is defined as
$$E(x) = \sum_j p_j x_j$$
It is common to write this as $\langle x \rangle$.
Consider a real-valued function $g(x_i) = g_i$, where the $x_i$ are countable elementary events with probability $p_i$. If $x_i$ is a random variable, then $g(x_i)$ is also a random variable. The expected value of $g(x)$ is
$$E(g(x)) = \langle g(x) \rangle = \sum_j p_j\, g(x_j)$$
Example:
A coin is flipped. Assign 1 to the event of getting a head and 0 to the event of getting a tail. Using the two functions $g(x) = 1 + 3x$ and $h(x) = 1 + x$, let us find the expected values.

Events   p_i   x_i   g(x_i)   h(x_i)
TAIL     1/2    0       1        1
HEAD     1/2    1       4        2

$$\langle g(x) \rangle = 5/2, \qquad \langle h(x) \rangle = 3/2$$
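The same expected values can be computed directly from the definition; a minimal MATLAB sketch using the table above:

% Expected values for the coin-flip example
p = [1/2 1/2];           % probabilities of TAIL, HEAD
x = [0 1];               % values assigned to TAIL, HEAD
g = 1 + 3*x;             % g(x) = 1 + 3x -> [1 4]
h = 1 + x;               % h(x) = 1 + x  -> [1 2]
Eg = sum(p .* g);        % <g(x)> = 5/2
Eh = sum(p .* h);        % <h(x)> = 3/2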
From the definition of the expected value of a function:
$$\langle c \rangle = c \quad \text{for any constant } c$$
For two functions $g$ and $h$ with two constants $a$ and $b$:
$$\langle a\,g(x) + b\,h(x) \rangle = a \langle g(x) \rangle + b \langle h(x) \rangle$$
For a linear function $g(x)$:
$$\langle g(x) \rangle = g(\langle x \rangle)$$
Moments & Variance
The nth moment of $x$ is defined as
$$\langle x^n \rangle = \sum_j p_j\, x_j^n$$
The first moment of $x$ is $\langle x \rangle = \sum_j p_j\, x_j$.
The second moment of $x$ is $\langle x^2 \rangle = \sum_j p_j\, x_j^2$.
The Central Moments
The central moments of $x$ are defined as
$$\langle (x - \langle x \rangle)^n \rangle = \sum_i p_i\, (x_i - \langle x \rangle)^n$$
The second central moment has a particular meaning. With $g(x) = (x - \langle x \rangle)^2$:
$$\langle (x - \langle x \rangle)^2 \rangle = \sum_i p_i\, (x_i - \langle x \rangle)^2 = \langle x^2 \rangle - \langle x \rangle^2$$
This is also called the variance of $x$:
$$\mathrm{var}\{x\} = \langle x^2 \rangle - \langle x \rangle^2$$
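A short MATLAB check that the second central moment equals $\langle x^2 \rangle - \langle x \rangle^2$; the distribution (p, x) below is an arbitrary example:

% var{x} computed two equivalent ways for a discrete distribution
p  = [0.2 0.5 0.3];              % probabilities (example values)
x  = [1 2 3];                    % outcomes
mu = sum(p .* x);                % first moment <x>
v1 = sum(p .* (x - mu).^2);      % second central moment
v2 = sum(p .* x.^2) - mu^2;      % <x^2> - <x>^2
% v1 and v2 agree, illustrating var{x} = <x^2> - <x>^2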
The standard deviation of $x$ is $\sigma = \sqrt{\mathrm{var}\{x\}}$.
Variance
For a general function $g(x)$, the variance is
$$\mathrm{var}\{g(x)\} = \sum_j p_j\, [\,g(x_j) - \langle g \rangle\,]^2 = \sum_j p_j\, g(x_j)^2 - \langle g \rangle^2 = \langle g^2 \rangle - \langle g \rangle^2$$
Two random variables $g(x)$ and $h(x)$ are independent if they derive from independent events. When the two are independent,
$$\langle g\,h \rangle = \langle g \rangle \langle h \rangle$$
For independent $g$ and $h$ variables,
$$\mathrm{var}\{a\,g(x) + b\,h(x)\} = a^2\,\mathrm{var}\{g(x)\} + b^2\,\mathrm{var}\{h(x)\}$$
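This relation can be verified numerically; in this sketch g and h are illustrative independent samples and a, b are arbitrary constants:

% For independent g and h: var{a*g + b*h} = a^2 var{g} + b^2 var{h}
n = 100000; a = 2; b = -3;
g = rand(n, 1);               % independent uniform samples
h = randn(n, 1);              % independent normal samples
lhs = var(a*g + b*h);
rhs = a^2 * var(g) + b^2 * var(h);
% lhs and rhs agree within sampling error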
The covariance is a measure of the dependence of two random variables $x$ and $y$:
$$\mathrm{cov}\{x, y\} = \langle (x - \langle x \rangle)(y - \langle y \rangle) \rangle = \langle x\,y \rangle - \langle x \rangle \langle y \rangle$$
It is equal to zero when $x$ and $y$ are independent; zero covariance, however, does not imply independence of the random variables. Also,
$$\mathrm{cov}\{x, x\} = \langle x^2 \rangle - \langle x \rangle^2 = \mathrm{var}\{x\}$$
Another quantity related to the covariance is the correlation coefficient:
$$\rho(x, y) = \frac{\mathrm{cov}\{x, y\}}{\sqrt{\mathrm{var}\{x\}\,\mathrm{var}\{y\}}}$$
Its value lies between -1 and +1. Monte Carlo calculations try to take advantage of negative correlation as a means of reducing the variance.
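A minimal sketch of this variance-reduction idea (antithetic variates); the integrand exp(u) and the sample size are illustrative choices, not from the slides:

% Antithetic variates: negatively correlated pairs reduce variance.
% Estimate I = integral of exp(u) on [0,1] (exact: e - 1).
n = 10000;
u = rand(n, 1);
plain = exp(u);                       % standard MC samples
anti  = (exp(u) + exp(1 - u)) / 2;    % pair u with 1-u (negatively correlated)
fprintf('plain MC:      mean %.4f, var of mean %.2e\n', mean(plain), var(plain)/n);
fprintf('antithetic MC: mean %.4f, var of mean %.2e\n', mean(anti),  var(anti)/n);

Both estimators are unbiased, but the antithetic pairing gives a markedly smaller variance for the same number of random draws.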
Binomial Probability Distribution Function
Consider two events $E_0$ and $E_1$ that are mutually exclusive and exhaustive:
$$P\{E_0\} = 1 - p, \;\; x = 0; \qquad P\{E_1\} = p, \;\; x = 1.$$
The expected values for the real number $x$ and its square are
$$E(x) = p \cdot 1 + (1 - p) \cdot 0 = p, \qquad E(x^2) = p.$$
The variance of $x$ is
$$\mathrm{var}\{x\} = \langle x^2 \rangle - \langle x \rangle^2 = p - p^2 = p(1 - p).$$
Suppose there are $N$ independent samples of these events, each with either a 0 or 1 outcome. Then the probability of $x$ successes out of $N$ is
$$P\{X = x\} = {}^{N}C_{x}\; p^{x} (1 - p)^{N - x}$$
The average or mean of $x$ is $\langle x \rangle = N p$, and the variance of $x$ is $\mathrm{var}\{x\} = N p (1 - p)$.
[Plot: Binomial pdf]
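The binomial pmf and its moments can be evaluated directly in MATLAB; the parameters N and p below are arbitrary examples (Statistics Toolbox users can call binopdf instead of the explicit formula):

% Binomial pmf P{X = x} = NCx p^x (1-p)^(N-x), mean Np, variance Np(1-p)
N = 10; p = 0.3;                     % example parameters
x = 0:N;
pmf = arrayfun(@(k) nchoosek(N, k), x) .* p.^x .* (1-p).^(N-x);
mu = sum(x .* pmf);                  % = N*p = 3
v  = sum(x.^2 .* pmf) - mu^2;        % = N*p*(1-p) = 2.1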
Geometric pdf
Suppose we have to carry out a certain experiment repeatedly and independently, where there are only two outcomes: failure or success. If the outcome is failure, the experiment is repeated; otherwise we stop.
Now let $x$ be the number of experiments until success appears. Then
$$P\{X = x\} = q^{x-1}\, p$$
where $q$ is the probability of failure in one experiment and $p$ is the probability of success in one experiment. The mean of $x$ is
$$\langle x \rangle = \sum_{x=1}^{\infty} x\, q^{x-1} p = \frac{p}{(1-q)^2} = \frac{1}{p}$$
The variance of $x$ is
$$\mathrm{var}\{x\} = \frac{1}{p}\left(\frac{1}{p} - 1\right) = \frac{1-p}{p^2}$$
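A simulation sketch of the geometric distribution, repeating Bernoulli trials until the first success; the success probability p is an arbitrary example:

% Geometric distribution: number of trials until first success
p = 0.25; n = 100000;                % example success probability
counts = zeros(n, 1);
for i = 1:n
    x = 1;
    while rand() >= p                % failure: repeat the experiment
        x = x + 1;
    end
    counts(i) = x;
end
fprintf('mean ~ %.3f (exact 1/p = %.3f)\n', mean(counts), 1/p);
fprintf('var  ~ %.3f (exact (1-p)/p^2 = %.3f)\n', var(counts), (1-p)/p^2);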
The Poisson Distribution
A random variable $x$ is said to follow the Poisson distribution when
$$P\{X = x\} = \frac{\lambda^x\, e^{-\lambda}}{x!}, \qquad x = 0, 1, 2, \ldots$$
where $\lambda$ is a parameter of this distribution. It is easy to find that
$$\langle x \rangle = \lambda, \qquad \mathrm{var}\{x\} = \lambda$$
This distribution is fundamental in the theory of probability and stochastic processes. It is of great use in applications such as radioactive decay, queuing service systems, and similar systems. For a process with rate $\lambda$ observed over a time $t$, the probability of $n$ occurrences is
$$P\{n\} = \frac{(\lambda t)^n\, e^{-\lambda t}}{n!}$$
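A minimal check of the Poisson mean and variance; λ and the truncation point of the sum are illustrative choices (poisspdf from the Statistics Toolbox would give the same pmf):

% Poisson pmf P{X = x} = lambda^x exp(-lambda) / x!
lambda = 4;                              % example parameter
x = 0:30;                                % truncated support (tail is negligible)
pmf = lambda.^x .* exp(-lambda) ./ factorial(x);
mu = sum(x .* pmf);                      % ~ lambda
v  = sum(x.^2 .* pmf) - mu^2;            % ~ lambda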
Continuous Random Variables
Consider the scattering of a photon by an atom. The angle at which the photon is scattered takes continuous values between 0° and 180°.
Given that $x$ is a real continuous random variable that varies from minus infinity to plus infinity, the cumulative distribution is defined as
$$F(x) = P\{\text{a random selection of } X \text{ gives a value less than } x\}$$
The probability density function (pdf) may be defined as
$$f(x) = \frac{dF(x)}{dx} \ge 0$$
[Plot: the cumulative distribution F(x) of a uniform random variable, shown in three regions: I, where pdf = 0; II, where pdf = 1; III, where pdf = 0]
MATLAB Program to Generate Uniform pdf & cdf

% How to generate a uniform distribution
% The domain is generated
x = -1:0.1:11;
% now the pdf and cdf for the x values
pdf = unifpdf(x, 0, 10);
cdf = unifcdf(x, 0, 10);
subplot(1,2,1), plot(x, pdf)
title('pdf')
xlabel('X'), ylabel('f(x)')
axis([-1 11 0 0.2])
axis square
subplot(1,2,2), plot(x, cdf)
title('cdf')
xlabel('X'), ylabel('F(x)')
axis([-1 11 0 1.1])
axis square
Expectations of Continuous Random Variables
The mean value of a continuous random variable in an interval $[a, b]$ is
$$\langle x \rangle = \int_a^b x\, f(x)\, dx$$
where $f(x)$ is the probability density function (pdf) for $x$. The normalization condition is
$$\int_a^b f(x)\, dx = 1$$
The expected value and variance of any function $g(x)$ with this pdf are
$$E(g) = \int_a^b g(x)\, f(x)\, dx, \qquad \mathrm{var}\{g\} = \int_a^b [\,g(x) - E(g)\,]^2 f(x)\, dx$$
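These integrals can be checked numerically for the uniform pdf used in the MATLAB program above; trapz performs the quadrature, and the grid resolution is an arbitrary choice:

% Numerical check of <x> and var{x} for the uniform pdf on [0, 10]
a = 0; b = 10;
x = linspace(a, b, 1001);
f = ones(size(x)) / (b - a);          % uniform pdf, integrates to 1
norm_check = trapz(x, f);             % normalization: = 1
mu = trapz(x, x .* f);                % <x> = (a+b)/2 = 5
v  = trapz(x, (x - mu).^2 .* f);      % var = (b-a)^2/12 ~ 8.333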