Graphical Models - Artificial Intelligence - Lecture Slides, Slides for Artificial Intelligence. West Bengal University of Animal and Fishery Sciences

Artificial Intelligence

Description: Some concepts of Artificial Intelligence covered in this course are Agents and Problem Solving, Autonomy, Programs, Classical and Modern Planning, First-Order Logic, Resolution Theorem Proving, Search Strategies, and Structure Learning. Main points of this lecture are: Graphical Models, Conditional Independence, Bayesian Networks, Acyclic Directed Graph, Vertices, Edges, Markov Condition, Resolving Conflict, Paraconsistent Reasoning, Propagation
Introduction to Graphical Models
Part 2 of 2
Lecture 31 of 41
Graphical Models Overview [1]:
Bayesian Networks
P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)
= P(20s) · P(Female) · P(Low | 20s) · P(Non-Smoker | 20s, Female) · P(No-Cancer | Low, Non-Smoker) · P(Negative | No-Cancer) · P(Negative | No-Cancer)
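The factored joint above can be evaluated by simply multiplying the relevant CPT entries. A minimal sketch in Python — the CPT numbers below are invented for illustration, since the slides do not give them:

```python
# Hypothetical CPT entries for the cancer-network example (all numbers made up).
p_age_20s      = 0.3    # P(Age = 20s)
p_female       = 0.5    # P(Gender = Female)
p_low_exposure = 0.9    # P(Exposure = Low | Age = 20s)
p_non_smoker   = 0.8    # P(Smoking = Non-Smoker | Age = 20s, Gender = Female)
p_no_cancer    = 0.99   # P(Cancer = No | Exposure = Low, Smoking = Non-Smoker)
p_neg_calcium  = 0.95   # P(Serum Calcium = Negative | Cancer = No)
p_neg_tumor    = 0.98   # P(Lung Tumor = Negative | Cancer = No)

# Chain-rule product: one factor per node, conditioned only on its parents.
joint = (p_age_20s * p_female * p_low_exposure * p_non_smoker
         * p_no_cancer * p_neg_calcium * p_neg_tumor)
print(round(joint, 6))
```

The point is that seven small conditional tables suffice, instead of one table over all 2^7 joint assignments.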
Conditional Independence
X is conditionally independent (CI) of Y given Z (sometimes written X ⫫ Y | Z) iff
P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., T ⫫ R | L
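The Thunder example can be checked numerically. A sketch, assuming a toy joint built so that Thunder depends only on Lightning (all probabilities are invented):

```python
from itertools import product

# Toy distribution over (Thunder, Rain, Lightning): the joint factors as
# P(L) * P(R | L) * P(T | L), so Thunder ⫫ Rain | Lightning by construction.
p_l = {True: 0.1, False: 0.9}            # P(Lightning)
p_r_given_l = {True: 0.8, False: 0.2}    # P(Rain | Lightning)
p_t_given_l = {True: 0.9, False: 0.05}   # P(Thunder | Lightning)

def joint(t, r, l):
    """P(Thunder=t, Rain=r, Lightning=l)."""
    pr = p_r_given_l[l] if r else 1 - p_r_given_l[l]
    pt = p_t_given_l[l] if t else 1 - p_t_given_l[l]
    return p_l[l] * pr * pt

def p_t_given_rl(t, r, l):               # P(Thunder | Rain, Lightning)
    return joint(t, r, l) / sum(joint(t2, r, l) for t2 in (True, False))

def p_t_given_l_only(t, l):              # P(Thunder | Lightning)
    num = sum(joint(t, r, l) for r in (True, False))
    den = sum(joint(t2, r2, l)
              for t2 in (True, False) for r2 in (True, False))
    return num / den

# CI holds: once Lightning is known, conditioning on Rain changes nothing.
for t, r, l in product((True, False), repeat=3):
    assert abs(p_t_given_rl(t, r, l) - p_t_given_l_only(t, l)) < 1e-12
```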
Bayesian (Belief) Network
Acyclic directed graph model B = (V, E, Θ) representing CI assertions over a set of random variables
Vertices (nodes) V: denote events (each a random variable)
Edges (arcs, links) E: denote conditional dependencies
Markov Condition for BBNs (Chain Rule):
P(X1, X2, …, Xn) = ∏ i=1..n P(Xi | parents(Xi))
Example BBN
[Figure: example BBN over seven variables X1–X7, whose nodes include Gender, Cancer, Serum Calcium (X6), and Lung Tumor (X7); by the Markov condition, each node is conditionally independent of its non-descendants given its parents.]
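The Markov condition turns the whole joint into a product of per-node factors. A sketch of that chain-rule evaluation, assuming the standard seven-node cancer-network structure for the example BBN (the parent sets and all CPT numbers below are assumptions for illustration):

```python
from itertools import product

# Assumed structure of the example BBN: node -> tuple of parent nodes.
parents = {
    "Age": (), "Gender": (),
    "Exposure": ("Age",),
    "Smoking": ("Age", "Gender"),
    "Cancer": ("Exposure", "Smoking"),
    "SerumCalcium": ("Cancer",),
    "LungTumor": ("Cancer",),
}

# P(node = True | parent values); keys are tuples of parent values (invented).
cpt = {
    "Age": {(): 0.3},
    "Gender": {(): 0.5},
    "Exposure": {(True,): 0.1, (False,): 0.4},
    "Smoking": {(True, True): 0.2, (True, False): 0.3,
                (False, True): 0.25, (False, False): 0.35},
    "Cancer": {(True, True): 0.5, (True, False): 0.2,
               (False, True): 0.3, (False, False): 0.01},
    "SerumCalcium": {(True,): 0.8, (False,): 0.05},
    "LungTumor": {(True,): 0.7, (False,): 0.02},
}

def joint(assignment):
    """Chain rule: P(X1, ..., Xn) = prod_i P(Xi | parents(Xi))."""
    p = 1.0
    for node, pars in parents.items():
        p_true = cpt[node][tuple(assignment[q] for q in pars)]
        p *= p_true if assignment[node] else 1.0 - p_true
    return p

# Sanity check: a valid factorization sums to 1 over all 2^7 assignments.
names = list(parents)
total = sum(joint(dict(zip(names, vals)))
            for vals in product((True, False), repeat=len(names)))
print(round(total, 10))
```

Note the savings: the tables above hold 1 + 1 + 2 + 4 + 4 + 2 + 2 = 16 parameters, versus 2^7 − 1 = 127 for an unfactored joint over seven Boolean variables.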