Graphical Models - Artificial Intelligence - Lecture Slides. West Bengal University of Animal and Fishery Sciences

Artificial Intelligence

Description: Some concept of Artificial Intelligence are Agents and Problem Solving, Autonomy, Programs, Classical and Modern Planning, First-Order Logic, Resolution Theorem Proving, Search Strategies, Structure Learning. Main points of this lecture are: Graphical Models, Conditional Independence, Bayesian Networks, Acyclic Directed Graph, Vertices, Edges, Markov Condition, Resolving Conflict, Paraconsistent Reasoning, Propagation
Introduction to Graphical Models
Part 2 of 2
Lecture 31 of 41
Graphical Models Overview [1]:
Bayesian Networks
P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)
= P(20s) · P(Female) · P(Low | 20s) · P(Non-Smoker | 20s, Female) · P(No-Cancer | Low, Non-Smoker) · P(Negative | No-Cancer) · P(Negative | No-Cancer)
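The factorization above can be evaluated as a simple product of CPT entries; the numbers below are invented placeholders for illustration (the slides give only the structure, not the values):

```python
# Joint probability via the BBN factorization:
# P(Age, Gender, Toxins, Smoking, Cancer, SerumCa, Tumor)
#   = P(Age) * P(Gender) * P(Toxins | Age) * P(Smoking | Age, Gender)
#     * P(Cancer | Toxins, Smoking) * P(SerumCa | Cancer) * P(Tumor | Cancer)
# All CPT values below are illustrative placeholders, not from the lecture.
p_age_20s        = 0.30  # P(Age = 20s)
p_female         = 0.50  # P(Gender = Female)
p_low_given_20s  = 0.80  # P(Exposure = Low | Age = 20s)
p_nonsmoker      = 0.70  # P(Smoking = Non-Smoker | 20s, Female)
p_nocancer       = 0.95  # P(Cancer = No | Low, Non-Smoker)
p_ca_negative    = 0.90  # P(Serum Calcium = Negative | No-Cancer)
p_tumor_negative = 0.98  # P(Lung Tumor = Negative | No-Cancer)

joint = (p_age_20s * p_female * p_low_given_20s * p_nonsmoker
         * p_nocancer * p_ca_negative * p_tumor_negative)
print(joint)
```

Note that the full joint over seven binary-or-larger variables would need far more entries; the network's factorization reduces it to seven local terms.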
Conditional Independence
X is conditionally independent (CI) of Y given Z (sometimes written X ⊥ Y | Z) iff
P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z
Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., T ⊥ R | L
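The Thunder/Rain/Lightning example can be checked numerically: build a toy joint distribution in which the independence holds by construction (all probabilities below are invented), then verify that P(T | R, L) computed from the joint equals P(T | L):

```python
# Toy joint over (Thunder, Rain, Lightning) in which Thunder and Rain
# are conditionally independent given Lightning, by construction.
# All numbers are illustrative, not from the lecture.
p_L = {True: 0.1, False: 0.9}           # P(Lightning)
p_T_given_L = {True: 0.9, False: 0.05}  # P(Thunder | Lightning)
p_R_given_L = {True: 0.8, False: 0.2}   # P(Rain | Lightning)

def joint(t, r, l):
    """Joint P(T=t, R=r, L=l) built from the factored model."""
    pt = p_T_given_L[l] if t else 1 - p_T_given_L[l]
    pr = p_R_given_L[l] if r else 1 - p_R_given_L[l]
    return p_L[l] * pt * pr

# From the joint, P(T | R, L) should match P(T | L) for every (r, l).
for l in (True, False):
    for r in (True, False):
        p_trl = joint(True, r, l)
        p_rl = joint(True, r, l) + joint(False, r, l)
        p_t_given_rl = p_trl / p_rl
        assert abs(p_t_given_rl - p_T_given_L[l]) < 1e-12
print("CI verified")
```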
Bayesian (Belief) Network
Acyclic directed graph model B = (V, E, Θ) representing CI assertions over a set of random variables
Vertices (nodes) V: denote events (each a random variable)
Edges (arcs, links) E: denote conditional dependencies
Parameters Θ: conditional probability tables (CPTs), one per node
Markov Condition for BBNs (Chain Rule):
P(X1, X2, …, Xn) = Π (i = 1 to n) P(Xi | parents(Xi))
Example BBN
[Figure: example BBN over X1 = Age, X2 = Gender, X3 = Exposure-to-Toxins, X4 = Smoking, X5 = Cancer, X6 = Serum Calcium, X7 = Lung Tumor]
Markov condition: each node is conditionally independent of its non-descendants given its parents
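A minimal sketch of evaluating the chain-rule product over a DAG. The three-node chain A → B → C and its CPT values are hypothetical, chosen only to keep the example small:

```python
# Chain rule for a BBN: P(x1, ..., xn) = prod_i P(xi | parents(xi)).
# Hypothetical DAG: A -> B -> C, binary variables.
parents = {"A": [], "B": ["A"], "C": ["B"]}

# CPTs: for each node, map a tuple of parent values to P(node = True).
cpt = {
    "A": {(): 0.2},
    "B": {(True,): 0.7, (False,): 0.1},
    "C": {(True,): 0.9, (False,): 0.3},
}

def prob(node, value, parent_vals):
    """P(node = value | parents = parent_vals) from the CPT."""
    p_true = cpt[node][tuple(parent_vals)]
    return p_true if value else 1 - p_true

def joint(assignment):
    """Chain-rule product, visiting nodes in a topological order."""
    p = 1.0
    for node in ("A", "B", "C"):  # topological order of the DAG
        pv = [assignment[u] for u in parents[node]]
        p *= prob(node, assignment[node], pv)
    return p

print(joint({"A": True, "B": True, "C": False}))
```

The topological ordering guarantees that every parent value is already fixed when its child's CPT is consulted, which is exactly what the Markov condition licenses.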
Methods for combining multiple beliefs
Theory is more precise than that underlying fuzzy or ANN inference
Data and sensor fusion
Resolving conflict (vote-taking, winner-take-all, mixture estimation)
Paraconsistent reasoning
Modeling process of evidential reasoning by updating beliefs
Source of parallelism
Natural object-oriented (message-passing) model
Communication: asynchronous dynamic workpool management problem
Concurrency: known Petri net dualities
Learning graphical dependencies from scores, constraints
Two learning problems: structure learning (the graph) and parameter estimation (belief revision)
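The three conflict-resolution schemes named above (vote-taking, winner-take-all, mixture estimation) can be sketched for posteriors reported by multiple sources; the belief values and reliability weights below are invented for illustration:

```python
# Combining beliefs from multiple sources about one hypothesis.
# All numbers are illustrative placeholders.
beliefs = [0.9, 0.4, 0.7]   # P(hypothesis) reported by three sources
weights = [0.5, 0.2, 0.3]   # assumed source reliabilities (sum to 1)

# Vote-taking: majority of thresholded beliefs.
vote = sum(b > 0.5 for b in beliefs) > len(beliefs) / 2

# Winner-take-all: adopt the belief of the most confident source
# (the one farthest from the uninformative value 0.5).
winner = max(beliefs, key=lambda b: abs(b - 0.5))

# Mixture estimation: reliability-weighted average of the beliefs.
mixture = sum(w * b for w, b in zip(weights, beliefs))

print(vote, winner, mixture)
```

Vote-taking discards confidence information, winner-take-all discards all but one source, and the mixture keeps both, which is why mixture estimation is the usual choice when reliabilities can be estimated.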
Fusion, Propagation, and Structuring
Bayesian Learning
Framework: Interpretations of Probability [Cheeseman, 1985]
Bayesian subjectivist view
A measure of an agent’s belief in a proposition
Proposition denoted by random variable (sample space: range)
e.g., Pr(Outlook = Sunny) = 0.8
Frequentist view: probability is the frequency of observations of an event
Logicist view: probability is inferential evidence in favor of a proposition
Typical Applications
HCI: learning natural language; intelligent displays; decision support
Approaches: prediction; sensor and data fusion (e.g., bioinformatics)
Prediction: Examples
Measure relevant parameters: temperature, barometric pressure, wind speed
Make statement of the form Pr(Tomorrow’s-Weather = Rain) = 0.5
College admissions: Pr(Acceptance) = p
Plain beliefs: unconditional acceptance (p = 1) or categorical rejection (p = 0)
Conditional beliefs: depends on reviewer (use probabilistic model)
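The step from a plain belief to a conditional one is a Bayes update. In this sketch, the prior p and the reviewer likelihoods are invented numbers, used only to show the mechanics:

```python
# From a plain belief Pr(Acceptance) = p to a conditional belief
# Pr(Acceptance | reviewer evidence), via Bayes' rule.
# All numbers below are hypothetical.
p_accept = 0.3                    # prior Pr(Acceptance) = p
p_pos_given_accept = 0.8          # Pr(positive review | accepted)
p_pos_given_reject = 0.2          # Pr(positive review | rejected)

# Bayes' rule: Pr(Accept | positive review)
num = p_pos_given_accept * p_accept
den = num + p_pos_given_reject * (1 - p_accept)
posterior = num / den
print(round(posterior, 4))
```

Note that p = 1 (unconditional acceptance) and p = 0 (categorical rejection) are fixed points of this update: no evidence can move a degenerate prior.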