Introduction to Graphical Models

Part 2 of 2

Lecture 31 of 41

Docsity.com

Graphical Models Overview [1]:

Bayesian Networks

P(20s, Female, Low, Non-Smoker, No-Cancer, Negative, Negative)

= P(20s) · P(Female) · P(Low | 20s) · P(Non-Smoker | 20s, Female) · P(No-Cancer | Low, Non-Smoker) · P(Negative | No-Cancer) · P(Negative | No-Cancer)

•Conditional Independence

–X is conditionally independent (CI) of Y given Z (sometimes written X ⊥ Y | Z) iff

P(X | Y, Z) = P(X | Z) for all values of X, Y, and Z

–Example: P(Thunder | Rain, Lightning) = P(Thunder | Lightning), i.e., T ⊥ R | L
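The Thunder/Rain/Lightning example can be checked numerically. The sketch below builds a joint distribution that factors as P(L) · P(R | L) · P(T | L) (all probability values are made up for illustration), then verifies that P(T | R, L) = P(T | L) for every value combination:

```python
from itertools import product

# Illustrative CPTs (made-up numbers) chosen so the joint factors as
# P(L) * P(R|L) * P(T|L), which forces Thunder ⊥ Rain | Lightning.
p_L = {True: 0.3, False: 0.7}           # P(Lightning)
p_R_given_L = {True: 0.8, False: 0.2}   # P(Rain = True | Lightning)
p_T_given_L = {True: 0.9, False: 0.05}  # P(Thunder = True | Lightning)

def joint(t, r, l):
    """P(Thunder=t, Rain=r, Lightning=l) via the factorization."""
    pr = p_R_given_L[l] if r else 1 - p_R_given_L[l]
    pt = p_T_given_L[l] if t else 1 - p_T_given_L[l]
    return p_L[l] * pr * pt

def cond_prob_T(t, r, l):
    """P(Thunder=t | Rain=r, Lightning=l) from the joint table."""
    den = sum(joint(t2, r, l) for t2 in (True, False))
    return joint(t, r, l) / den

def cond_prob_T_given_L(t, l):
    """P(Thunder=t | Lightning=l), marginalizing out Rain."""
    num = sum(joint(t, r, l) for r in (True, False))
    den = sum(joint(t2, r, l) for t2 in (True, False) for r in (True, False))
    return num / den

# Verify the CI definition: P(T | R, L) = P(T | L) for all values.
for t, r, l in product((True, False), repeat=3):
    assert abs(cond_prob_T(t, r, l) - cond_prob_T_given_L(t, l)) < 1e-12
print("T is conditionally independent of R given L")
```

The converse also holds: if the equality failed for any value assignment, T and R would not be conditionally independent given L.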

•Bayesian (Belief) Network

–Acyclic directed graph model B = (V, E, Θ) representing CI assertions over a set of random variables (Θ: the conditional probability tables)

–Vertices (nodes) V: denote events (each a random variable)

–Edges (arcs, links) E: denote conditional dependencies

•Markov Condition for BBNs (Chain Rule):

•Example BBN

P(X1, X2, …, Xn) = ∏_{i=1}^{n} P(Xi | parents(Xi))

[Figure: example BBN with nodes X1 = Age, X2 = Gender, X3 = Exposure-To-Toxins, X4 = Smoking, X5 = Cancer, X6 = Serum Calcium, X7 = Lung Tumor; the diagram also marks the Parents, Descendants, and Non-Descendants of a node.]
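Putting the chain rule to work on the cancer-network example: each factor is one variable's probability given only its parents, and the joint is their product. The CPT entries below are illustrative numbers, not values from the lecture:

```python
# Illustrative CPT entries (made-up numbers, not from the lecture).
P_age_20s = 0.25                  # P(Age = 20s)
P_female = 0.5                    # P(Gender = Female)
P_lowexp_given_20s = 0.9          # P(Exposure = Low | Age = 20s)
P_nonsmoker_given_20s_F = 0.8     # P(Smoking = Non | Age = 20s, Gender = F)
P_nocancer_given_low_non = 0.99   # P(Cancer = No | Exposure = Low, Smoking = Non)
P_serum_neg_given_nocancer = 0.95 # P(SerumCalcium = Neg | Cancer = No)
P_tumor_neg_given_nocancer = 0.98 # P(LungTumor = Neg | Cancer = No)

# Markov condition + chain rule: the joint is the product of each
# variable's probability conditioned only on its parents.
joint = (P_age_20s * P_female
         * P_lowexp_given_20s
         * P_nonsmoker_given_20s_F
         * P_nocancer_given_low_non
         * P_serum_neg_given_nocancer
         * P_tumor_neg_given_nocancer)
print(f"P(20s, Female, Low, Non-Smoker, No-Cancer, Neg, Neg) = {joint:.4f}")
```

Only seven small CPT entries are needed, versus the 2^7 = 128 entries a full joint table over seven binary variables would require; this compactness is the practical payoff of the CI assertions encoded by the graph.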


•Fusion

–Methods for combining multiple beliefs

–Theory is more precise than for fuzzy or ANN inference

–Data and sensor fusion

–Resolving conflict (vote-taking, winner-take-all, mixture estimation)

–Paraconsistent reasoning

•Propagation

–Modeling process of evidential reasoning by updating beliefs

–Source of parallelism

–Natural object-oriented (message-passing) model

–Communication: asynchronous; a dynamic workpool management problem

–Concurrency: known Petri net dualities
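The message-passing view of propagation can be sketched on the simplest possible structure, a chain X1 → X2 → X3 of binary variables: each node forwards its belief to the next, and the marginal of X3 falls out of two local computations. All CPT numbers are illustrative; a brute-force enumeration of the joint confirms the result:

```python
import itertools

# Illustrative CPTs (made-up numbers) for a chain X1 -> X2 -> X3.
p_x1 = [0.6, 0.4]                         # P(X1)
p_x2_given_x1 = [[0.7, 0.3], [0.2, 0.8]]  # P(X2 | X1): rows index X1
p_x3_given_x2 = [[0.9, 0.1], [0.5, 0.5]]  # P(X3 | X2): rows index X2

def forward(prior, cpt):
    """Pass a belief along one edge: message[j] = sum_i prior[i] * cpt[i][j]."""
    return [sum(prior[i] * cpt[i][j] for i in range(2)) for j in range(2)]

msg_x2 = forward(p_x1, p_x2_given_x1)    # P(X2), computed locally at X2
msg_x3 = forward(msg_x2, p_x3_given_x2)  # P(X3), computed locally at X3

# Brute-force check: enumerate the full joint and marginalize out X1, X2.
brute = [0.0, 0.0]
for x1, x2, x3 in itertools.product((0, 1), repeat=3):
    brute[x3] += p_x1[x1] * p_x2_given_x1[x1][x2] * p_x3_given_x2[x2][x3]

assert all(abs(a - b) < 1e-12 for a, b in zip(msg_x3, brute))
print("P(X3) =", msg_x3)
```

Each `forward` step touches only one node's CPT and its incoming message, which is the source of the parallelism and the natural object-oriented (message-passing) model noted above.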

•Structuring

–Learning graphical dependencies from scores, constraints

–Two learning problems: structure learning and belief revision (parameter estimation)

Fusion, Propagation, and Structuring


Bayesian Learning

•Framework: Interpretations of Probability [Cheeseman, 1985]

–Bayesian subjectivist view

•A measure of an agent’s belief in a proposition

•Proposition denoted by random variable (sample space: range)

•e.g., Pr(Outlook = Sunny) = 0.8

–Frequentist view: probability is the frequency of observations of an event

–Logicist view: probability is inferential evidence in favor of a proposition

•Typical Applications

–HCI: learning natural language; intelligent displays; decision support

–Approaches: prediction; sensor and data fusion (e.g., bioinformatics)

•Prediction: Examples

–Measure relevant parameters: temperature, barometric pressure, wind speed

–Make statement of the form Pr(Tomorrow’s-Weather = Rain) = 0.5

–College admissions: Pr(Acceptance) = p

•Plain beliefs: unconditional acceptance (p = 1) or categorical rejection (p = 0)

•Conditional beliefs: depends on reviewer (use probabilistic model)
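The conditional-belief case above can be sketched by marginalizing over which reviewer handles the application; the reviewer mix and per-reviewer acceptance rates below are made-up numbers for illustration:

```python
# Illustrative probabilistic model of admissions (made-up numbers):
# the acceptance belief is conditional on the reviewer, and the
# overall Pr(Acceptance) = p comes from marginalizing the reviewer out.
p_reviewer = {"lenient": 0.4, "strict": 0.6}      # P(Reviewer)
p_accept_given = {"lenient": 0.7, "strict": 0.2}  # P(Acceptance | Reviewer)

p_accept = sum(p_reviewer[r] * p_accept_given[r] for r in p_reviewer)
print(f"Pr(Acceptance) = {p_accept:.2f}")  # 0.4*0.7 + 0.6*0.2 = 0.40
```

A plain belief would fix p at 1 or 0 regardless of reviewer; the probabilistic model keeps p between those extremes and updates it as evidence about the reviewer arrives.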


##### Document information

Uploaded by: shantii
University: West Bengal University of Animal and Fishery Sciences
Subject: Artificial Intelligence
Upload date: 29/04/2013