Modern Statistical Methods for Machine Learning and Data-Driven Research, Assignments of Statistics

This master's and Ph.D. level course, taught by Dr. Maggie Myers, covers advanced statistical modeling tools for researchers in machine learning, data mining, computational biology, engineering, psychology, and other fields. Topics include probability and Bayes' rule, random variables and vectors, Bayesian networks, expectation, variance, and the central limit theorem, estimation and distributions of estimators, Bayesian modeling, maximum likelihood estimation, Markov models, simulations using bootstrapping, Markov chain Monte Carlo methods, and Gibbs sampling.


SSC 385 Modern Statistical Methods

Instructor: Dr. Maggie Myers

Intended audience: Master's and Ph.D. students in machine learning, data mining, computational biology, engineering, psychology, and other fields in need of advanced modeling tools in their research.

Methods of evaluation: Homework, participation, midterm, and projects. Assignments will be made throughout the semester and will generally be due a week after they are assigned. A final project involves giving a brief presentation of some subtopic of interest.

Optional text: DeGroot, Morris H. and Mark J. Schervish, Probability and Statistics, 3rd ed., Addison Wesley, 2002.

This course is intended to illustrate current approaches in modeling and computation in statistics.

Background:
- Probability and Bayes' Rule
- Random Variables and Vectors: Discrete and Continuous
- Using Bayesian Networks
- Expectation, Variance, and the Central Limit Theorem
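Several of these background topics are easy to illustrate numerically. The sketch below is a minimal R illustration using made-up numbers (a hypothetical diagnostic test and simulated exponential data), not material from the course:

# Bayes' rule: P(disease | positive test) with made-up rates
prior <- 0.01                                 # P(disease)
sens  <- 0.95                                 # P(positive | disease)
fpr   <- 0.05                                 # P(positive | no disease)
sens * prior / (sens * prior + fpr * (1 - prior))   # about 0.16

# Central limit theorem: means of skewed samples look roughly normal
set.seed(1)
sample_means <- replicate(5000, mean(rexp(30, rate = 1)))
hist(sample_means, breaks = 40, main = "Means of 30 Exp(1) draws")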
Inference:
- Estimation and Distributions of Estimators
- Bayesian Modeling, Prior and Posterior Distributions
- MLE/MAP/EM Parameter Estimation
- Testing Hypotheses
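As a rough preview of the estimation and testing topics, the following R sketch uses simulated data and an arbitrary Beta(2, 2) prior; the numbers are illustrative only:

# Maximum likelihood for a normal sample, fitting the mean and log-sd numerically
set.seed(2)
x <- rnorm(100, mean = 5, sd = 2)
negloglik <- function(par) -sum(dnorm(x, mean = par[1], sd = exp(par[2]), log = TRUE))
fit <- optim(c(0, 0), negloglik)
c(mean = fit$par[1], sd = exp(fit$par[2]))    # close to 5 and 2

# A simple hypothesis test on the same sample
t.test(x, mu = 5)

# Conjugate Bayesian update for a coin: Beta(2, 2) prior, 7 heads in 10 flips
a_post <- 2 + 7
b_post <- 2 + (10 - 7)
a_post / (a_post + b_post)                    # posterior mean, 9/14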
Models:
- Linear Statistical Models
- Mixture Models
- Markov Models / Random Walks
- Hidden Markov Models
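A couple of short base-R snippets, on simulated data chosen only for illustration, give the flavor of the linear and Markov model topics above:

# Linear statistical model fit to simulated data
set.seed(3)
x <- runif(50, 0, 10)
y <- 1.5 + 2 * x + rnorm(50)
coef(lm(y ~ x))                               # intercept and slope near 1.5 and 2

# Two-state Markov chain / random walk driven by a transition matrix
P <- matrix(c(0.9, 0.1,
              0.2, 0.8), nrow = 2, byrow = TRUE)
state <- integer(1000)
state[1] <- 1
for (t in 2:1000) state[t] <- sample(1:2, 1, prob = P[state[t - 1], ])
table(state) / 1000                           # long-run frequencies near (2/3, 1/3)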

Simulations:
- Bootstrapping
- Markov Chain Monte Carlo Methods
- Randomized Sampling Algorithms (Monte Carlo)
- Metropolis-Hastings Algorithms
- Gibbs Sampling
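To hint at how these simulation methods look in code, here is a minimal base-R sketch using toy targets and simulated data (not course data sets): a percentile bootstrap, a random-walk Metropolis-Hastings sampler, and a two-variable Gibbs sampler.

set.seed(4)

# Bootstrapping: resample with replacement to approximate the sampling distribution of the median
x <- rexp(60, rate = 0.5)
boot_medians <- replicate(2000, median(sample(x, replace = TRUE)))
quantile(boot_medians, c(0.025, 0.975))       # percentile bootstrap interval

# Metropolis-Hastings: random-walk sampler with a standard normal target
target <- function(z) dnorm(z)
chain <- numeric(5000)
for (t in 2:5000) {
  prop <- chain[t - 1] + rnorm(1)             # symmetric proposal
  ratio <- target(prop) / target(chain[t - 1])
  chain[t] <- if (runif(1) < ratio) prop else chain[t - 1]
}
c(mean = mean(chain), sd = sd(chain))         # near 0 and 1

# Gibbs sampling: full conditionals of a standard bivariate normal with correlation 0.8
rho <- 0.8
gx <- gy <- numeric(5000)
for (t in 2:5000) {
  gx[t] <- rnorm(1, mean = rho * gy[t - 1], sd = sqrt(1 - rho^2))
  gy[t] <- rnorm(1, mean = rho * gx[t],     sd = sqrt(1 - rho^2))
}
cor(gx, gy)                                   # close to rho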
 These concepts will be illustrated using applications drawn from machine learning, computational biology, and other interests of the students enrolled. Students will apply techniques to data sets using R and other appropriate packages.