Final Exam Review, Summaries of Machine Learning


Final Exam Review

10-601 Introduction to Machine Learning

Matt Gormley
Lecture 31
May 2, 2018

Machine Learning Department
School of Computer Science
Carnegie Mellon University

Reminders

  • Homework 9: Learning Paradigms
    • Out: Fri, Nov 30
    • Due: Fri, Dec 7 at 11:59pm

Outline

  1. Exam Logistics
  2. Sample Questions
  3. Overview

EXAM LOGISTICS

Final Exam

  • Time / Location
    • Time: Evening Exam, Thu, Dec 13, 1:00pm – 4:00pm
    • Room: We will contact each student individually with your room assignment. The rooms are not based on section.
    • Seats: There will be assigned seats. Please arrive early.
    • Please watch Piazza carefully for announcements regarding room / seat assignments.
  • Logistics
    • Format of questions:
      • Multiple choice
      • True / False (with justification)
      • Derivations
      • Short answers
      • Interpreting figures
      • Implementing algorithms on paper
    • No electronic devices
    • You are allowed to bring one 8½ x 11 sheet of notes (front and back)

Final Exam

  • How to Prepare
    • Attend (or watch) this final exam review session
    • Review prior years’ exams and solutions
      • We already posted these for the midterm
      • Disclaimer: This year’s 10-601 is not the same as prior offerings, so review both the midterm and the final
    • Review this year’s homework problems
    • Consider whether you have achieved the “learning objectives” for each lecture / section
    • Attend the Final Exam Recitation (Friday)

Final Exam

  • Advice (for during the exam)
    • Solve the easy problems first (e.g. multiple choice before derivations)
    • If a problem seems extremely complicated, you’re likely missing something
    • Don’t leave any answer blank!
    • If you make an assumption, write it down
    • If you look at a question and don’t know the answer:
      • we probably haven’t told you the answer
      • but we’ve told you enough to work it out
      • imagine arguing for some answer and see if you like it

Final Exam

  • Exam Contents
    • ~20% of material comes from topics covered before the midterm exam
    • ~80% of material comes from topics covered after the midterm exam

Topics covered before Midterm

  • Foundations
    • Probability, Linear Algebra, Geometry, Calculus
    • MLE
    • Optimization
  • Important Concepts
    • Regularization and Overfitting
    • Experimental Design
  • Classifiers
    • Decision Tree
    • KNN
    • Perceptron
    • Logistic Regression
  • Regression
    • Linear Regression
  • Feature Learning
    • Neural Networks
    • Basic NN Architectures
    • Backpropagation
  • Learning Theory
    • PAC Learning

Topics covered after Midterm

  • Learning Theory
    • PAC Learning
  • Generative Models
    • Generative vs. Discriminative
    • MLE / MAP
    • Naïve Bayes
  • Graphical Models
    • HMMs
    • Learning and Inference
    • Bayesian Networks
  • Reinforcement Learning
    • Value Iteration
    • Policy Iteration
    • Q-Learning
    • Deep Q-Learning
  • Unsupervised Learning
    • K-Means
    • PCA
  • Other Learning Paradigms
    • SVM (large-margin)
    • Kernels
    • Ensemble Methods / AdaBoost
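Of the post-midterm topics above, value iteration is compact enough to sketch in full. The following is a generic textbook implementation run on a small hypothetical tabular MDP — it is not drawn from the course materials, and the MDP itself is invented for illustration:

```python
import numpy as np

def value_iteration(P, R, gamma=0.9, tol=1e-8):
    """Bellman optimality update until convergence:
    V(s) <- max_a sum_s' P[a,s,s'] * (R[a,s,s'] + gamma * V(s'))."""
    V = np.zeros(P.shape[1])
    while True:
        Q = (P * (R + gamma * V)).sum(axis=2)   # Q[a, s], shape (A, S)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new, Q.argmax(axis=0)      # optimal values, greedy policy
        V = V_new

# Hypothetical 2-state, 2-action MDP: action 1 moves state 0 to the
# absorbing state 1 for reward 1; every other transition pays 0.
P = np.zeros((2, 2, 2))
P[0, 0, 0] = P[0, 1, 1] = 1.0   # action 0: stay put
P[1, 0, 1] = P[1, 1, 1] = 1.0   # action 1: go to state 1
R = np.zeros((2, 2, 2))
R[1, 0, 1] = 1.0
V, pi = value_iteration(P, R)   # V[0] -> 1.0, greedy policy takes action 1 in state 0
```

Policy iteration alternates the same Bellman backup with an explicit policy-evaluation step; Q-learning estimates the same Q[a, s] table from sampled transitions instead of a known model.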

SAMPLE QUESTIONS

Material Covered Before Midterm Exam

Matching Game

Goal: Match the Algorithm to its Update Rule

  1. SGD for Logistic Regression
  2. Least Mean Squares
  3. Perceptron

[Update rules 4–6 appear as equations in the original slide and did not survive extraction.]

  A. 1=5, 2=4, 3=
  B. 1=5, 2=6, 3=
  C. 1=6, 2=4, 3=
  D. 1=5, 2=6, 3=
  E. 1=6, 2=6, 3=
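The slide's update-rule equations were lost in extraction, but the standard textbook forms of the three updates can be sketched as follows (a sketch of the usual conventions, not the slide's numbering; it assumes labels y ∈ {0, 1} for logistic regression and y ∈ {−1, +1} for the perceptron):

```python
import numpy as np

def sgd_logistic_update(w, x, y, lr=0.1):
    """SGD step for logistic regression, y in {0, 1}:
    w <- w + lr * (y - sigmoid(w.x)) * x."""
    p = 1.0 / (1.0 + np.exp(-w @ x))
    return w + lr * (y - p) * x

def lms_update(w, x, y, lr=0.1):
    """Least Mean Squares (Widrow–Hoff) step:
    w <- w + lr * (y - w.x) * x."""
    return w + lr * (y - w @ x) * x

def perceptron_update(w, x, y):
    """Perceptron step, y in {-1, +1}: update only on a mistake."""
    if y * (w @ x) <= 0:
        w = w + y * x
    return w
```

Note the family resemblance: all three are "error times input" updates; they differ only in how the prediction (probability, linear output, or sign) enters the error term.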

Sample Questions

Sample Questions Dataset

Topographical Maps

Robotic Farming

                                   Deterministic                                   Probabilistic
  Classification (binary output)   Is this a picture of a wheat kernel?            Is this plant drought resistant?
  Regression (continuous output)   How many wheat kernels are in this picture?     What will the yield of this plant be?

Multinomial Logistic Regression

[Figure: decision regions for three classes — polar bears, sea lions, sharks.]
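Multinomial logistic regression models p(y = k | x) ∝ exp(θₖ · x), normalizing over the classes with a softmax. A minimal sketch for a three-class problem like the slide's (the weight values and 2-D feature vector here are invented for illustration):

```python
import numpy as np

def softmax_probs(Theta, x):
    """p(y=k | x) proportional to exp(theta_k . x); rows of Theta are per-class weights."""
    scores = Theta @ x
    scores -= scores.max()        # subtract max for numerical stability
    e = np.exp(scores)
    return e / e.sum()

# Hypothetical weights for three classes (polar bear, sea lion, shark)
Theta = np.array([[ 2.0, -1.0],
                  [ 0.5,  0.5],
                  [-1.0,  2.0]])
x = np.array([1.0, 0.0])
p = softmax_probs(Theta, x)       # probabilities over the three classes, summing to 1
```

The predicted class is simply the argmax of `p`; with binary classes this reduces to ordinary logistic regression.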

Handcrafted Features

[Figure: parse of “Egypt-born Proyas directed” annotated with POS tags (NNP, VBN, VBD), named-entity labels (LOC, PER), phrase structure (S, NP, VP, ADJP), and the relation label “born-in”.]

p(y|x) ∝ exp(Θ_y · f(x))

Example: Linear Regression

Goal: Learn y = wᵀ f(x) + b, where f(·) is a polynomial basis function.
The true “unknown” target function is linear with negative slope and Gaussian noise.
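The setup above can be sketched end-to-end: draw data from a linear target with negative slope plus Gaussian noise, build polynomial basis features, and fit the weights by least squares. This is a sketch under assumed settings — the degree, noise level, and data range are not from the slide:

```python
import numpy as np

rng = np.random.default_rng(0)

# True "unknown" target: linear with negative slope, plus Gaussian noise
x = np.linspace(-1, 1, 50)
y = -2.0 * x + 1.0 + 0.1 * rng.normal(size=x.shape)

# Polynomial basis f(x) = (1, x, x^2, x^3); the x^0 column plays the role of b
degree = 3
F = np.column_stack([x**d for d in range(degree + 1)])

# Least-squares fit of y ~ F w
w, *_ = np.linalg.lstsq(F, y, rcond=None)
```

Because the basis subsumes the true linear function, the fit recovers it up to noise; raising the degree further is exactly where overfitting (and the case for regularization) enters.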