- 4/26: Final Exam: May 22, 11:15-1:45, in Physics P118
- 4/26: Final review on May 14
- 4/26: May 7 poster session: boards, tacks, stands will be provided
- 4/26: On May 2, Prof. Alex Berg will give a guest lecture!
- 4/13: HW5 is due May 7
- 3/12: HW3 is out! (on BB, due Apr 4)
- 3/2: HW1 solutions uploaded on BB.
- 2/26: HW2 is out! (on BB, due Mar 12)
- 2/26: Midterm exam will take place on Mar 14, 5:30-6:50pm, in class.
- 2/22: Project ideas are out! (Project proposal due 2/28)
- 2/9: Recitation slides and code uploaded on BB.
- 2/3: HW1 is out! (on BB, due Feb 26)
- 2/1: Wei will teach a recitation on Tue 2/5 during class time on Prob & Matlab (no class on Thu).
- 2/1: The (annotated) lecture notes are uploaded to BB *after* each class.
- 1/29: First lecture notes uploaded on BB under Documents.
- 1/19: Course timeline is out, see assignments.
- 1/2: Syllabus is now up!
- 12/28: If you are on the waiting list, email the instructor; you will be allowed to enroll if there is space and you meet the prerequisites.
Welcome to the class! Hope you will enjoy it :)
Instructor: Leman Akoglu
- Office: 1425 Computer Science
- Office hours: Tue 12 noon - 1:30 p.m.
- Email: invert (cs.stonybrook.edu @ leman)
Teaching Assistant: Wei Liu
- Office: 2110 Computer Science (2207 for late submissions)
- Office hours: Thu 2:30 p.m. - 4:00 p.m.
- Email: invert (cs.stonybrook.edu @ weiliu2)
Tue & Thu 5:30PM-6:50PM
PSYCHOLOGY A 137
We are drowning in information and starving for knowledge. — John Naisbitt
Machine learning centers on automated methods that improve their own performance by learning patterns in data, and then use the uncovered patterns to predict the future and make decisions.
Examples include document/image/handwriting classification, spam filtering, face/speech recognition, medical decision making, and robot navigation.
This course covers the theory and practical algorithms for machine learning from a variety of perspectives.
The topics include Bayesian networks, decision tree learning, Support Vector Machines, statistical learning methods and unsupervised learning, as well as
theoretical concepts such as the PAC learning framework, margin-based learning, and VC dimension.
Short programming assignments include hands-on experiments with various learning algorithms, and a larger course project gives students a chance to dig into an area of their choice.
See the syllabus for more details.
This course is designed to give a graduate-level student a thorough grounding in the methodologies, technologies, mathematics and algorithms currently needed by people who do research in machine learning.
There is no official textbook for the course. I will post all lecture notes and several readings on the course website.
Below is a list of recommended readings; we will mostly follow the first book.
- Christopher M. Bishop,
"Pattern Recognition and Machine Learning," Springer, 2011.
- Kevin P. Murphy,
"Machine Learning: a Probabilistic Perspective," The MIT Press, 2012. (optional)
- Tom Mitchell,
"Machine Learning," McGraw Hill, 1997. (optional)
- Ethem Alpaydin,
"Introduction to Machine Learning," The MIT Press, 2004. (optional)
- Trevor Hastie, Robert Tibshirani, and Jerome Friedman,
"The Elements of Statistical Learning: Data Mining, Inference, and Prediction," FREE! (optional)
BULLETIN BOARD and other info
MISC - FUN:
Fake (ML) protest