
EECE 5644: Course Overview

• Syllabus
• Goals
• Evaluation

Course Objectives (1)

• This is an introductory course on machine learning covering a range of algorithms, focusing on the underlying models behind each approach, to enable students to learn where and how to apply machine learning algorithms and why they work.
• The course also emphasizes the foundations to prepare students for research in machine learning.


Course Objectives (2)

• To provide you with a deeper knowledge about a subtopic of your choosing.
• In addition you will have improved and practiced the following research skills:
  • Writing a technical paper in your chosen area.
  • Giving a technical presentation.

Evaluation

Class project (40%): This will be a project in an area of your choosing, which is due at the end of the semester.
One Exam (30%)
Homeworks and Computer Projects (30%): Homeworks and computer projects will be assigned every other week.

You’ll be expected to participate in class discussion.

No Cheating

Handouts will be posted on Blackboard:
http://blackboard.neu.edu

Homeworks are collected in class, and scanned versions must also be uploaded to Blackboard by 11:59 pm on the day the HW is due.

Prerequisite
EECE 3468/MATH 3081 or equivalent for undergraduates; EECE 7204/DS5020 or equivalent for graduate students; knowledge of linear algebra.

Programming Requirement
C/C++/Python and Matlab


Recommended Textbook

• Pattern Classification, Second Edition, by R. O. Duda, P. E. Hart, and D. Stork, Wiley and Sons, 2001.

Supplementary Texts

• Pattern Recognition and Machine Learning, by Christopher M. Bishop, Springer, 2006.
• Machine Learning: A Probabilistic Perspective, by Kevin P. Murphy, MIT Press, 2012.
• The Elements of Statistical Learning: Data Mining, Inference, and Prediction, by T. Hastie, R. Tibshirani, and J. H. Friedman, Springer, 2001.

Machine Learning

Tom Mitchell’s Definition of Learning

A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

Example: Learn to play checkers
T: play checkers and win
P: % of games won in the world tournament
E: play against self and others
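A toy numeric sketch of this definition (my own illustration, not from the slides): take T to be predicting a binary label, P to be accuracy on held-out samples, and E to be the number of labelled examples the program has seen; P tends to improve as E grows. All names and numbers below are made up.

    import numpy as np

    # Hypothetical toy setup: T = predict a binary label, P = accuracy on
    # held-out data, E = number of labelled examples seen so far.
    rng = np.random.default_rng(0)
    true_p = 0.7  # unknown probability that the label is 1

    def performance_after(experience):
        labels_seen = rng.random(experience) < true_p    # experience E
        learned_guess = int(labels_seen.mean() > 0.5)    # rule learned for task T
        held_out = rng.random(1000) < true_p
        return np.mean(held_out == learned_guess)        # performance measure P

    for e in (1, 10, 100):
        print(f"E = {e:3d}  ->  P = {performance_after(e):.2f}")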

Pattern Recognition

“the act of taking in raw data and taking an action based on the ‘category’ of the pattern” – Duda, Hart and Stork

Example Applications


Example

Represent each image or data point as a feature vector (e.g., brightness and homogeneity).

Training Samples
[Figure: training samples plotted in the (brightness, homogeneity) feature space.]

[Figures: candidate decision boundaries in the (brightness, homogeneity) feature space.]
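A minimal sketch of such a feature mapping (my own illustration; the slides do not give formulas): "brightness" is taken to be the mean pixel intensity and "homogeneity" a simple inverse-variance stand-in, and the two sample "images" are synthetic arrays.

    import numpy as np

    def extract_features(image):
        # Map a grayscale image (2-D array of intensities in [0, 1]) to a
        # 2-D feature vector: (brightness, homogeneity). Both formulas are
        # assumed stand-ins chosen only for illustration.
        brightness = image.mean()                # average pixel intensity
        homogeneity = 1.0 / (1.0 + image.var())  # high when intensities are uniform
        return np.array([brightness, homogeneity])

    # Two synthetic "training samples" standing in for images of two classes.
    rng = np.random.default_rng(0)
    bright_uniform = np.clip(rng.normal(0.8, 0.05, size=(32, 32)), 0, 1)
    dark_textured = np.clip(rng.normal(0.3, 0.25, size=(32, 32)), 0, 1)

    print(extract_features(bright_uniform))  # e.g. [~0.80, ~1.00]
    print(extract_features(dark_textured))   # e.g. [~0.31, ~0.95]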
Problem:
[Figure: a decision boundary in the (brightness, homogeneity) feature space.]

“Entia non sunt multiplicanda praeter necessitatem”

“Entities are not to be multiplied without necessity”
-- William of Occam

[Figure: decision boundaries in the (brightness, homogeneity) feature space, labelled “Different tasks” and “Optimal trade-off”.]
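A small illustrative experiment on this trade-off (my own construction, not from the slides): a low-degree and a high-degree polynomial are fit to the same noisy training points; the more flexible model tracks the training samples more closely but typically generalizes worse, which is the kind of unnecessary complexity Occam’s razor warns against.

    import numpy as np

    rng = np.random.default_rng(0)
    x_train = np.linspace(0, 1, 10)
    y_train = np.sin(2 * np.pi * x_train) + rng.normal(0, 0.2, x_train.size)
    x_test = np.linspace(0, 1, 200)
    y_test = np.sin(2 * np.pi * x_test)  # noise-free values for measuring generalization

    for degree in (3, 9):
        coeffs = np.polyfit(x_train, y_train, degree)  # fit a polynomial of the given degree
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree}: train MSE {train_mse:.3f}, test MSE {test_mse:.3f}")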

Design Cycle
Incorporating Knowledge


Knowledge Discovery in Databases

The process of extracting valid, previously unknown, and ultimately comprehensible information from large databases.

U. Fayyad, G. Piatetsky-Shapiro, and P. Smyth (1995), Advances in Knowledge Discovery and Data Mining

An Overview of the KDD Process


Learning from Data

• Supervised Learning:
  each example is categorized/labelled, or alternatively you receive feedback after each decision.
  Ex: Classification, Regression
• Reinforcement Learning:
  feedback after a sequence of actions/decisions.
• Unsupervised Learning:
  no feedback.
  Ex: Clustering - goal is to group data into similar groups.

[Diagram: Input, X → Learning Algorithm → Prediction, Y]
Supervised Learning

Given: Training examples <x, y=f(x)> for some unknown function f.
Find: A good approximation to f (see the sketch after the examples below).

Example Applications:
• Credit Risk Assessment
  – x: Properties of customer and proposed purchase
  – f(x): approve purchase or not
• Disease Diagnosis
  – x: Properties of patient (symptoms, lab tests)
  – f(x): Disease (recommended therapy)
• Stock market prediction
  – x: Economic indicators (S&P, Dow Jones, interest rates)
  – f(x): Whether it will go up or down
• Steering a vehicle
  – x: Bitmap of road surface in front of car
  – f(x): Degrees to turn the steering wheel
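The sketch referenced above (my own illustration, not from the slides): given made-up labelled pairs <x, y = f(x)>, a 1-nearest-neighbour rule serves as one simple "good approximation to f".

    import numpy as np

    # Labelled training pairs <x, y = f(x)>: each x is a 2-D feature vector,
    # y is the unknown target function evaluated at x. The numbers are made up.
    X_train = np.array([[0.9, 0.2], [0.8, 0.3], [0.2, 0.9], [0.1, 0.8]])
    y_train = np.array([0, 0, 1, 1])  # class labels

    def approximate_f(x, X=X_train, y=y_train):
        # 1-nearest-neighbour approximation to f: return the label of the
        # closest training example.
        distances = np.linalg.norm(X - x, axis=1)
        return y[np.argmin(distances)]

    print(approximate_f(np.array([0.85, 0.25])))  # -> 0
    print(approximate_f(np.array([0.15, 0.85])))  # -> 1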


Unsupervised Learning

Given: Training examples <x> for some unknown function f(x).
Find: A good approximation to f.
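Since the outline below lists k-means clustering as an unsupervised learning topic, here is a minimal sketch of it (my own illustration, not from the slides): unlabelled points <x> are grouped into k clusters by alternating nearest-center assignment and mean updates; the data is synthetic.

    import numpy as np

    def kmeans(X, k, iters=20, seed=0):
        # Minimal k-means (Lloyd's algorithm): group unlabelled points into k clusters.
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]  # random initial centers
        for _ in range(iters):
            # Assign each point to its nearest center.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = np.argmin(dists, axis=1)
            # Move each center to the mean of its points (keep old center if empty).
            centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return labels, centers

    # Unlabelled data <x>: two well-separated blobs in a 2-D feature space.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal([0, 0], 0.3, (50, 2)), rng.normal([3, 3], 0.3, (50, 2))])
    labels, centers = kmeans(X, k=2)
    print(centers)  # roughly [0, 0] and [3, 3], in some order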

Tentative Class Outline

Topics                                                     Date                      Reading
Introduction                                               9/6                       Chapter 1
Bayesian Decision Theory                                   9/11, 9/13, 9/18 (HW1)    Chapter 2
Maximum-Likelihood and Bayesian Parameter Estimation       9/20, 9/25 (HW2), 9/27    Chapter 3
  (ML, MAP, full Bayesian), Naïve Bayes*, Bayesian
  Networks*
Midterm Exam                                               10/16
Mixture Models (Mixture of Gaussians, EM algorithm)        10/2, 10/4                Chapter 10.1-10.4.2, 10.5
Unsupervised Learning: k-means clustering, criterion       10/11, 10/18              Chapter 10
  functions, hierarchical clustering
Model Selection                                            10/23 (HW3)
Support Vector Machines                                    10/25, 10/30 (PR)         Chapter 5
Neural Networks, Deep Learning                             11/1, 11/6 (HW4),         Chapter 6
                                                           11/8, 11/13 (HW5)
Component Analysis, Discriminants, and feature selection   11/15, 11/20              Chapter 3.8, 4
Decision Trees                                             11/27                     Chapter 8.2-8.4
Combining Classifiers: Bagging and Boosting                11/29                     Chapter 9
Special Topics                                             12/4, 12/6
Project Presentations (if needed) (Final Report)           Final Exam Day

Name:
E-mail:
Department:
Graduate/Undergraduate:
Year:
Area of Specialization/Interest:
Research adviser:
*Company affiliation if any:
Related Courses:
Strength (Programming/Math):

*You may withhold this information.
