Machine Learning Course SS 14 U Stuttgart
DATE OF THE WRITTEN EXAM: August 26, 11:00-14:00, room V 55.02
-- Please always check the Prüfungsamt (examination office) pages.
See my general teaching page for previous versions of this lecture.
Exploiting large-scale data is a central challenge of our
time. Machine Learning is the core discipline to address this
challenge, aiming to extract useful models and structure from
data. Studying Machine Learning is motivated in multiple ways: 1) as
the basis of commercial data mining (Google, Amazon, Picasa, etc.), 2) as
a core methodological tool for data analysis in all sciences (vision,
linguistics, software engineering, but also biology, physics,
neuroscience, etc.), and finally 3) as a core foundation of autonomous
intelligent systems.
This lecture gives an introduction to modern methods in Machine Learning,
including discriminative as well as probabilistic generative models. A
preliminary outline of topics is:
- motivation
- probabilistic modeling and inference
- regression and classification methods (kernel methods, Gaussian Processes, Bayesian kernel logistic regression, and their relations; a short illustrative sketch follows below)
- discriminative learning (logistic regression, Conditional Random Fields)
- feature selection
- boosting and ensemble learning
- representation learning and embedding (kernel PCA and derivatives, deep learning)
- graphical models
- inference in graphical models (MCMC, message passing, variational)
- learning in graphical models
Students should bring basic knowledge of linear algebra, probability theory and
optimization.
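To give a concrete flavour of the regression methods in the outline above, here is a minimal, self-contained sketch of kernel ridge regression with a squared-exponential kernel, using only numpy. This is an illustration added for this page, not part of the official course material; the Gaussian Process regression mean has the same form, with the regularization constant lam playing the role of the observation noise variance.

import numpy as np

def rbf_kernel(A, B, bandwidth=0.5):
    # Squared-exponential kernel k(a, b) = exp(-(a - b)^2 / (2 * bandwidth^2))
    # for 1-D inputs; returns the len(A) x len(B) kernel matrix.
    sq_dists = (A[:, None] - B[None, :]) ** 2
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 30))        # training inputs
y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)    # noisy targets

lam = 1e-2                                             # ridge regularizer
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # dual weights: (K + lam*I) alpha = y

X_test = np.linspace(0.0, 2.0 * np.pi, 5)
y_pred = rbf_kernel(X_test, X) @ alpha                 # prediction f(x*) = k(x*, X) alpha
print(np.round(y_pred, 2))                             # should roughly track sin(x*)

Solving the dual system directly is fine at this scale; for larger data sets one would use a Cholesky factorization or low-rank approximations.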
- Organization
- This is the central website of the lecture. Links to slides, exercise sheets, announcements, etc. will all be posted here.
- See the 01-introduction slides for further information.
- Schedule, slides & exercises
- Literature
[1] The Elements of Statistical Learning: Data Mining, Inference, and Prediction
by Trevor Hastie, Robert Tibshirani and Jerome Friedman. Springer, Second Edition, 2009.
full online version available
(recommended: read the introductory chapter)
[2] Pattern Recognition and Machine Learning
by C. M. Bishop. Springer, 2006.
online
(especially chapter 8, which is fully online)
[email by Stefan Otte:] This is a nice little (26-page) linear
algebra and matrix calculus reference. It's used for the ML class at
Stanford. Maybe it's interesting for your ML class.
link
[email by Stefan Otte:]
Feature selection, L1 vs. L2 regularization, and rotational invariance
Paper: link
Comments: link
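The paper linked above compares L1 and L2 regularization for logistic regression in the presence of many irrelevant features. As a small illustrative sketch (assuming scikit-learn is available; this is not code from the paper or the lecture), the following fits both variants on synthetic data and counts how many weights end up exactly zero, which is the sense in which L1 performs implicit feature selection.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# 100 features, only 10 of which are actually informative about the label.
X, y = make_classification(n_samples=500, n_features=100, n_informative=10,
                           n_redundant=0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3,
                                                    random_state=0)

for penalty in ("l1", "l2"):
    # liblinear supports both penalties; C is the inverse regularization strength.
    clf = LogisticRegression(penalty=penalty, C=0.1, solver="liblinear")
    clf.fit(X_train, y_train)
    n_nonzero = int(np.sum(clf.coef_ != 0))
    print(f"{penalty}: test accuracy {clf.score(X_test, y_test):.2f}, "
          f"non-zero weights {n_nonzero}/{X.shape[1]}")

On data of this kind the L1-regularized model typically keeps far fewer non-zero weights while reaching comparable test accuracy.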