Lecture Notes

For his main regular lectures, Herbert Jaeger has written fully self-contained lecture notes, which anyone on this planet is invited to download and use.

Lecture Coverage
Formal languages and logic (undergraduate): Regular and context-free languages with their automata and grammars; first-order logic, mathematical basis of logic programming
Computability and complexity (undergraduate): Turing machines, random access machines, recursive functions, lambda calculus, undecidability theorems; complexity classes, hierarchy and cross-class theorems, NP-completeness, model-theoretic characterizations of complexity
Machine learning (undergraduate): Essentially the same as the graduate ML lecture notes (below), but simplified in some parts.
Machine learning (graduate): Bias-variance dilemma and curse of dimensionality; time series prediction and Takens theorem; essentials of probability and estimation theory; linear classifiers and RBF networks; K-means clustering; linear adaptive filters and LMS algorithm; multilayer perceptrons; recurrent neural networks; hidden Markov models and the EM algorithm
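To give a flavor of one topic on this list, here is a minimal K-means sketch (Lloyd's algorithm) in plain Python. It is an illustrative toy, not taken from the lecture notes; the deterministic initialization from the first k points is an assumption made only to keep the sketch reproducible.

```python
def kmeans(points, k, iters=100):
    """Lloyd's algorithm for K-means clustering on 2-D points.

    Initialization from the first k points is naive; real
    implementations use random or k-means++ initialization.
    """
    centers = list(points[:k])
    for _ in range(iters):
        # Assignment step: attach each point to its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda i: (p[0] - centers[i][0]) ** 2
                                + (p[1] - centers[i][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to the mean of its cluster;
        # keep an empty cluster's center where it is.
        new_centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
        if new_centers == centers:  # converged
            break
        centers = new_centers
    return centers

data = [(0.1, 0.0), (0.0, 0.2), (0.2, 0.1),
        (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(data, 2))
```

On these two well-separated point clouds, the algorithm converges in a few iterations to the two cluster means.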
Algorithmic and statistical modeling (graduate): Essentials of probability and estimation theory; multivariate Gaussians; representing and estimating distributions by mixtures of Gaussians and Parzen windows; maximum likelihood estimation algorithms based on gradient descent and on EM; elementary and MCMC sampling methods, demo: constructing phylogenetic trees; simulated annealing and energy interpretation of distributions, spin glass models, Boltzmann machines; Bayesian networks and graphical models, join-tree inference algorithm; introduction to fuzzy logic
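As an illustration of the elementary MCMC sampling mentioned in this entry, here is a minimal random-walk Metropolis sketch in Python. It is an assumed toy example, not from the notes: the target is a standard normal density known only up to its normalizing constant.

```python
import random


def metropolis(log_density, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis: the simplest MCMC sampler.

    log_density gives the target's log-density up to an
    additive constant; the constant cancels in the ratio.
    """
    import math
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)  # symmetric proposal
        # Accept with probability min(1, p(proposal) / p(x)),
        # computed from the log-density difference.
        delta = log_density(proposal) - log_density(x)
        if delta >= 0 or rng.random() < math.exp(delta):
            x = proposal
        samples.append(x)
    return samples


# Target: standard normal, log-density -x^2/2 up to a constant.
samples = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With 20000 correlated samples, the empirical mean and variance land close to the target's values 0 and 1.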

Principles of Statistical Modeling (graduate and undergraduate 3rd year)

Part 1: Face to Face with Probability. Parts 2 & 3: Practical probability and basics of statistical methods (by Prof. A. Wilhelm). Part 4: Machine Learning in a Tiny Nutshell.

This lecture combines two views on data modeling that are usually kept apart and even taught in different programs: 1. the "classical" statistical perspective, which is the foundation for the social sciences, management sciences, and basic empirical research analysis in the natural sciences; 2. the machine learning view. Topics: rigorous and detailed (re-)introduction of basic probability theory concepts -- probability spaces, sigma-fields, random variables, distributions, samples. Basics of estimation theory -- loss, risk, admissible estimators, estimator properties. Basic distributions. Hypothesis testing. Curse of dimensionality, feature extraction, bias-variance tradeoff, regularization. Elementary feedforward neural networks.
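As a tiny illustration of the regularization and bias-variance theme listed above (a hypothetical toy, not material from the course), consider one-dimensional ridge regression without intercept. The closed-form weight is w = Σxy / (Σx² + λ), so a larger λ shrinks w toward zero, accepting more bias in exchange for less variance.

```python
def ridge_1d(xs, ys, lam):
    """Closed-form ridge regression for a single feature,
    no intercept: w = sum(x*y) / (sum(x*x) + lam)."""
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    return sxy / (sxx + lam)


xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]        # roughly y = 2x plus noise

w0 = ridge_1d(xs, ys, lam=0.0)   # ordinary least squares
w1 = ridge_1d(xs, ys, lam=30.0)  # heavily regularized: shrunk weight
```

Here λ = 30 equals Σx² = 30, so the regularized weight is exactly half the least-squares weight.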
Machine Learning: a general introduction for non-computer-scientists (transdisciplinary; 4-week course module): Richly illustrated, low-math, low-tech introduction to machine learning, with emphasis on neural networks and many examples
Boolean logic and some elements of computational complexity (transdisciplinary; 4-week course module): Step-by-step explanation of the very basic operations of digital information processing, plus a glance at how "computation" at large can be understood
Essentials of measure theory, integration, and probability (graduate tutorial, by Manjunath Gandhi): A "pocket guide" summary of main definitions and theorems