Lecture notes

All of our lecture notes were written specifically for Jacobs courses. They are detailed and comprehensive, so that they can completely replace external textbooks. Feel free to distribute them, provided you give credit to the respective authors.

Lecture Coverage
Formal languages and logic (undergraduate): Regular and context-free languages with their automata and grammars; first-order logic, the mathematical basis of logic programming
Computability and complexity (undergraduate): Turing machines, random access machines, recursive functions, lambda calculus, undecidability theorems; complexity classes, hierarchy and cross-class theorems, NP-completeness, model-theoretic characterizations of complexity
Machine learning (undergraduate): Curse of dimensionality and feature extraction, K-means clustering (see the sketch after this list), linear regression, learning optimal decision functions, the bias-variance dilemma, regularization, cross-validation, multilayer perceptrons, gradient descent optimization and the backpropagation algorithm, a probability refresher
Machine learning (graduate): Bias-variance dilemma and curse of dimensionality; time series prediction and Takens' theorem; essentials of probability and estimation theory; linear classifiers and RBF networks; K-means clustering; linear adaptive filters and the LMS algorithm; multilayer perceptrons; recurrent neural networks; hidden Markov models and the EM algorithm
Algorithmical and statistical modeling (graduate): Essentials of probability and estimation theory; multivariate Gaussians; representing and estimating distributions by mixtures of Gaussians and Parzen windows; maximum likelihood estimation algorithms based on gradient descent and on EM; elementary and MCMC sampling methods, with a demo constructing phylogenetic trees; simulated annealing and the energy interpretation of distributions, spin glass models, Boltzmann machines; Bayesian networks and graphical models, the join-tree inference algorithm; an introduction to fuzzy logic
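
K-means clustering recurs in several of the courses above. As a flavor, here is a minimal NumPy sketch of Lloyd's algorithm; it is an illustrative toy, not code from the lecture notes, and the toy data, parameter names, and initialization scheme are arbitrary choices.

    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        # Lloyd's algorithm: alternate between assigning each point to its
        # nearest centroid and moving each centroid to the mean of its points.
        rng = np.random.default_rng(seed)
        centroids = X[rng.choice(len(X), size=k, replace=False)]  # init from data
        for _ in range(n_iter):
            # assignment step: index of the nearest centroid for every point
            dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # update step: each centroid moves to the mean of its cluster
            new_centroids = np.array(
                [X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                 for j in range(k)])
            if np.allclose(new_centroids, centroids):  # converged
                break
            centroids = new_centroids
        return centroids, labels

    # toy usage: two well-separated Gaussian blobs in the plane
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0.0, 0.5, (50, 2)), rng.normal(4.0, 0.5, (50, 2))])
    centroids, labels = kmeans(X, k=2)
    print(centroids)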

Principles of Statistical Modeling (graduate)

These lecture notes provide a detailed, example-rich, carefully explained, mathematically rigorous introduction to the basic concepts of probability theory; unlike most introductory texts for non-mathematicians, they do not shy away from sigma-fields. The differing mindsets of frequentist vs. Bayesian statistics are explained, and an overview is given of the uses of probability concepts in the natural sciences, signal processing, statistics, mathematics, and machine learning. Statistical thinking is introduced along the route laid out by J. C. Kiefer.
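
To make the frequentist vs. Bayesian contrast concrete, here is a tiny Python sketch (not taken from the notes; the coin-flip data and the uniform Beta(1, 1) prior are arbitrary illustrative choices):

    # Estimating a coin's head probability from 7 heads in 10 tosses
    heads, n = 7, 10

    # Frequentist point estimate: the maximum-likelihood estimate
    p_mle = heads / n                # 0.700

    # Bayesian estimate: with a uniform Beta(1, 1) prior, the posterior is
    # Beta(1 + heads, 1 + n - heads); its mean is Laplace's rule of succession
    a, b = 1 + heads, 1 + n - heads
    p_posterior_mean = a / (a + b)   # 0.667

    print(p_mle, p_posterior_mean)

The frequentist answer is a single point estimate, while the Bayesian answer is a full posterior distribution, of which the mean shown here is only one summary.
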
Machine Learning: a general introduction for non-computer-scientists (transdisciplinary; 4-week course module): Richly illustrated, low-math, low-tech introduction to machine learning, with an emphasis on neural networks and many examples (annotated slides)
Boolean logic and some elements of computational complexity (transdisciplinary; 4-week course module): Step-by-step explanation of the very basic operations of digital information processing (see the gate-level sketch after this list), plus a glance at how “computation” at large can be understood
Essentials of measure theory, integration, and probability (graduate tutorial, by Manjunath Gandhi): A “pocket guide” summary of the main definitions and theorems
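
As a flavor of the Boolean-logic module, the following illustrative sketch (not from the course material) shows how the NAND gate alone suffices to build a half adder, one of the most basic digital circuits:

    def nand(a, b):
        # universal gate: every Boolean function can be built from NAND alone
        return 1 - (a & b)

    def half_adder(a, b):
        # the standard 4-NAND construction of XOR yields the sum bit;
        # the carry bit is AND(a, b) = NOT(NAND(a, b))
        t = nand(a, b)
        s = nand(nand(a, t), nand(b, t))
        carry = nand(t, t)
        return s, carry

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", half_adder(a, b))  # sum and carry bits of a + b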