DSME Colloquium: Markus Lange-Hegermann on Constraining Gaussian Processes

Friday, 18 November 2022

On Friday, 18 November 2022, at 10:00, we welcome Markus Lange-Hegermann, professor of mathematics and data science at TH Ostwestfalen-Lippe, to the DSME Colloquium, where he will give a talk on "Constraining Gaussian Processes to Systems of (Ordinary and Partial) Linear Differential Equations", followed by a Q&A session. Please find the abstract of the talk and the biography of the speaker below.

This DSME Colloquium will be held in a hybrid format, with the talk taking place in the DSME seminar room and streamed via Zoom. To receive the link to the Zoom room, please subscribe to the DSME Colloquium mailing list (by sending an email with the subject "subscribe" to ). We will also announce future talks of the DSME Colloquium on this list.

If you want to attend the DSME colloquium in person, please contact our office at .


Constraining Gaussian Processes to Systems of (Ordinary and Partial) Linear Differential Equations


We consider Gaussian processes, a classical stochastic framework that has recently gained prominence in machine learning. We construct Gaussian process priors that concentrate their probability mass on solutions of certain linear differential equations. This yields a strong inductive bias and allows us to construct precise models by combining differential equations with very few data points. Non-trivial applications of computer algebra are necessary to construct these covariance functions. We demonstrate the approach on several graphical examples.
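To give a flavor of the idea in its simplest form (this NumPy sketch is our own illustration, not the speaker's construction or code): when the solution set of a linear ODE is the image of a linear map, pushing a Gaussian prior on the parameter through that map yields a Gaussian process prior whose sample paths all solve the equation. For the ODE f' - f = 0, the solutions are f(x) = c·exp(x), so c ~ N(0, 1) induces a (degenerate) GP with covariance k(x, x') = exp(x + x').

```python
import numpy as np

# ODE: f' - f = 0, solution space {c * exp(x) : c real}.
# Pushing c ~ N(0, 1) through c -> c * exp(.) gives a degenerate GP
# with covariance k(x, x') = exp(x) * exp(x') = exp(x + x').
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
K = np.exp(x[:, None] + x[None, :])

# Draw a sample path via a jittered Cholesky factor of the covariance.
L = np.linalg.cholesky(K + 1e-10 * np.eye(len(x)))
f = L @ rng.standard_normal(len(x))

# Every sample path satisfies the ODE up to the jitter:
# f(x) / exp(x) is (numerically) constant, i.e. f = c * exp(x).
ratio = f / np.exp(x)
print(np.max(np.abs(ratio - ratio[0])))  # close to zero (jitter-level)
```

For systems of ODEs and PDEs the parametrization of the solution set is where, as the abstract notes, non-trivial computer algebra enters.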


Professor Markus Lange-Hegermann completed his PhD (2014) on algorithmic differential algebra at RWTH Aachen. While working in industry, he fell in love with data science and machine learning. He has been professor of mathematics and data science at TH OWL since 2018 and is a board member of the Institute for Industrial Information Technology (inIT). His research centers on probabilistic machine learning. He is interested in industrial applications, often in data-based modeling and subsequent optimization, in particular considering time dependencies, uncertainties, and physical knowledge given by differential equations. He loves organizing data-science-focused hackathons, seeing machine learning through algebraic lenses, and Bayes' theorem.