Teaching
Courses
Seminar: Physics-informed Machine Learning (2024-2025)
Final Course GitHub Repository
This seminar explored influential papers in physics-informed machine learning, covering topics such as Gaussian process PDE solvers, Neural ODEs, Neural Operators, hybrid models, and foundation models for PDEs. Students prepared and presented self-contained tutorials based on selected papers, discussing how physical knowledge can be encoded in ML models and where these approaches reach their limits. The final tutorials are available in the linked GitHub repository.
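The recurring theme of encoding physical knowledge in an ML model can be illustrated with a physics-informed loss term. The snippet below is a minimal sketch (assuming PyTorch), not seminar material: it penalizes the residual of the 1D heat equation u_t = alpha * u_xx at collocation points; the network architecture, diffusion coefficient, and collocation points are illustrative choices.

```python
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
alpha = 0.1  # diffusion coefficient (illustrative)

def heat_residual(x, t):
    """Residual of u_t = alpha * u_xx; zero wherever the PDE is satisfied."""
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    u = net(torch.cat([x, t], dim=1))
    u_t = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
    u_x = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    u_xx = torch.autograd.grad(u_x.sum(), x, create_graph=True)[0]
    return u_t - alpha * u_xx

# Physics term evaluated on random collocation points; in a PINN-style model
# this is added to the usual data/boundary loss and minimized jointly.
x_c, t_c = torch.rand(256, 1), torch.rand(256, 1)
physics_loss = heat_residual(x_c, t_c).pow(2).mean()
```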
Supervised student projects
Probabilistic extension of classic ODE/PDE solvers (2024) [Master Thesis]
This project investigated how to incorporate the error estimates of classic ODE solvers (e.g., Runge-Kutta methods) into a probabilistic framework via Bayesian filtering (e.g., the Extended Kalman Filter or sigma-point methods). The solver was treated as a black box, and the work focused on using its error output for tasks such as trajectory smoothing and parameter estimation, bridging classical numerics and modern probabilistic techniques.
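As a minimal illustration of the core idea (not the thesis code), the sketch below uses Heun's method with an embedded Euler step as a stand-in black-box solver: the local error estimate it reports is recycled as process noise in a scalar Kalman-filter-style update against noisy observations. The ODE, step size, and noise levels are assumptions made for the example.

```python
import numpy as np

def heun_step(f, x, h):
    """Heun step with embedded Euler: returns next state and error estimate."""
    k1 = f(x)
    k2 = f(x + h * k1)
    x_high = x + 0.5 * h * (k1 + k2)        # 2nd-order step
    x_low = x + h * k1                      # embedded 1st-order (Euler) step
    return x_high, np.abs(x_high - x_low)   # difference = local error estimate

f = lambda x: -0.5 * x                      # toy linear ODE dx/dt = -0.5 x
h, R = 0.1, 0.05**2                         # step size, observation noise var.
x_mean, x_var = np.array([1.0]), np.array([1e-6])

for k in range(50):
    # Predict: black-box solver step; inflate variance by its error estimate.
    x_pred, err = heun_step(f, x_mean, h)
    var_pred = x_var + err**2               # error estimate -> process noise
    # Update: assimilate a (simulated) noisy observation of the state.
    y = np.exp(-0.5 * h * (k + 1)) + np.random.normal(0.0, np.sqrt(R))
    gain = var_pred / (var_pred + R)
    x_mean = x_pred + gain * (y - x_pred)
    x_var = (1.0 - gain) * var_pred
```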
Uncertainty quantification for small Large Language Models (2023-2024) [Bachelor Thesis]
This project explored the weight-space curvature of small transformer models (minGPT), with a focus on the Laplace approximation using the Generalized Gauss-Newton (GGN) matrix. Working with a model trained on Shakespearean text, it examined weight posteriors and predictive sampling in a Bayesian treatment of language models.
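For illustration, a diagonal Laplace approximation with GGN-style curvature can be sketched as below (assuming PyTorch). This is a minimal example on a tiny softmax classifier rather than minGPT; it uses the fact that for cross-entropy losses the GGN coincides with the Fisher, estimated here by Monte Carlo sampling of labels from the model's own predictions. Model, data, and prior precision are illustrative.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
model = torch.nn.Linear(16, 4)          # stand-in for a trained (MAP) network
X = torch.randn(128, 16)                # stand-in for training inputs
prior_prec = 1.0                        # isotropic Gaussian prior precision

# Diagonal GGN == diagonal Fisher for softmax cross-entropy; estimate it by
# sampling labels from the model's own predictive distribution (MC Fisher).
params = dict(model.named_parameters())
ggn_diag = {n: torch.zeros_like(p) for n, p in params.items()}
for x in X:
    logits = model(x)
    y = torch.distributions.Categorical(logits=logits).sample()
    loss = F.cross_entropy(logits.unsqueeze(0), y.unsqueeze(0))
    grads = torch.autograd.grad(loss, tuple(params.values()))
    for n, g in zip(params, grads):
        ggn_diag[n] += g ** 2

# Laplace posterior: mean = MAP weights, precision = GGN + prior precision.
post_var = {n: 1.0 / (ggn_diag[n] + prior_prec) for n in ggn_diag}

# Predictive sampling: perturb the weights with one posterior sample, predict.
with torch.no_grad():
    for n, p in params.items():
        p += torch.randn_like(p) * post_var[n].sqrt()
    probs = F.softmax(model(X[:1]), dim=-1)
```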