Photograph by Lutz Voigtlaender

Felix Voigtlaender

Professor of Mathematics — Reliable Machine Learning
at the Mathematical Institute for Machine Learning and Data Science
KU Eichstätt-Ingolstadt
Email: felix@voigtlaender.xyz
CV (last updated on 23 May 2020)


If you trust in yourself... and believe in your dreams... and follow your star... you'll still get beaten by people who spent their time working hard and learning things and weren't so lazy.

—Terry Pratchett, The Wee Free Men

New study program "Data Science" at the KU Eichstätt-Ingolstadt

In the winter semester 2022 (starting in September), the KU Eichstätt-Ingolstadt will establish a new BSc study program "Data Science" in Ingolstadt.
Please spread the word! If you have questions regarding the study program, please write me an email.
More information can be found here and here, and also at the DAAD homepage and the MIDS homepage.

Research interests

I am interested in the mathematics of machine learning, functional analysis, and harmonic analysis. More precisely, my main interests are:
  • Neural networks and their properties, in particular expressiveness and approximation properties
  • Existence of adversarial examples and possible remedies
  • Information-based complexity, in particular in connection with neural networks
  • Sampling
  • Solvability Complexity Index
  • Approximation theory
  • Function spaces and their embeddings
  • Multiscale systems (wavelets, curvelets, shearlets and generalizations)
  • Banach frames and atomic decompositions
  • Coorbit theory
  • Fourier analysis
As a mathematical hobby, I am interested in almost everything related to measure theory and in trying to crack toy problems like this one (which has since been solved).

Biography

Since November 2021, I have been a professor of mathematics (professorship Reliable Machine Learning) at the newly founded Mathematical Institute for Machine Learning and Data Science of the KU Eichstätt-Ingolstadt. I am currently based in Eichstätt, but the institute and my group will move to Ingolstadt in 2022.

From June 2021 through October 2021, I was an Emmy Noether independent junior research group leader at the TU Munich, associated with the group of Felix Krahmer and funded through my project Stability and Solvability in Deep Learning.

From June 2020 until May 2021, I was a Senior Scientist in the Department of Mathematics at the University of Vienna, part of the group of Philipp Grohs.

Before that, from February 2018 to May 2020, I worked as an "Akademischer Rat" as part of the Scientific Computing Group at KU Eichstätt, led by Götz Pfander.

From April 2016 until January 2018, I was a postdoctoral researcher at TU Berlin in the group of Gitta Kutyniok, where I worked on the DEDALE project. As part of that project, and with support from Anne Pein, I developed the DEDALE α-shearlet transform.

As a Ph.D. student, I worked in the group of Hartmut Führ at RWTH Aachen University, where I studied the approximation-theoretic properties of different multiscale systems. With my Ph.D. thesis 'Embedding Theorems for Decomposition Spaces with Applications to Wavelet Coorbit Spaces', I graduated with distinction in November 2015. In November 2016, I was awarded the Friedrich-Wilhelm-Award 2016 for my thesis.

I studied mathematics (Bachelor + Master) and computer science (Bachelor) at RWTH Aachen University, Germany, where I obtained my Master's degree with distinction in 2013.

I greatly enjoy teaching mathematics and am committed to explaining carefully and to presenting the material in an enjoyable way. At RWTH Aachen University, I was a teaching assistant for Analysis I and III and for Harmonic analysis. I am therefore very proud to have received the teaching award of the student council of mathematics at RWTH Aachen University.


News

01 November 2021

Update of the website. Added new information about my professorship at the KU Eichstätt-Ingolstadt.

11 June 2021

Update of the website. Added two preprints and updated my biography.

19 January 2021

Update of the website. Added two papers. In the first one, Andrei Caragea, Philipp Petersen, and I study the performance of neural networks for high-dimensional classification problems with structured class boundaries. In a nutshell, we show that if these boundaries are locally of Barron-type, then one obtains learning and approximation bounds with rates independent of the underlying dimension.
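For orientation, let me recall the classical (global) Barron condition in simplified notation; the paper works with a local variant of it for the class boundaries, so this is only a rough pointer, not a quote from the paper. A function f : ℝ^d → ℝ is a Barron function if its Fourier transform satisfies
$$C_f := \int_{\mathbb{R}^d} |\xi| \, |\widehat{f}(\xi)| \, d\xi < \infty,$$
and Barron's classical theorem shows that such a function can be approximated in L² (over a bounded domain) by shallow networks with N neurons up to an error of order C_f / \sqrt{N}, with constants that do not grow exponentially in the dimension d.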
In the second one, I generalize the classical universal approximation theorem to the setting of complex-valued neural networks. Under very mild continuity assumptions on the activation function, I show for shallow networks that universality holds if and only if the real or the imaginary part of the activation function is not polyharmonic. For networks with at least two hidden layers, universality holds if and only if the activation function is neither a polynomial, nor holomorphic, nor antiholomorphic.
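To sketch the setting in simplified notation (not a quote from the preprint; see there for the precise formulation): a shallow complex-valued network with activation function σ : ℂ → ℂ is a map of the form
$$f(z) = \sum_{j=1}^{N} c_j \, \sigma\big( w_j^{\top} z + b_j \big), \qquad z \in \mathbb{C}^d, \quad w_j \in \mathbb{C}^d, \quad b_j, c_j \in \mathbb{C},$$
and universality means that such networks are dense in C(K; ℂ) with respect to the uniform norm, for every compact set K ⊂ ℂ^d. Recall that a function u on ℂ ≅ ℝ² is polyharmonic if Δ^m u = 0 for some m ∈ ℕ.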
Added two talks, one of which is available on YouTube.
Continued the endless fight against link rot.

23 May 2020

Update of the website and of the CV. Added several new preprints and talks. Updated photograph.

15 September 2018

Update of the website and of the CV. Added several new preprints and talks.

8 April 2018

First update of the website since I moved from Berlin to Eichstätt. The CV is still out of date, however...

23 November 2017

After my talk about the approximation properties of ReLU neural networks at the Research seminar "Mathematics of Computation", I was asked for the slides of my talk. These can be found here.

Many thanks to Tino Ullrich for inviting me to give this talk!

23 September 2017

I just added two new preprints, which I recently uploaded to the arXiv, to my list of publications.

The first one establishes a rather general version of Price's theorem: It gives a simple formula for computing the partial derivatives of the map ρ ↦ 𝔼[g(Xᵨ)], where Xᵨ is a normally distributed random variable with covariance matrix ρ. Price published this result in 1958, but did not precisely state the required assumptions on g. In the paper, I show that one can in fact take g to be any tempered distribution. This result is used in the paper ℓ¹-Analysis Minimization and Generalized (Co-)Sparsity: When Does Recovery Succeed?, written by three of my colleagues.
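
To give a rough impression of the statement (my informal paraphrase, ignoring the diagonal entries and normalization conventions; the preprint contains the precise formulation), the generalized Price theorem says that for a centered Gaussian vector Xᵨ with covariance matrix ρ,
$$\frac{\partial}{\partial \rho_{ij}} \, \mathbb{E}\big[ g(X_\rho) \big] = \mathbb{E}\Big[ \frac{\partial^2 g}{\partial x_i \, \partial x_j}(X_\rho) \Big] \qquad \text{for } i \neq j,$$
where the derivatives of g are understood in the distributional sense if g is merely a tempered distribution.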

The second preprint, written jointly with my colleague Philipp Petersen, analyzes the approximation power of neural networks that use the ReLU activation function. We analyze how deep and wide such a neural network needs to be in order to approximate any "piecewise smooth" function. We also show that these bounds are sharp.
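
As a small illustration of the kind of building block that appears in this area (a standard observation, not necessarily the construction used in our preprint): the one-dimensional ReLU network
$$\Lambda(x) = 2\,\mathrm{ReLU}(x) - 4\,\mathrm{ReLU}\big(x - \tfrac{1}{2}\big) + 2\,\mathrm{ReLU}(x - 1), \qquad \mathrm{ReLU}(x) = \max\{0, x\},$$
exactly realizes the hat function that vanishes outside [0, 1], equals 1 at x = 1/2, and is linear in between. Combining and composing such localized pieces is a typical starting point for approximating piecewise smooth functions with ReLU networks.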

I also added several talks that I gave in the last months, including the lecture notes for the lecture series "Sparsity Properties of Frames via Decomposition Spaces" that I gave at the Summer School on Applied Harmonic Analysis in Genoa.

09 February 2017

I just added to my list of publications a new preprint that I uploaded to the arXiv in December.

Furthermore, I am very happy to be a speaker at the Summer School on Applied Harmonic Analysis that will take place in Genoa from July 24 to 28, 2017.

Finally, I want to mention that my toy problem (about whether completeness of spaces can be characterized by the convergence of Neumann series) was solved (negatively) quite a while ago.
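
For context, the direction that does hold is standard: if X is a Banach space and T : X → X is a bounded linear operator with \|T\| < 1, then the Neumann series converges in the operator norm and inverts I - T,
$$(I - T)^{-1} = \sum_{n=0}^{\infty} T^{n}.$$
The toy problem asked, roughly, whether a converse holds, i.e. whether the convergence of all such Neumann series already forces a normed space to be complete; as mentioned, the answer turned out to be no.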

28 July 2016

First version of this website was uploaded :)