I am currently a Postdoc in Patrick Rebeschini’s group at the University of Oxford, funded by a Carlsberg Internationalisation Fellowship. Before this, I was a PhD student in the Foundations of Machine Learning Group, led by Kasper Green Larsen, whom I am grateful to have had as my advisor. I received a Bachelor’s degree from the Mathematics Department and a Master’s degree from the Statistics Department, both at Aarhus University. My primary research interest is learning theory.

Publications

I thank my brilliant co-authors for their insights and collaboration. All publications have authors ordered alphabetically, as is standard in the CS theory community.

2026

Agnostic Language Identification and Generation
Co-Author: Chirag Pabbaraju
Manuscript: [arXiv]

The Optimal Sample Complexity of Linear Contracts
Manuscript: [arXiv]

Sample-Near-Optimal Agnostic Boosting with Improved Running Time
Co-Authors: Arthur da Cunha, Andrea Paudice
Conference: [ALT 2026]

2025

Revisiting Agnostic Boosting
Co-Authors: Arthur da Cunha, Andrea Paudice, Yuxin Sun
Conference: [NeurIPS 2025]

On Agnostic PAC Learning in the Small Error Regime
Co-Authors: Julian Asilis, Grigoris Velegkas
Conference: [NeurIPS 2025] - Spotlight (top 3.12% of submissions)

Uniform Mean Estimation for Heavy-Tailed Distributions via Median-of-Means
Co-Author: Andrea Paudice
Conference: [ICML 2025]

Improved Margin Generalization Bounds for Voting Classifiers
Co-Author: Kasper Green Larsen
Conference: [COLT 2025]

Understanding Aggregations of Proper Learners in Multiclass Classification
Co-Authors: Julian Asilis, Grigoris Velegkas
Conference: [ALT 2025]

Efficient Optimal PAC Learning
Conference: [ALT 2025]

2024

Majority-of-Three: The Simplest Optimal Learner?
Co-Authors: Ishaq Aden-Ali, Kasper Green Larsen, Nikita Zhivotovskiy
Conference: [COLT 2024]

Optimal Parallelization of Boosting
Co-Authors: Arthur da Cunha, Kasper Green Larsen
Conference: [NeurIPS 2024] - Oral (top 0.39% of submissions)

The Many Faces of Optimal Weak-to-Strong Learning
Co-Authors: Kasper Green Larsen, Markus Engelund Mathiasen
Conference: [NeurIPS 2024]

Sparse Dimensionality Reduction Revisited
Co-Authors: Lior Kamma, Kasper Green Larsen, Jelani Nelson, Chris Schwiegelshohn
Conference: [ICML 2024]

2023

The Fast Johnson-Lindenstrauss Transform Is Even Faster
Co-Authors: Ora Nova Fandina, Kasper Green Larsen
Conference: [ICML 2023]

AdaBoost is not an Optimal Weak to Strong Learner
Co-Authors: Kasper Green Larsen, Martin Ritzert
Conference: [ICML 2023] - Oral (top 2.37% of submissions)

Barriers for Faster Dimensionality Reduction
Co-Authors: Ora Nova Fandina, Kasper Green Larsen
Conference: [STACS 2023]

Optimally Interpolating between Ex-Ante Fairness and Welfare
Co-Authors: Panagiotis Karras, Wenyu Ma, Nidhi Rathi, Chris Schwiegelshohn
Manuscript: lARXIVl