Hello, I'm Adrian Hayler

I am a Research Scientist at Prior Labs working on Tabular Foundation Models.

Before that, I completed an MSc in Advanced Computer Science at the University of Oxford and a BSc in Mathematics at the Technical University of Munich (TUM). I was fortunate to conduct research under some amazing mentors, namely Daniel Cremers, Jakob Foerster, Michael Bronstein, Christian Rupprecht, and Ismail Ilkan Ceylan. Previously, I worked and interned at QuantCo and Jane Street.


News

  • December 2025: We have released DiscoBench, an open-ended benchmark for algorithm discovery.
  • November 2025: The TabPFN-2.5 model report is out on arXiv! It also received a spotlight at AITD@EurIPS 2025.
  • September 2025: Our paper ‘Bringing Graphs to the Table: Zero-shot Node Classification via Tabular Foundation Models’ has been accepted as an Oral presentation at NPGML@NeurIPS 2025!
  • January 2024: S4C has been selected as a spotlight paper at 3DV 2024!
  • November 2023: We have released the code for S4C! We also include the predictions of other state-of-the-art methods on the SSCBench KITTI-360 dataset.
  • October 2023: Happy to announce that our paper ‘S4C: Self-Supervised Semantic Scene Completion with Neural Fields’ got accepted at 3DV 2024!

Publications

DiscoBench: An Open-Ended Benchmark For Algorithm Discovery

TL;DR: DiscoBench is an open-ended framework for evaluating automated algorithm discovery, e.g. via AI research agent systems. It features a modular setup, an emphasis on discovering algorithms that transfer, and a large diversity of tasks. We hope DiscoBench helps drive the frontier of research in algorithm discovery by providing a large-scale, open-ended landscape for evaluating AI research agents!

Blog Post | Code


TabPFN-2.5: Advancing the State of the Art in Tabular Foundation Models

arXiv, 2025 (also AITD@EurIPS 2025 Spotlight)

TL;DR: TabPFN-2.5 scales the number of cells TabPFN can process by 20x and significantly improves both performance and runtime. At the time of writing, TabPFN-2.5 tops TabArena, the industry-standard benchmark for tabular learning.

Model Report | Code


Bringing Graphs to the Table: Zero-shot Node Classification via Tabular Foundation Models

NPGML@NeurIPS Oral, 2025

TL;DR: Tabular Foundation Models can be used as In-Context Learners for node classification, outperforming existing Graph Foundation Models and end-to-end baselines by a significant margin.

Paper | Code


S4C: Self-Supervised Semantic Scene Completion with Neural Fields

3DV Spotlight, 2024

TL;DR: S4C is the first self-supervised approach to the Semantic Scene Completion task. It achieves close to state-of-the-art performance on the SSCBench KITTI-360 dataset.

Project Page | Paper | Code