Profile 👋
I’m a final-year master’s student at École des Ponts · IP Paris, majoring in applied math.
I’m currently in the MVA M2 program (Mathematics, Vision, Learning) at ENS Paris-Saclay.
I enjoy both theoretical and experimental work, and my interests include:
Learning from experience (RL) · Learning representations · Transfer learning.
My recent work has focused on:
- Pre-training (Apertus) and distillation (Falcon)
- Transformer architectures (FOG at NeurIPS’25)
- Overthinking in Reasoning Language Models (Terminator at ICLR’26)
- RL: the exploration-exploitation trade-off in GRPO
Since early 2025, I have been a research intern at the MLO lab at EPFL, supervised by Prof. Martin Jaggi. I am also part of the Swiss AI Initiative core LLM team, an open-source collaboration between EPFL, ETHZ, and CSCS aiming to advance the foundations of LLMs.
Previously, I interned with the AI theory team at TII (UAE) and was a member of the Falcon LLM team.
🛎️ News 🛎️:
- [Mar 2026] Our paper on tackling overthinking in RLMs is finally out.
- [Nov 2025] Excited to be presenting our poster on FOG at NeurIPS in Paris on November 25th.
- [Sept 2025] The Apertus family of fully open LLMs is released: open-source FTW!
- [Dec 2024] The Falcon3 family of open models is out: distillation FTW!
