Education
- M2 recherche – MVA: École Normale Supérieure Paris-Saclay – Paris, 2025–2026 🇫🇷
- Major: Math, Learning, Vision
- MEng (French engineering diploma): ENPC – Paris, 2022–2025 🇫🇷
- Major: Applied Math and Machine Learning
- Distinction: Government excellence scholarship
- Prépa (Undergraduate): Esprit Prépa – Tunis, 2020–2022 🇹🇳
- Major: Math & Physics
- Distinction: Valedictorian
- High School: Pioneer School – Gafsa, 2016–2020 🇹🇳
- Distinction: Valedictorian
Experience
- Research Intern: Feb 2025 – Present 🇨🇭
- Lab: Machine Learning and Optimization (MLO) at EPFL
- Supervisor: Prof. Martin Jaggi
- Focus: efficient pre-training, Transformers, knowledge distillation, reasoning, GRPO
- Contribution (so far): paper on robust transformer architectures, published at NeurIPS'25
- Swiss AI Initiative Member: Feb 2025 – Present 🇨🇭
- Definition: Joint initiative between EPFL and ETH Zurich, powered by the CSCS Alps supercomputer
- Role: Core member of the LLM team
- Focus: mainly pre-training
- Contribution (so far): Apertus – fully open-source, compliant, and multilingual LLMs (8B, 70B)
- Research Intern: July 2024 – Jan 2025 🇦🇪
- Lab: AI Theory team at the Technology Innovation Institute (TII)
- Supervisor: Dr. MEA Seddik
- Focus: Knowledge Distillation, scalable optimization and parametrization, LLM pre-training
- Contribution: Falcon3 family of open-weight LMs (1B, 3B, 7B, 10B), demonstrating the efficiency of transfer learning (SOTA performance on many tasks)
- Undisclosed technical report...
More (CV)
Download my PDF resume (Dec 2025)