Education


  • MSc (M2, research track) - MVA: École Normale Supérieure Paris-Saclay – Paris, 2025–2026 🇫🇷
    • Major: Math, Learning, Vision
  • MEng (French engineering diploma): ENPC – Paris, 2022–2025 🇫🇷
    • Major: Applied Math and Machine Learning
    • Distinction: Government excellence scholarship
  • Prépa (Undergraduate): Esprit Prépa – Tunis, 2020–2022 🇹🇳
    • Major: Math & Physics
    • Distinction: Valedictorian
  • High School: Pioneer School – Gafsa, 2016–2020 🇹🇳
    • Distinction: Valedictorian

Experience


  • Research Intern: Feb 2025 – present 🇨🇭
    • Lab: Machine Learning and Optimization (MLO) at EPFL
    • Supervisor: Prof. Martin Jaggi
    • Focus: efficient pre-training, Transformers, knowledge distillation, reasoning, GRPO
    • Contribution (so far): a paper on robust transformer architectures, published at NeurIPS 2025
  • Swiss AI Initiative Member: Feb 2025 – present 🇨🇭
    • Definition: Joint initiative between EPFL and ETHZ, powered by the CSCS Alps supercomputer
    • Role: Core member of the LLM team
    • Focus: mainly LLM pre-training
    • Contribution (so far): Apertus – fully open-source, compliant, and multilingual LLMs (8B, 70B)
  • Research Intern: Jul 2024 – Jan 2025 🇦🇪
    • Lab: AI Theory team at the Technology Innovation Institute (TII)
    • Supervisor: Dr. Mohamed El Amine Seddik
    • Focus: knowledge distillation, scalable optimization and parametrization, LLM pre-training
    • Contribution: Falcon3 family of open-weight LMs (1B, 3B, 7B, 10B), demonstrating efficient transfer learning (SOTA performance on many tasks)
      Accompanying technical report undisclosed.

More (CV)


Download my PDF resume (Dec 2025)