Work
  • Jan 2023 - Present
    EDF - Engineering Manager - Data & AI Platform
    ENERGY

    Mission: Directed cross-functional engineering teams (30+ engineers) delivering 50+ SaaS applications, APIs, and workflows (SAFe agile).

    Accomplishments:

    • Defined and implemented AI/MLOps best practices: versioning, testing, automation, monitoring, and reproducibility (feature store and model registry with AWS S3 Tables, real-time experiment tracking with MLflow).
    • Led research and integration of new AI/data technologies and frameworks into workflows (GenAI: Ollama code assistance, Lakehouse, LLMOps, agentic design patterns).
    • Created and promoted best practices and training programs to facilitate rapid adoption of AI technologies across data and business units.
    • Defined tech roadmaps for AI/Data initiatives.
  • Jan 2020 - Jan 2023
    EDF - AI Data Platform Architect
    ENERGY

    Mission: Led AI/Data platform design and build on OpenShift and AWS, enabling rapid delivery of production AI/Data applications (SAFe agile).

    Accomplishments:

    • Led the data platform build with AWS S3 (storage), Airflow (orchestration), Kafka (streaming, event-driven architecture), Redis (caching), dbt, and Gradio/Streamlit/FastAPI (APIs and interactive data apps).
    • Delivered reusable, 12-factor-compliant blueprints that standardized software delivery across teams.
    • Supervised DevOps implementation, with IaC (Terraform) and CI/CD pipelines (Jenkins, GitLab CI, Helm).
    • Established an observability stack (ELK, Prometheus, Grafana) covering the full application lifecycle, ensuring reliability and monitoring.

    Result: The platform now hosts over 50 applications (web apps, jobs, APIs), with dev squads delivering agile, secure solutions to business units (time to market reduced by a factor of 9).

    Day-to-day tasks:

    • Translated business requirements into secure, scalable, and reliable solution designs.
    • Developed proofs of concept and pilots, implemented projects, and communicated new features to stakeholders.
    • Documented and shared best-practice knowledge for new solutions.
    • Led architectural design sessions and provided technical guidance throughout the project lifecycle.
  • Oct 2017 - Jan 2020
    EDF - Software Engineer
    Big Data Team

    Member of the big data team responsible for the Hadoop ecosystem.

    Mission: Designed and implemented big data pipelines to support real-time analytics and business intelligence initiatives.

    Accomplishments:

    • 360° Customer View: Created a coherent, up-to-date view of the customer experience. Extracted and transformed data with Apache Spark, then loaded it into Apache Phoenix, a high-performance relational layer over HBase, for low-latency queries.

    • Monitoring Application: Centralized logs of Hadoop processes into Elasticsearch (built a log4j-based module integrated into Oozie workflows) and created a Kibana dashboard with KPIs. Provided stakeholders with actionable insights for operational decision-making.

    Day-to-day tasks:

    • Collaborated with cross-functional teams to define data requirements and architecture.
    • Designed and developed ELT data pipelines from HDFS to Hive, HBase, and Elasticsearch, using Apache Spark for transformations.
    • Monitored data pipelines and managed incidents to ensure data integrity and availability.
  • Feb 2017 - Sep 2017
    SANOFI - Software Engineer
    Supply chain Team

    Improved the performance of Sanofi's primary distribution center (working capital requirement, WCR), focusing on inventory levels and delivery lead times.

    • Built ETL data pipelines.
    • Developed reporting systems (KPIs: QlikView, SAP BusinessObjects).
    • Demand forecasting: built a machine learning model for multivariate time series (gradient boosting, Python).
  • Nov 2015 - Jul 2016
    LIP6 - Research Engineer
    Multi-Agent Systems Team

    Built an intelligent procurement model: used Distributed Artificial Intelligence (multi-agent systems) to address limitations (bullwhip effect) of traditional supply chain coordination approaches (VMI, CPFR).

    • Multi-agent system modeling (AUML, Gaia).
    • Algorithms: bargaining, plus a buyback and revenue-sharing combination (JADE: Java Agent DEvelopment Framework).