Summary
Hello and welcome to my portfolio! On this page, you will find details about my professional experience since the beginning of my career.
With more than 8 years of experience, I have strong expertise in cloud architecture and cloud development. Across various assignments I have worked mainly with Python, though I remain open to other languages. Passionate about new technologies, I keep up with the latest trends and am currently working on agentic (AI-agent-based) applications. Discovering serverless architectures during an assignment was a turning point, and I have been building my personal applications that way ever since. I am interested in any Back-End-oriented opportunity, ideally with a Cloud component.
Enjoy your visit!
My experience
Professional Experience
Solution Architect
March 2024 - Present
Valeo - Créteil
Within the InnoCore innovation team at Valeo Brain, I lead the Cloud evolution of the existing solution and contribute to designing new innovative features. My activities focus mainly on cloud-native application development, but can be quite varied (frontend, DevOps, live demos, filming, etc.).
Development and demonstration of the AssistXR solution
- Cloud architecture design (AWS Serverless: Lambda, API Gateway, Cognito, DynamoDB, S3), sketched after this list
- Frontend development in ReactJS with ThreeJS integration for a Digital Twin
- IoT pipeline setup using AWS IoT Core, Shadow, Kinesis Data Streams
- Solution presentations at CES Las Vegas, Paris Motor Show, AWS Summit
- Development of a demonstrative monitoring solution in Python
- Winner of the Innovation Automobile Awards Prize
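To illustrate the serverless pattern above, here is a minimal sketch of an API Gateway-backed Lambda handler persisting an item to DynamoDB with boto3. The table name, route, and fields are hypothetical placeholders, not the actual AssistXR implementation.

```python
# Minimal sketch of an API Gateway -> Lambda -> DynamoDB flow (illustrative only).
# The table name and payload fields are hypothetical, not the real AssistXR schema.
import json
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "assistxr-demo-items"))


def lambda_handler(event, context):
    """Handle a POST from API Gateway and store the body in DynamoDB."""
    body = json.loads(event.get("body") or "{}")
    item = {
        "id": str(uuid.uuid4()),
        "name": str(body.get("name", "unknown")),
        "description": str(body.get("description", "")),
    }
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item["id"]}),
    }
```

In a setup like the one listed, API Gateway handles the routing and Cognito authenticates the caller before the request reaches the handler.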
Implementation of DevSecOps best practices
- CI/CD with GitHub Actions
- Code analysis and security with SonarQube and BlackDuck
- Infrastructure as Code with Terraform
Ongoing project
- AI agent orchestration with CrewAI/ADK, using AWS Bedrock and Google Gemini models (a minimal sketch follows)
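As a hint of what this looks like in practice, here is a minimal CrewAI sketch with two agents and two chained tasks. The roles, goals, and tasks are illustrative placeholders, and the model wiring (Bedrock or Gemini) is omitted; CrewAI's default LLM configuration is assumed.

```python
# Illustrative CrewAI sketch: one researcher agent and one writer agent
# collaborating on a small task pipeline. Roles, goals, and tasks are hypothetical.
from crewai import Agent, Task, Crew

researcher = Agent(
    role="Automotive researcher",
    goal="Collect key facts about a vehicle feature",
    backstory="You summarize technical material for demo scripts.",
)

writer = Agent(
    role="Demo script writer",
    goal="Turn research notes into a short demo narrative",
    backstory="You write concise, presentation-ready text.",
)

research_task = Task(
    description="List three talking points about the AssistXR demo.",
    expected_output="A bullet list of three talking points.",
    agent=researcher,
)

writing_task = Task(
    description="Write a 100-word demo pitch from the talking points.",
    expected_output="A single 100-word paragraph.",
    agent=writer,
)

crew = Crew(agents=[researcher, writer], tasks=[research_task, writing_task])

if __name__ == "__main__":
    print(crew.kickoff())
```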
Back-End Python Developer
November 2021 - March 2024
Renault - Guyancourt
Within Renault’s Resim ADAS R&D team, I was responsible for the continued development and maintenance of existing services. The Resim ADAS team virtually resimulates thousands of kilometers of driving data, which involves processing several petabytes (thousands of TB) of data, with the end goal of improving the behavior of autonomous vehicles. These computations are distributed across a network of several dozen HPC clusters.
Development of data injection software from scratch
- Development with an MVC architecture
- Rewrite of the software’s graphical interface using Tkinter
- Implementation of parallel injection using multiprocessing (a minimal sketch follows this list)
- Storing injection statuses in a database
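A minimal sketch of the parallel-injection idea with Python's multiprocessing: a pool of worker processes injects files and returns one status record per file, ready to be stored in a database. The function and field names are illustrative, not the tool's actual code.

```python
# Illustrative sketch of parallel injection with multiprocessing.
# inject_file and the status structure are hypothetical placeholders.
from multiprocessing import Pool


def inject_file(path: str) -> dict:
    """Inject one file and return a status record (stubbed here)."""
    try:
        # ... open, validate and upload the file ...
        return {"path": path, "status": "DONE"}
    except Exception as exc:  # keep the worker alive on failure
        return {"path": path, "status": "FAILED", "error": str(exc)}


def inject_all(paths: list[str], workers: int = 8) -> list[dict]:
    """Run injections in parallel and collect statuses (e.g. to persist in a DB)."""
    with Pool(processes=workers) as pool:
        return pool.map(inject_file, paths)


if __name__ == "__main__":
    for record in inject_all(["run_001.dat", "run_002.dat", "run_003.dat"]):
        print(record)
```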
Maintenance of internal packages
- Automatic compression/decompression of files before upload/download
- Various bug fixes
Development of the geolocation labeling process
- Reading and formatting collected GPS data
- Detecting traversed Points Of Interest
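A minimal sketch of the POI-detection step: compute the haversine distance between each GPS sample and each Point Of Interest, and flag the POI as traversed when a sample falls within its radius. The data layout and radii are assumptions for illustration.

```python
# Illustrative POI detection: a POI is "traversed" if any GPS sample
# falls within its radius. Data structures and radii are hypothetical.
from math import asin, cos, radians, sin, sqrt


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6_371_000 * asin(sqrt(a))


def traversed_pois(gps_track, pois):
    """Return the names of POIs crossed by the GPS track."""
    hits = set()
    for lat, lon in gps_track:
        for poi in pois:
            if haversine_m(lat, lon, poi["lat"], poi["lon"]) <= poi["radius_m"]:
                hits.add(poi["name"])
    return hits


track = [(48.8584, 2.2945), (48.8606, 2.3376)]
pois = [{"name": "Eiffel Tower", "lat": 48.8584, "lon": 2.2945, "radius_m": 200}]
print(traversed_pois(track, pois))  # {'Eiffel Tower'}
```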
AWS DevOps Engineer
September 2021 - November 2021
Danil.io - Paris
Danil is a SaaS platform for data visualization of chatbot performance. The application is hosted on AWS using a serverless approach.
I joined the team to implement a complete CI/CD pipeline.
Setup of a CI/CD pipeline from GitLab to AWS Cloud
- CI: Validation of PEP8 standards
- Build of the application and storage of artifacts in S3
- Automatic deployment of the different serverless services
Writing the project infrastructure as code with SAM (Serverless Application Model)
- API Gateway configuration, Cognito (authentication)
- Lambdas and layers, with management of Development and Production environments.
Serverless Back-End Developer
July 2020 - February 2021
Las & Co
Las & Co is a ride-hailing company with a fleet of several hundred vehicles. To manage vehicle allocation to drivers, a web application was developed.
From-scratch setup of the app
- Writing the IaC file defining the project architecture
- API Gateway configuration, CORS management
- OAuth2 authentication setup with Cognito
- Writing the DatabaseManager connected to DynamoDB
- Development of CRUD lambdas for entities: Users, Drivers, Vehicles, Events, etc.
- Data validation using the Cerberus package (a validation sketch follows this list)
- Attachment manager to upload various file types (photos, admin documents, contracts, etc.) and store them in AWS S3
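To illustrate the validation step, here is a minimal Cerberus sketch as it could be used inside a CRUD lambda before writing to DynamoDB. The schema and its fields are hypothetical, not the actual Las & Co data model.

```python
# Illustrative Cerberus validation for a "create driver" payload.
# Schema fields are hypothetical, not the actual Las & Co entities.
from cerberus import Validator

driver_schema = {
    "first_name": {"type": "string", "required": True, "empty": False},
    "last_name": {"type": "string", "required": True, "empty": False},
    "phone": {"type": "string", "regex": r"^\+?[0-9 ]{6,15}$"},
    "status": {"type": "string", "allowed": ["PROSPECT", "ACTIVE", "INACTIVE"]},
}

validator = Validator(driver_schema)


def validate_driver(payload: dict) -> dict:
    """Return the validated document or raise with Cerberus error details."""
    if not validator.validate(payload):
        raise ValueError(f"Invalid driver payload: {validator.errors}")
    return validator.document


if __name__ == "__main__":
    print(validate_driver({"first_name": "Jane", "last_name": "Doe", "status": "PROSPECT"}))
```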
Implementation of endpoints for business process (workflow) management
- Start of activity: driver prospecting, signature appointment, contract signing, vehicle assignment
- Claims management: declaration, enrichment with damage photos, assignment of a partner garage, repair tracking and validation
- Payroll: payroll declaration, amount validation by management
- End of activity: leave request or end of activity by the driver, management validation, change of assigned vehicle status
- Task creation and assignment to a Las&Co employee at each workflow step
Application KPI calculations via a dedicated endpoint (Admin dashboard)
- Number of drivers (total and by status), number of vehicles (total and by status), number of available vehicles
- Sum of drivers’ balances; top 10 drivers with the highest/lowest balance
- Number of open/closed tasks per day
- Revenue generated and expenses over a defined period
Creation of automated notification crons with CloudWatch
- Status updates in case of delays for maintenance or inspection
- Weekly notifications for payroll launch, insurance renewals, inspection requests after breakdowns, etc.
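A minimal sketch of the kind of handler such a cron can trigger: a CloudWatch (EventBridge) rule invokes a Lambda that scans for overdue items and publishes a notification through SNS. Table, topic, and attribute names are hypothetical.

```python
# Illustrative scheduled handler (triggered by a CloudWatch/EventBridge rule):
# flag vehicles whose inspection date has passed and notify via SNS.
# Table name, SNS topic ARN, and attribute names are hypothetical.
import datetime
import os

import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

TABLE_NAME = os.environ.get("VEHICLES_TABLE", "vehicles")
TOPIC_ARN = os.environ.get("ALERTS_TOPIC_ARN", "arn:aws:sns:eu-west-1:123456789012:alerts")


def lambda_handler(event, context):
    today = datetime.date.today().isoformat()
    table = dynamodb.Table(TABLE_NAME)
    overdue = [
        item for item in table.scan().get("Items", [])
        if item.get("next_inspection_date", "9999-12-31") < today
    ]
    for vehicle in overdue:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject="Inspection overdue",
            Message=f"Vehicle {vehicle.get('plate', '?')} is overdue for inspection.",
        )
    return {"notified": len(overdue)}
```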
Tooling, quality, and project management
- Use of AWS CI/CD services: CodePipeline pipelines calling CodeCommit/CodeBuild/CodeDeploy scripts written by the DevOps team
- Writing product and API documentation
- Facilitating daily meetings, assigning Jira tasks to backend developers
- Support and training on AWS for two other backend developers
Back-End Python Developer
February 2019 - April 2020
CrossQuantum - Levallois-Perret
CrossQuantum is a startup acquired by SwissLife, developing a personal wealth aggregation mobile app: LaFinBox.
Management of specialized news articles for LaFinBox users
- Backoffice setup with Python / Flask / MongoDB
- APIs for creating, editing, and publishing articles (~1 per day); a minimal endpoint sketch follows this list
- Publication permissions management based on user profile
- Bot importing ~10 specialized articles from an SFTP server
- XML parsing and cleaning (removing ads, formatting)
- Database insertion so the imported articles can be proposed for publication alongside manually created ones
- Writing unit tests and Dockerfiles
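As an illustration of the backoffice stack, here is a minimal Flask + MongoDB sketch for an article-creation endpoint. The route, collection, and fields are hypothetical, not the real LaFinBox backoffice.

```python
# Illustrative Flask + MongoDB sketch for a "create article" endpoint.
# The route, database/collection names, and fields are hypothetical.
import datetime

from flask import Flask, jsonify, request
from pymongo import MongoClient

app = Flask(__name__)
articles = MongoClient("mongodb://localhost:27017")["backoffice"]["articles"]


@app.post("/articles")
def create_article():
    data = request.get_json(force=True)
    doc = {
        "title": data["title"],
        "body": data.get("body", ""),
        "status": "DRAFT",  # published later by an editor
        "created_at": datetime.datetime.utcnow(),
    }
    result = articles.insert_one(doc)
    return jsonify({"id": str(result.inserted_id)}), 201


if __name__ == "__main__":
    app.run(debug=True)
```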
Logkeeper: monitoring and tracking of daily internal processing
- Development of a Python logging library, integrated into all batch jobs (a minimal wrapper sketch follows this list)
- Bot aggregating logs and generating KPIs: success, failure, in progress, etc.
- Daily email sending to POs and application owners
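A minimal sketch of the kind of helper such a logging library can expose: a decorator that records the start, success, or failure of each batch job so a separate bot can aggregate the records into KPIs. The record format and storage backend are assumptions.

```python
# Illustrative job-status logging decorator (Logkeeper-style sketch).
# The record format and the storage backend (here a plain logger) are assumptions.
import functools
import json
import logging
import time

logger = logging.getLogger("logkeeper")
logging.basicConfig(level=logging.INFO, format="%(message)s")


def tracked_job(name: str):
    """Log a structured start/success/failure record around a batch job."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            start = time.time()
            logger.info(json.dumps({"job": name, "status": "STARTED"}))
            try:
                result = func(*args, **kwargs)
            except Exception as exc:
                logger.error(json.dumps({"job": name, "status": "FAILED", "error": str(exc)}))
                raise
            logger.info(json.dumps(
                {"job": name, "status": "SUCCESS", "duration_s": round(time.time() - start, 2)}
            ))
            return result
        return wrapper
    return decorator


@tracked_job("daily_balance_sync")
def daily_balance_sync():
    """Placeholder batch job."""
    return 42


if __name__ == "__main__":
    daily_balance_sync()
```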
Client data export for partners (insurance, investment banking)
- Python scripts and complex MongoDB queries (aggregation pipelines); an example pipeline follows this list
- KPI: number of users, views of partner promo articles, clicks on partner redirect links, etc.
- Daily automated sending to partners by email
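An example of the aggregation-pipeline style used for these exports; the collection, fields, and KPI below are hypothetical placeholders.

```python
# Illustrative MongoDB aggregation: count clicks on partner redirect links
# per partner over the last day. Collection and field names are hypothetical.
import datetime

from pymongo import MongoClient

events = MongoClient("mongodb://localhost:27017")["analytics"]["events"]

since = datetime.datetime.utcnow() - datetime.timedelta(days=1)
pipeline = [
    {"$match": {"type": "partner_link_click", "timestamp": {"$gte": since}}},
    {"$group": {"_id": "$partner_id", "clicks": {"$sum": 1}}},
    {"$sort": {"clicks": -1}},
]

for row in events.aggregate(pipeline):
    print(row["_id"], row["clicks"])
```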
CptFlam: balance alert manager
- Analysis of all accounts for all users
- Alerts based on user-defined thresholds: push notifications, emails
- Service refactoring to reduce processing time: 3h → 1h daily
- Service maintenance and creation of new alerts
Data visualization: platform usage statistics
- Grafana dashboards showing usage KPIs: connected users, web/mobile/OS, activity volume, etc.
- Maintenance of data aggregation pipelines: Python, InfluxDB
Miscellaneous
- Updating connectors to partner banking APIs as they evolved
- Pair programming sessions
- Agile methodology
Data Scientist (Internship)
February 2018 - July 2018
M6 Publicité - Neuilly-Sur-Seine
Within the IT department, I joined a Data Science team to run an audience prediction project for advertising slots using Machine Learning algorithms.
- Data extraction from a copy of production databases (2-year ad audience history) using SQL
- Cleaning and standardization using Python / Pandas
- Unsupervised analysis in R to better understand data and identify influential variables
- Supervised analysis (decision trees, neural networks) with Scikit-learn and TensorFlow to forecast audiences for the next 3 weeks
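A minimal scikit-learn sketch of the supervised part: fit a decision tree on historical slot features and forecast the audience of an upcoming slot. The features and data below are synthetic placeholders, not M6 data.

```python
# Illustrative audience-forecast sketch with a decision tree.
# Features and the synthetic data below are placeholders, not M6 data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Toy features: [hour_of_day, day_of_week, slot_duration_s]
X = rng.uniform([6, 0, 10], [23, 6, 60], size=(500, 3))
# Toy target: audience share, loosely driven by the hour of day
y = 0.5 * X[:, 0] + rng.normal(0, 1, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeRegressor(max_depth=5, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out slots:", round(model.score(X_test, y_test), 3))
print("Forecast for a 20:00 Sunday 30s slot:", model.predict([[20, 6, 30]])[0])
```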
Back-End Java Developer (Internship)
April 2017 - August 2017
WeSmart - Brussels
WeSmart is a Brussels-based startup offering an energy monitoring solution for professionals, based on sensors and an aggregation platform.
- Development of a feature to import Excel consumption files into the platform using an Excel-to-object mapping library
- Development of a consumption forecasting module using Weka (Machine Learning library)
- Extension of platform documentation, writing API docs in YAML with Swagger
- Maintenance of existing services in Java Spring
- Writing Dockerfiles
Education
General Engineering Degree - Data Science option
2013 - 2018
Ecole Supérieure d'Ingénieurs Léonard de Vinci, Paris La Défense
- Software Development: Python, Java, C#, JavaScript
- Databases: MySQL, MongoDB, Elasticsearch
- Data Science: Machine Learning, Data analysis
Scientific Baccalaureate
2013
Lycée Joliot Curie, Nanterre
With honors
My Technical Skills
Through my experience, I have worked with a wide range of technologies that helped me become more versatile.
Programming Languages
- Python
Frameworks / Libraries
- Flask
Tools
- Git
Infrastructure / Cloud
- AWS
Databases
- MongoDB
Concepts
- MVC Architecture
Certifications
- January 2018: TOEIC 875/990
- April 2021: AWS Cloud Practitioner 857/1000
- March 2023: AWS Developer Associate 849/1000
- March 2023: AWS Solutions Architect Associate 783/1000
Contact
I am open to any opportunity that matches my profile. Feel free to contact me by email or via the form below.
Location
Paris, La Défense
Phone number
+33 6 78 90 12 34