8 Projects completed
A data scientist on a mission to explore the world and tick off a carefully curated bucket list—executing goals without hitting any syntax errors.
Paris, France
AI services integration & quality monitoring: Integrated multiple AI services into the existing quality monitoring SaaS tool, delivering evaluation and analysis across all client interaction channels: chat, email, and audio calls.
LLM-agnostic architecture: Migrated the application to an LLM-agnostic architecture for flexibility and vendor independence.
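As an illustration of what an LLM-agnostic architecture can look like, the sketch below defines a provider-neutral interface with swappable adapters. The class and method names are assumptions for the example, not the project's actual API; the `EchoClient` stand-in keeps the sketch runnable without any vendor SDK.

```python
from abc import ABC, abstractmethod

class LLMClient(ABC):
    """Provider-agnostic interface; concrete adapters wrap each vendor SDK."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...

class EchoClient(LLMClient):
    """Stand-in adapter so the sketch runs without a real vendor client."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

def evaluate(client: LLMClient, prompt: str) -> str:
    # Application code depends only on the interface, never on a vendor,
    # so swapping providers means adding an adapter, not rewriting callers.
    return client.complete(prompt)
```

Swapping vendors then only requires a new adapter class implementing `complete`.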
GDPR / EU AI Act & responsible AI: Ensured compliance with GDPR and the EU AI Act through data protection measures, access controls, and responsible AI practices, including PII redaction, secure data processing, and audit logging for LLM operations.
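A minimal sketch of regex-based PII redaction of the kind applied before text reaches an LLM. The patterns and labels below are simplified assumptions for illustration; a production redactor would use a dedicated PII-detection library with locale-aware rules.

```python
import re

# Illustrative, simplified patterns (assumptions, not the real rule set).
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d .-]{8,}\d"),
}

def redact_pii(text: str) -> str:
    """Replace detected PII spans with typed placeholders before LLM calls."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[{label}]", text)
    return text
```

Typed placeholders (rather than blanking) preserve enough context for downstream analysis while keeping the raw identifier out of the prompt.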
Benchmarking framework: Developed and integrated a benchmarking framework to analyze and evaluate the quality of AI outputs.
Root cause analysis: Developed a root cause analysis model for client interactions to identify and explain quality issues.
LLM security & input guardrails: Strengthened system security by implementing prompt injection detection and input guardrails for LLM-based services.
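An input guardrail against prompt injection can be sketched as a heuristic phrase check; the phrase list below is an assumption for illustration, and real systems typically layer such checks under a trained classifier.

```python
# Illustrative deny-list; a production guardrail would combine pattern
# checks with a classifier model. These phrases are example assumptions.
SUSPICIOUS_PHRASES = (
    "ignore previous instructions",
    "ignore all previous instructions",
    "disregard the system prompt",
    "you are now",
)

def passes_input_guardrail(user_input: str) -> bool:
    """Return False when the input looks like a prompt-injection attempt."""
    lowered = user_input.lower()
    return not any(phrase in lowered for phrase in SUSPICIOUS_PHRASES)
```

Inputs failing the check can be rejected or routed to review before any LLM call is made.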
MCP server & RAG knowledge agent: Developed an MCP server exposing all platform APIs and built a RAG-based agent for internal knowledge retrieval and operational efficiency.
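The retrieval step at the heart of a RAG agent can be sketched with a toy term-overlap ranker; the actual agent would use embedding similarity over an indexed knowledge base, so everything below is a simplified stand-in.

```python
def retrieve(query: str, documents: list[str], top_k: int = 2) -> list[str]:
    """Rank documents by term overlap with the query, a toy stand-in
    for the embedding-based retrieval a real RAG agent would use."""
    q_terms = set(query.lower().split())
    return sorted(
        documents,
        key=lambda doc: len(q_terms & set(doc.lower().split())),
        reverse=True,
    )[:top_k]
```

The retrieved passages are then injected into the LLM prompt as grounding context for the answer.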
Paris, France
Computable Contracts platform entry: Developed a production-ready module for automatic parsing and modeling of complex insurance contracts, forming the entry point for AXA's AI-powered Computable Contracts platform.
Parser benchmarking & GPT-4o: Benchmarked and evaluated over 25 document parsers, leading to the adoption of a state-of-the-art multimodal LLM approach (GPT-4o) for robust handling of unstructured legal documents and nested tables.
Clean Architecture & scalability: Refactored the workflow into a modular codebase using Clean Architecture, enabling maintainable, testable, and scalable development for future enterprise deployment.
Graph databases & contract intelligence: Used graph databases (Kuzu, Neo4j) to model relationships between contract entities, enabling relationship-aware queries and richer contract analysis.
Evaluation pipeline & reproducibility: Engineered a comprehensive evaluation pipeline and custom metrics (NID Score, TEDS Score); documented all research and experiments in a GitHub wiki to ensure reproducibility and support cross-team learning.
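To give a flavor of the edit-distance family of metrics mentioned above, here is a generic normalized Levenshtein similarity. This is an assumption about the general shape of such a score, not the project's exact NID or TEDS definition.

```python
def normalized_edit_similarity(a: str, b: str) -> float:
    """1 - Levenshtein(a, b) / max(len(a), len(b)); 1.0 means identical."""
    if not a and not b:
        return 1.0
    m, n = len(a), len(b)
    prev = list(range(n + 1))  # DP row for the empty prefix of a
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            # Deletion, insertion, or substitution/match.
            curr[j] = min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + cost)
        prev = curr
    return 1.0 - prev[n] / max(m, n)
```

Normalizing by the longer string's length makes scores comparable across documents of very different sizes.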
Containers, observability & CI/CD: Containerized all project components for reliable deployment and introduced observability with Langfuse to optimize LLM prompt engineering, supported by automated CI/CD pipelines and team-based code reviews.
Agile delivery & knowledge transfer: Collaborated daily in an agile team environment, delivered regular demos, and contributed to rigorous documentation and knowledge transfer practices for project continuity.
Churn Model: Reduced customer turnover by 15% by leveraging telecom data and predictive modeling to identify at-risk customers and enhance retention strategies.
Upsell/Cross-Sell Model: Boosted customer engagement by 25% and increased average order value by 10% through a market basket analysis-based recommendation system for telecom operators.
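The core of market basket analysis is computing support and confidence for item-pair rules. The sketch below is a minimal pure-Python version under that assumption; a real pipeline would use Apriori or FP-Growth over much larger itemsets.

```python
from collections import Counter
from itertools import combinations

def association_rules(baskets, min_support=0.3):
    """Support and confidence for item-pair rules from transaction baskets.

    Returns {(antecedent, consequent): (support, confidence)}.
    """
    n = len(baskets)
    item_counts = Counter()
    pair_counts = Counter()
    for basket in baskets:
        items = set(basket)
        item_counts.update(items)
        pair_counts.update(combinations(sorted(items), 2))
    rules = {}
    for (a, b), count in pair_counts.items():
        support = count / n  # fraction of baskets containing both items
        if support < min_support:
            continue
        rules[(a, b)] = (support, count / item_counts[a])
        rules[(b, a)] = (support, count / item_counts[b])
    return rules
```

High-confidence rules ("customers who buy A also buy B") drive the upsell and cross-sell recommendations.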
Pack Similarity Model: Built and deployed a pack similarity model to identify and merge similar product packs, optimizing offerings and supporting new pack introductions.
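Pack similarity typically reduces to comparing feature vectors; a minimal sketch using cosine similarity is shown below. The feature names mentioned in the comment are illustrative assumptions, not the real pack schema.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two pack feature vectors
    (e.g. price, data volume, voice minutes; illustrative features only)."""
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(y * y for y in v))
    return dot / norm if norm else 0.0
```

Pairs scoring above a chosen threshold become merge candidates, and the same scoring can position a proposed new pack against the existing catalogue.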
Next Purchase Date Model: Built a time series forecasting pipeline to predict next-purchase dates, engineering temporal features (recency, frequency, monetary) and tuning an XGBoost model achieving 62% accuracy and enabling proactive customer engagement.
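The recency/frequency/monetary feature engineering step can be sketched as follows; the function name and record shape are assumptions for the example, and in the described pipeline such features would feed an XGBoost model rather than be used directly.

```python
from datetime import date

def rfm_features(purchases, as_of):
    """Derive recency/frequency/monetary features from (date, amount) records."""
    dates = [d for d, _ in purchases]
    return {
        "recency_days": (as_of - max(dates)).days,  # days since last purchase
        "frequency": len(purchases),                # total purchase count
        "monetary": sum(amount for _, amount in purchases),  # total spend
    }
```

In practice these features would be computed per customer over a sliding window before model training.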
Cross-Functional Collaboration: Worked closely with global telecom clients and internal teams to align analytics solutions with business needs.
Hands-on experience with Linux machines.
I have completed a Master's in Computer Science with a specialization in Data Science and Analysis. The program went beyond foundational knowledge, offering advanced expertise in machine learning, programming, and data engineering. With a curriculum encompassing cutting-edge topics such as neural networks, cloud computing, and big data infrastructure, I developed both horizontal breadth and vertical depth in technical and analytical skills. This training has equipped me to design innovative algorithms, optimize data-driven solutions, and extract actionable insights, positioning me to thrive in the rapidly evolving landscapes of technology and business.
Four splendid years of my academic life, filled with its most challenging and memorable moments.
Two of the best years of my academic life, where I truly found myself.
Having completed the Google Data Analytics Professional Certificate, I've gained proficiency in spreadsheets, SQL, and data processing techniques. This has empowered me to effectively prepare, process, analyze, and share data for informed decision-making.
Gained foundational understanding of AWS Cloud concepts, services, security, architecture, pricing, and support.
Acquired knowledge of cloud concepts, Azure services, Azure workloads, security, privacy, pricing, and support.
LLMs, Gemini, Mistral, NLP, Prompt Engineering, TensorFlow, Transformers, Scikit-learn, Hugging Face, Ollama
Docker, Git, FastAPI, Flask, Streamlit, Airflow, MLOps, CI/CD, Langfuse, Poetry
Pandas, NumPy, SciPy, Dask, Vaex
Python, SQL
GCP, Vertex AI, BigQuery, Linux, Firestore, Jupyter Notebooks
Matplotlib, Seaborn
RabbitMQ
Neo4j, Kuzu
Agile / Scrum
Served as General Secretary of Computer Science Association
An AI engineer with 3 years of experience in predictive modeling, recommendation systems, LLMs, agentic AI, and knowledge graphs. Open to all kinds of challenges—learning is inevitable.