I focus on three complementary service areas, shaped around pipeline reliability, analytical clarity, and automation.
Data Engineering on GCP
For teams that need to centralize fragmented data sources into pipelines that are reliable to operate.
- Connect APIs, files, ERP systems, or operational tools into a structured data flow
- Build ETL pipelines for ingestion, transformation, and loading
- Organize and optimize storage on BigQuery
- Orchestrate recurring jobs with Airflow (see the DAG sketch below)
Stack: GCP, BigQuery, Cloud Storage, Dataflow, Cloud Run, Airflow, Docker, Python, SQL
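
As a concrete illustration, here is a minimal sketch of the kind of DAG these pipelines are built from: a daily load of a Cloud Storage export into a BigQuery staging table, followed by a SQL transformation. All project, bucket, dataset, and table names are placeholders, and the tasks use the standard Google provider operators for Airflow 2.x.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
    GCSToBigQueryOperator,
)
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    # Load the day's raw CSV export from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-raw-exports",  # placeholder bucket
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example_project.staging.sales_raw",
        source_format="CSV",
        write_disposition="WRITE_TRUNCATE",
    )

    # Transform the staging data into a clean table the analytical layer reads.
    transform = BigQueryInsertJobOperator(
        task_id="transform_sales",
        configuration={
            "query": {
                "query": (
                    "SELECT order_id, customer_id, "
                    "CAST(amount AS NUMERIC) AS amount, "
                    "DATE(order_ts) AS order_date "
                    "FROM `example_project.staging.sales_raw`"
                ),
                "destinationTable": {
                    "projectId": "example_project",
                    "datasetId": "analytics",
                    "tableId": "sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```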
Analytics and Reporting Foundations
For teams that have data but still lack a clear and reliable analytical layer.
- Model SQL tables around business logic and KPI definitions (see the sketch below)
- Prepare reporting foundations for dashboards and recurring analysis
- Clarify definitions so indicators stay consistent over time
- Support operational, logistics, commercial, or marketing analysis
Stack: SQL, BigQuery, SQL Server, Power BI, Looker Studio, Python
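
A minimal sketch of how that analytical layer stays consistent: KPI definitions live in version-controlled SQL and are materialized into a dedicated reporting table that dashboards read from. Project, dataset, and column names here are placeholders; the client calls follow the google-cloud-bigquery Python API.

```python
from google.cloud import bigquery

client = bigquery.Client(project="example_project")  # placeholder project

# KPI definitions kept in one place, so "revenue" and "active customer"
# mean the same thing in every dashboard that reads this table.
KPI_SQL = """
SELECT
  DATE_TRUNC(order_date, MONTH)  AS month,
  SUM(amount)                    AS revenue,          -- gross, before refunds
  COUNT(DISTINCT customer_id)    AS active_customers  -- >= 1 order in month
FROM `example_project.analytics.sales`
GROUP BY month
"""

job_config = bigquery.QueryJobConfig(
    destination="example_project.reporting.monthly_kpis",
    write_disposition="WRITE_TRUNCATE",
)
client.query(KPI_SQL, job_config=job_config).result()  # blocks until rebuilt
```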
Automation for Data Workflows
For teams that spend too much time on repetitive updates, synchronizations, or manual checks.
- Automate recurring data tasks with Python and cloud services
- Reduce manual work in operational reporting flows
- Add monitoring and control points when needed (a freshness-check sketch follows below)
- Deliver workflows that are easier to maintain over time
Stack: Python, Cloud Run, BigQuery, SQL, lightweight orchestration
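
As an example of a lightweight control point, here is a sketch of a freshness check that could run on Cloud Run or any scheduler: it fails loudly when a reporting table has not been refreshed within its expected window. The project and table names are placeholders.

```python
import datetime
import logging

from google.cloud import bigquery

logging.basicConfig(level=logging.INFO)

MAX_AGE = datetime.timedelta(hours=26)  # daily table, with a small buffer
TABLE_ID = "example_project.reporting.monthly_kpis"  # placeholder name


def check_freshness(client: bigquery.Client) -> None:
    """Raise if the table's last modification is older than MAX_AGE."""
    table = client.get_table(TABLE_ID)  # table.modified is a UTC datetime
    age = datetime.datetime.now(datetime.timezone.utc) - table.modified
    if age > MAX_AGE:
        # In production this would notify a channel (email, Slack, ...) too.
        raise RuntimeError(f"{TABLE_ID} is stale: last updated {age} ago")
    logging.info("%s is fresh (updated %s ago)", TABLE_ID, age)


if __name__ == "__main__":
    check_freshness(bigquery.Client(project="example_project"))
```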
Engagement
| | |
| --- | --- |
| Availability | Immediate |
| Location | Paris, on-site or remote in France |
| Working languages | French, English, Spanish |
| Mission length | From one month |
Interested in discussing a mission? Get in touch →