What You’ll Be Doing
As a Senior Data Engineer, you will be responsible for designing, building, and optimizing scalable data solutions within a modern Azure lakehouse environment. You will play a leading role in establishing robust data foundations and work closely with data scientists and business stakeholders to turn data into actionable insights.
Your responsibilities include:
- Leading the design and implementation of scalable ETL/ELT pipelines on Databricks
- Working within a medallion architecture (Bronze, Silver, and Gold layers)
- Building reusable, metadata-driven pipeline frameworks
- Managing and optimizing Delta Lake and Azure Data Lake Storage Gen2
- Setting high standards for data quality, governance, and performance
- Implementing monitoring, alerting, and observability
- Developing APIs and backend services in Python (FastAPI, Flask, or Django)
- Delivering reusable, production-grade components for wider team usage
- Supporting lightweight front-end solutions (e.g., React or Angular) where needed
- Engineering and evolving an Azure-based lakehouse platform
- Using Terraform and Azure DevOps for IaC, CI/CD, and deployment automation
- Integrating data with ERP and external systems (experience with Microsoft Dynamics is a plus)
- Mentoring and supporting junior data engineers
- Promoting best practices such as TDD, clean code, and solid documentation
- Collaborating with data scientists on AI and machine learning use cases
- Staying up to date with emerging tools and technologies and applying them where relevant
What You’ll Bring
- Bachelor’s degree in Computer Science, Data Engineering, or a related field
- 5+ years of hands-on experience in data engineering
- 1–2+ years in a senior or lead role
- Strong Python skills, including PySpark for pipelines as well as API frameworks
- Solid SQL skills with experience optimizing complex queries
- Proven experience with Databricks, Delta Lake, and distributed data processing
- Experience with AI/ML pipelines or working closely with data science teams
- Hands-on experience with Azure Cloud, Terraform, and Azure DevOps
- Experience designing APIs and integration patterns (REST/GraphQL)
Nice to have:
- Experience with Apache Airflow or similar orchestration tools
- Familiarity with Docker and Kubernetes
- Front-end experience (React, Angular, or Vue)
- Knowledge of data modeling (Kimball, Data Vault, star/snowflake schemas)
What You Can Expect
- Salary up to €90,000 gross per year
- Strong secondary benefits package
- A collaborative, hands-on, and technically strong team
- Plenty of room for ownership and technical decision-making
- The opportunity to work with modern data and AI technologies