Design robust hybrid data pipelines, develop the central AI data architecture, and implement end-to-end MLOps. Work with Terraform, Python, and modern workflows such as Git and containerization. Collaborate with data scientists and translate their requirements into performant infrastructure.
Requirements
- Engineering mindset with experience in infrastructure as code (e.g. Terraform) and modern workflows such as Git and containerization (e.g. Docker)
- Data engineering and scripting expertise in a modern programming language (e.g. Python), including building complex data pipelines
- Experience with cloud and hybrid architectures, including cloud services for data integration and storage (e.g. Azure Data Factory, Data Lake)
- Good written and spoken German language skills
- Basic knowledge of container orchestration (e.g. Kubernetes)
Benefits
- Joint team events
- EGYM Wellpass & occupational pension
- Free drinks, coffee, tea
- Food delivery with Cateroo
- City car (car sharing)
- Extensive opportunities for further education
- Flexible working hours, with remote work and workations possible
- Apple hardware & modern office with high-quality equipment