We are seeking an accomplished and detail-oriented Snowflake Data Engineer to build and optimize modern data ecosystems, design and implement secure, high-performing data pipelines, and collaborate with cross-functional teams.
Requirements
- BSc/MSc in Computer Science, Data Engineering, or related field
- Snowflake certifications (SnowPro Core, SnowPro Advanced) highly desirable
- 5–8 years in data engineering, data warehousing, or data architecture roles
- At least 3 years working with Snowflake
- Proven experience in data engineering and pipeline development on Snowflake and cloud-native platforms
- Strong SQL and Python (or equivalent language) skills for data manipulation and automation
- Hands-on experience with cloud platforms (AWS, Azure, GCP)
- Knowledge of data modelling methodologies (star schemas, Data Vault, Kimball, Inmon)
- Familiarity with data lake architectures and distributed processing frameworks (e.g., Spark, Hadoop)
- Experience with version control platforms (GitHub, Bitbucket) and CI/CD pipelines
- Understanding of data governance, security, and compliance frameworks
Benefits
- Competitive salary
- Opportunities for professional growth and development
- Collaborative and dynamic work environment
- Flexible working hours and remote work options