We’re looking for a Data Engineer who thrives on solving complex data challenges, building scalable pipelines, and delivering reliable insights that drive business decisions.
Responsibilities
- Design, build, and maintain data pipelines for large-scale ingestion, transformation, and integration from multiple data sources.
- Develop and optimize ETL/ELT workflows using Spark, Scala, and SQL.
- Implement data models and data warehousing solutions on Snowflake and cloud platforms (AWS/GCP).
- Manage data quality, governance, and security across environments.
- Tune data pipelines for scalability, efficiency, and cost in cloud environments.
- Collaborate with DevOps teams on CI/CD deployment and monitoring of data solutions.