We are seeking a skilled and passionate Data Engineer with deep expertise in Google Cloud Platform (GCP) and Google BigQuery. In this role, you will architect, build, and maintain the scalable data pipelines that are the foundation of our analytics and data science initiatives.
Requirements
- 3-5+ years of hands-on experience in a Data Engineering, Software Engineering, or similar role.
- Strong proficiency in a programming language such as Python or Java for data processing and automation.
- SQL Expertise: Mastery of SQL for complex data manipulation, DDL/DML operations, and query optimization.
- Google BigQuery: Proven expertise in using BigQuery as a data warehouse, including data modeling, performance tuning, and cost management.
- GCP Data Services: Hands-on experience building data pipelines using the GCP ecosystem (e.g., Dataflow, Pub/Sub, Cloud Storage, Cloud Composer/Airflow).
- Data Pipeline Concepts: Deep understanding of ETL/ELT principles and analytical data architectures (e.g., star schemas, data lakes).
- Engineering Mindset: Strong problem-solving and troubleshooting skills with a focus on building scalable, maintainable, and automated systems.
Benefits
- Competitive salary
- Comprehensive insurance package
- Extensive learning and development resources