We are looking for a detail-oriented Senior Data Engineer to design the data pipelines behind our future data models and deliver dependable business intelligence solutions.
Requirements
- Minimum 5 years of experience in the data engineering field
- Expertise in data modelling techniques such as the Kimball star schema, Anchor modelling, and Data Vault
- Competence in object-oriented or functional scripting languages such as Python
- Proficiency with relational SQL and NoSQL databases, ideally PostgreSQL, including PITR, pg_basebackup, WAL archiving, and replication
- Familiarity with column-oriented storage formats and data warehouses such as Parquet, Redshift, and BigQuery
- In-depth skills in developing and maintaining ETL/ELT data pipelines using workflow management tools such as Airflow
- Hands-on experience with Google Cloud Platform (GCP) services such as BigQuery, scheduled queries, Cloud Storage, and Cloud Functions
- Familiarity with alerting and self-recovery methods for safeguarding data accuracy
- Analytical skills with the ability to translate data into sound business decisions
- Expertise in peer-reviewing pipeline code and suggesting improvements where needed
- Experience in helping teams make informed business decisions with data
- Strong communication and presentation skills
- Fluency in spoken and written English
Benefits
- Competitive salary
- Annual performance bonus
- Retirement plan
- Range of health benefits
- Casual dress code