At BlackRock, technology is the foundation of our business. As a Data Engineer, you’ll build resilient systems that power our global post-trade operations. You’ll design and deliver enterprise-scale software with a focus on reliability, performance, and clean engineering practices.
Responsibilities
- Partner with domain experts, product, and engineering teams to design canonical data models
- Build and evolve analytics-ready datasets in Snowflake
- Design and develop reliable ELT/ETL pipelines across Snowflake and SQL Server
- Implement robust pipeline patterns such as incremental processing, idempotency, deduplication, and backfill/reprocessing strategies
- Establish and enforce data quality and observability practices
- Optimize analytical performance and cost
- Publish curated data to downstream systems and serving layers
- Drive best practices for documentation, lineage, schema evolution, and secure handling of sensitive data
Requirements
- Strong SQL skills (advanced querying, query plans, performance tuning) with hands-on experience in Snowflake and/or Microsoft SQL Server
- Proven experience with data modeling for analytics (dimensional modeling, star schemas, conformed dimensions, slowly changing dimensions)
- Hands-on experience designing and implementing ELT/ETL pipelines, including batch and near-real-time patterns
- Proficiency in at least one general-purpose language used in data engineering (e.g., Python, Java, or Scala) for automation, orchestration, and integrations
- Working knowledge of modern data engineering practices: testing for transformations, CI/CD, environment promotion, and operational monitoring
Benefits
- Strong retirement plan
- Tuition reimbursement
- Comprehensive healthcare
- Support for working parents
- Flexible Time Off (FTO)
- Annual discretionary bonus