We're seeking a Data Engineer to support our Director of Engineering and collaborate with cross-functional teams to deliver robust, scalable data solutions for our clients. This is a 100% remote contract position based in LatAm, with a 4-6 month term starting in August 2025.
Requirements
- 10+ years of data engineering experience with enterprise-scale systems
- Expertise in Apache Spark and Delta Lake, including ACID transactions, time travel, Z-ordering, and compaction
- Deep knowledge of Databricks (Jobs, Clusters, Workspaces, Delta Live Tables, Unity Catalog)
- Experience building scalable ETL/ELT pipelines using tools such as Airflow, AWS Glue, Google Cloud Dataflow, or Azure Data Factory (ADF)
- Advanced SQL for data modeling and transformation
- Strong programming skills in Python (or Scala)
- Hands-on experience with data formats such as Parquet, Avro, and JSON
- Familiarity with schema evolution, versioning, and backfilling strategies
- Working knowledge of at least one major cloud platform (AWS, GCP, or Azure)
- Experience designing data architectures with real-time or streaming data (e.g., Kafka, Kinesis)
- Consulting or client-facing experience with strong communication and leadership skills
- Experience with data mesh architectures and domain-driven data design
- Knowledge of metadata management, data cataloging, and lineage tracking tools
- Familiarity with healthcare standards (e.g., HL7, FHIR, DICOM) is a plus
- Awareness of international data privacy regulations and compliant system design
Benefits
- Opportunity to work with a growing company and contribute to its success
- Collaborative and dynamic work environment
- Chance to build strong relationships with clients and colleagues
- Professional development and growth opportunities
- Competitive compensation and benefits