As a Big Data Solutions Architect, you will work with clients on short- to medium-term engagements, helping them solve their big data challenges using the Databricks platform.
Requirements
- 6+ years of experience in data engineering, data platforms, and analytics
- Comfortable writing code in either Python or Scala
- Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one
- Deep experience with distributed computing using Apache Spark™, including knowledge of Spark runtime internals
- Familiarity with CI/CD for production deployments
- Working knowledge of MLOps
- Experience designing and deploying performant end-to-end data architectures
- Experience with technical project delivery, including managing scope and timelines
- Strong documentation and whiteboarding skills
- Experience working with clients and managing conflicts
- Ability to build skills in technical areas that support the deployment and integration of Databricks-based solutions to complete customer projects
Benefits
- Comprehensive benefits and perks