Develop and implement CI/CD pipelines for Databricks notebooks and jobs, and ensure compliance with data governance and security standards.
Requirements
- US Citizenship
- Bachelor's degree
- Minimum THREE (3) years of total experience in cloud-based data platforms
- Minimum THREE (3) years of experience with Databricks
- Strong scripting skills (Python, Bash)
- Experience with Delta Lake and Unity Catalog
- Strong knowledge of Spark architecture and distributed computing
- Hands-on experience with Terraform or other IaC tools
- Experience with data modeling and performance tuning
- Experience with streaming technologies (Kafka, Event Hub)
- Experience using CI/CD for data pipelines
- Familiarity with Kubernetes and container orchestration
- Excellent problem-solving skills and attention to detail
- Strong communication and collaboration skills, with the ability to work effectively in a team environment
Benefits
- Medical, Rx, Dental & Vision Insurance
- Personal and Family Sick Time & Company Paid Holidays
- Position may be eligible for a discretionary variable incentive bonus
- Parental Leave and Adoption Assistance
- 401(k) Retirement Plan
- Basic Life & Supplemental Life
- Health Savings Account, Dental/Vision & Dependent Care Flexible Spending Accounts
- Short-Term & Long-Term Disability
- Student Loan PayDown
- Tuition Reimbursement, Personal Development & Learning Opportunities
- Skills Development & Certifications
- Employee Referral Program
- Corporate Sponsored Events & Community Outreach
- Emergency Back-Up Childcare Program
- Mobility Stipend