Senior individual contributor (IC) role designing, building, and operating data architecture for a large-scale consumer software platform, partnering with stakeholders, data scientists, and product teams.
Requirements
- 8+ years of hands-on ETL/ELT pipeline development across varied data sources
- Strong programming skills in Python, Scala, or Java (production-quality code)
- Experience with modern data platforms — Snowflake, Databricks, Apache Spark, Kafka, Airflow
- Cloud platform experience — AWS, Azure, or GCP and their native data services
- Experience with real-time data processing and streaming architectures
- Solid fundamentals in data modeling, data warehousing, and dimensional modeling
- Knowledge of containerization and orchestration (Docker, Kubernetes)
- Practical knowledge of the Model Context Protocol (MCP) and AI-assisted development tools
- Familiarity with DataOps and MLOps practices
- Experience managing sensitive/PII data with attention to compliance and governance
- Strong communication skills across technical and non-technical stakeholders
Benefits
- Bonus
- Pension
- Medical, dental, and vision coverage
- Generous PTO
- Paid parental leave