Senior Data Engineer
- Design and build a Data Warehouse from disparate sources; build and maintain the core data model, ETL/ELT pipelines, core data metrics, and data quality
- Rapidly prototype new analytics views and work directly with stakeholders across multiple functions (Science, Marketing, Sales, Risk, Finance, Product)
- Champion data warehousing best practices
- Build systems to answer business questions in a timely fashion and expand our product features
- Architect, build and launch new data models that provide intuitive analytics to business users
- Develop infrastructure to report on key metrics, recommend changes, and predict future results
- Work closely with other departments to gather new data and leverage existing data to make our products better for us and our users
- Build data expertise and own data quality for the pipelines you build
- Design and develop new systems and tools to enable folks to consume and understand data faster
- Provide business users with expert advice and training in the use and interpretation of data systems
- B.S. or B.A. in Computer Science/Applications or a related field
- 5+ years of experience with SQL and dynamic or statically typed programming languages as applied to ETL/ELT using dbt
- 5+ years of experience using Google BigQuery, Vertica, or similar Data Warehouses for dimensional data modeling and schema design, including working directly with SQL to profile data and generate analytics
- 5+ years of experience programming in Python and/or Scala
- Demonstrated experience performing shell scripting in a Linux/Unix environment
- Experience developing infrastructure in AWS and GCP cloud environments to report on key metrics and recommend changes
- Experience with relational and NoSQL databases
- Experience with data orchestration frameworks such as Airflow or Cloud Composer
- Experience in developing dashboards in Looker and/or Tableau
- Applicant must be willing to provide 24x7 on-call support one week per month
- Experience working in cloud deployments such as AWS and GCP is a plus
- Experience working with distributed processing frameworks such as Spark or Flink
- Familiarity with lakehouse architecture using Apache Iceberg
- Familiarity with messaging services such as SQS, SNS, or Pub/Sub
- Experience working with Terraform to manage infrastructure in AWS and GCP
- Excellent communication skills, including the ability to identify and communicate data-driven insights
- Comprehensive health benefits including medical, dental, vision, life, and disability, as well as a Life Solutions Plan covering mental health benefits
- Industry-leading 401(k) match of up to 10%
- Discounted access to our Employee Share Purchase Plan
- Professional growth opportunities including up to $10,000 college tuition reimbursement, access to upskilling platform, leadership training, mentoring and coaching programs, and short-term assignments (domestic and international)