Want to learn more about this role and Jobot? Click our Jobot logo and follow our LinkedIn page!
Job details
100% REMOTE Lead Data Engineer / Principal Data Platform Engineer Needed for Growing Subsidiary of a Large Public Company!
This Jobot Job is hosted by Reed Kellick
Are you a fit? Easy Apply now by clicking the "Easy Apply" button and sending us your resume.
Salary: $185,000 - $235,000 per year
A Bit About Us
We are a growing subsidiary of a large public company that is hiring multiple data engineers! If you are a Principal Data Engineer / Staff Data Platform Engineer, please read on!
Why join us?
As a Lead Data Infrastructure Engineer / Staff Data Engineer at our company, you will receive:
- A competitive base salary between $185k and $235k, depending on experience!
- Stock grant of $12k to $40k, depending on experience!
- Bonus of 12-20%, depending on seniority!
- Work from home / work remote 100%!
- 401k with dollar-for-dollar match, up to 6% of eligible earnings (base, bonus), plus an additional company contribution!
- Comprehensive medical, dental, vision and life insurance!
- 17 paid holidays per year, including 3 floating holidays!
- Annual Paid Time Off (PTO), with separate sick days!
- 12 weeks paid Parental Leave!
- Caregiver Leave!
- Adoption and Surrogacy Assistance Plan!
- Flexible workplace accommodation!
- Fun team/company events (sports games, concerts, etc.)!
- Tuition reimbursement!
- Ability to attend conferences!
- A MacBook Pro and accompanying hardware to do great work!
- A modern productivity toolset to get work done: Slack, Miro, Loom, Lucid, Google Docs, Atlassian, and more!
- Generous company discounts!
- Eligible for donation matching to over 1.5 million nonprofit organizations!
Job Details
For the Principal Data Infrastructure Engineer / Staff Data Infrastructure Engineer role on our team, we are looking for:
- A completed BS, MS, or PhD in Computer Science, Mathematics, Statistics, Engineering, Operations Research, or another quantitative field
- 7+ years of experience as a Data Engineer writing code to extract, process, and store data within different types of data stores (Snowflake, Postgres, DynamoDB, Kafka, graph databases)
- 2+ years of experience leading a cross-functional team to deliver data projects, including 1+ year as a reporting manager of a small team
- Strong programming skills in Python and understanding of core computer science principles
- Experience building batch and streaming pipelines using complex SQL, PySpark, Pandas, and similar frameworks
- Experience with some or most of the following would be ideal: Snowflake, Fivetran, dbt Cloud, Datadog, Atlan, Monte Carlo, Airflow (MWAA), EKS (Kubernetes)
Interested in hearing more? Easy Apply now by clicking the "Easy Apply" button.