Join a fast-paced and talented team delivering Data Engineering capabilities for The Hartford’s Commercial Data Science Data Delivery organization. You will have the opportunity to enable well-architected, cloud-based data solutions for Entity Resolution using emerging technologies such as AWS and Snowflake.
This role will have a Hybrid work arrangement, with the expectation of working in an office (Hartford, CT or Charlotte, NC) 3 days a week (Tuesday through Thursday).
Responsibilities:
- Accountable as the subject matter expert and/or technical lead for large-scale data products. Drive end-to-end solution delivery involving multiple platforms and technologies of medium to large complexity, or oversee portions of very large, complex implementations, leveraging ELT solutions to acquire, integrate, and operationalize data.
- Partner with architects and stakeholders to influence and implement the vision of the pipeline and data product architecture while safeguarding the integrity and scalability of the environment.
- Articulate risks and tradeoffs of technology solutions to senior leaders with translations as needed for business leaders.
- Accountable for data pipeline and data product physical solution designs across teams, as well as tool recommendations.
- Accountable for Data Engineering Practices across all the teams involved.
- Implement and utilize leading big data methodologies (AWS, Hadoop/EMR, Spark, Kafka, Snowflake, and Talend) with cloud/on-premise hybrid hosting solutions at a multi-team/product level.
Knowledge, Skills, and Abilities:
- Strong Technical Knowledge (Cloud data pipelines and data consumption products)
- A leader and team player with a transformation mindset.
- Ability to lead successfully in a lean, agile, and fast-paced organization, leveraging Scaled Agile principles and ways of working.
- Guides teams in maturing code quality management, DataOps principles, automated testing, and environment management practices to deliver incremental customer value.
Qualifications:
- 5+ years of large-scale big data engineering experience, including designing best practices in programming, SDLC practices, distributed systems, data warehousing solutions (SQL and NoSQL), ETL tools, CI/CD, cloud technologies (AWS/Azure), Python/Spark, Data Mesh, Data Lake, and Data Fabric
- 3+ years of developing and operating production workloads in cloud infrastructure (AWS, Azure, etc.)
- 3+ years of operating in a technical leadership capacity for 2+ teams.
- Must be authorized to work in the U.S. without company sponsorship
Preferred Qualifications:
- Exposure to AWS best practices
- Knowledge of core functional components/services of AWS – compute, storage, Edge, Database, Migration and Transfer, Networking, and Governance.
Certifications/Licenses (as applicable):
- Cloud certifications preferred.