Our Client in Newmarket, ON is looking for a Data Engineer to join their expanding team!
Who You Are:
As an experienced Data Engineer, you'll play a key role in designing, building, and maintaining ELT pipelines on Snowflake for our utility customers. You will collaborate with project managers, business analysts, and testers to deliver top-tier data analytics solutions. The role involves creating reusable templates in dbt, acting as a subject matter expert for data analytics best practices, and providing documentation for a range of stakeholders. You'll need strong SQL, data modeling, and analytical skills, along with experience in dbt, Snowflake, and BI tools such as Tableau or Power BI. Familiarity with cloud platforms (AWS/Azure) and data governance is also important.
Work Type:
- Remote – Canada
- Office Location: Newmarket, ON, Canada
- Must be eligible to travel to the USA when required
How You Will Make an Impact:
- Design, build, and maintain scalable ELT (Extract, Load, Transform) pipelines on Snowflake to support utility customer data analytics.
- Collaborate with project managers, business analysts, and testers to understand business needs and translate them into efficient data solutions.
- Develop and maintain a library of reusable templates using dbt (data build tool) to streamline the development process.
- Serve as a subject matter expert on data analytics best practices, offering insights and guidance to the team.
- Create and update comprehensive documentation tailored for both technical and non-technical audiences.
- Participate in discovery sessions and requirements gathering as a technical advisor to shape data architecture and strategy.
- Perform ad hoc tasks and analysis as needed to support various business initiatives.
What You Will Bring:
- 5-7 years of experience as a Data Engineer with a track record of successfully delivering data solutions in production environments.
- Deep knowledge of ELT and ETL processes, including extraction from a variety of data sources (structured and unstructured), transformation, and loading into target data stores.
- Experience with cloud computing platforms like AWS (Redshift, S3, Lambda) or Azure (Synapse, Data Factory) for developing and maintaining cloud-based data architectures.
- Familiarity with best practices in data engineering, including CI/CD for data pipelines, automated testing, and version control.
- Expertise in writing complex SQL queries, performance tuning, and optimization for large datasets.
- Strong experience in designing and implementing data models for analytical and operational use cases, with a focus on scalability and efficiency.
- Demonstrated hands-on experience with dbt (data build tool) for data transformations and with Snowflake (or similar cloud data platforms) for managing large-scale data pipelines.
- Proficiency in using Business Intelligence tools such as Tableau, Power BI, or Looker to design and deliver insightful visualizations and reports.
- Strong understanding of data governance, including data privacy regulations (e.g., GDPR), data quality standards, and security best practices for sensitive data.
- Excellent written and oral communication skills, along with strong organizational and leadership abilities.
- Previous experience in analytics- and reporting-intensive industries such as utilities, energy, or finance is preferred but not required.
Compensation:
- Salary: $130,000 - $140,000
- Paid Time-Off
- Comprehensive Health Benefits
If you meet 70% or more of the qualifications listed for this position, we strongly encourage you to apply! Not quite ready to apply, but interested in learning more about this opportunity? Reach out to us at info@stackitrecruitment.com for more information. We look forward to hearing from you!