Are you an experienced Data Engineer with a strong background in data migration or integration? Do you have strong cloud experience in AWS? We are seeking a talented Data Engineer to join a team embarking on a major data integration programme of work, and to play a key role in the project.
The Role
- Develop and manage data processing workflows using Python and dbt on AWS infrastructure.
- Enhance and fine-tune data models and transformations to improve performance.
- Monitor data quality and implement appropriate testing mechanisms to ensure accuracy.
- Collaborate with data stakeholders to understand their requirements and deliver solutions.
- Maintain comprehensive documentation for data models, workflows, and operational processes.
Required Skills
- Python: Strong command of Python for developing ETL workflows, data handling, and automation scripts.
- dbt (Data Build Tool): Proficient in utilising dbt for transforming data, building models, and maintaining documentation.
- SQL: Expert-level understanding of SQL for complex data queries and manipulation.
- Data Warehousing: Solid grasp of data warehousing principles and best practices.
- ETL/ELT Development: Proven ability to design and execute robust ETL/ELT pipelines.
- AWS Proficiency: Hands-on experience with key AWS services such as:
  - Amazon S3 for storing large datasets.
  - Amazon Redshift for managing data warehouses.
  - AWS Lambda for executing serverless tasks.
  - AWS Glue and AWS Data Wrangler for managing ETL tasks. (nice to have)
  - Amazon DynamoDB for NoSQL database management. (nice to have)
  - AWS Step Functions for workflow orchestration. (nice to have)
Job Ref Number: 06810-0013079236 - HS