Data Engineer – Compliance Technology
We are looking for an experienced Data Engineer with at least 3–4 years of experience to join our Compliance Technology Engineering team.
You will work closely with a team of business and technology experts and collaborate with a strong peer network of technologists across our RMG function. Your work will focus on providing technical capability to the team and ensuring our delivery meets business and compliance objectives. Throughout, the emphasis will be on providing a high-performing, stable platform that delivers business outcomes for internal and external stakeholders.
To excel in this role, you will be:
Experienced in working closely with data architects or data modellers, and able to work independently on data analysis and data modelling.
Highly proficient in data preparation and integration, able to design, develop and modify data models and troubleshoot performance issues.
Detail-focused and proud of your sense of technical design strategy.
Excellent at stakeholder management across multiple levels of engagement.
Proactive and have great communication skills.
Skilled at downstream analysis and at understanding end-to-end data flow when multiple systems are involved.
A natural collaborator with a learning mindset, happy to share knowledge and learn from others.
Able to understand technical requirements and translate them into non-technical language and vice versa.
Experienced in Agile delivery.
Tertiary-qualified, preferably in a quantitative discipline, e.g. computer science or engineering.
Backed by at least 3–4 years of strong hands-on experience in SQL.
Familiar with data governance, data quality and control frameworks, which will certainly be useful in this role.
Able to use data and insights creatively to uncover new opportunities, identify root causes and underlying risks, and recommend solutions to the business.
The ideal candidate will have the following core skills:
Advanced data engineering skills, with strong experience using Spark, SQL, Python and Airflow.
Strong knowledge of big data querying tools such as Presto or Trino.
Experience with data warehousing platforms such as AWS Redshift.
Experience with data architecture principles, including data access patterns and data modelling.
Experience with data quality measurement and monitoring.
Experience with metadata, including control totals and checksums for load assurance.
Comfortable taking ownership of issues and driving resolution.
Familiar with cloud computing concepts (AWS), including any or all of EC2, S3, IAM, EKS and RDS, as well as Linux.
Experienced with DevOps approaches.
Good understanding of Data Warehousing/ETL concepts.
Experience with CI/CD tools.
Optional but highly desirable: experience with event-based and message-driven distributed systems, e.g. Kafka, Solace Systems.
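To illustrate the load-assurance skill mentioned above (control totals and checksums over metadata), here is a minimal, hypothetical Python sketch; the record layout, field names and comparison logic are illustrative assumptions, not a prescribed implementation:

```python
import hashlib

def control_totals(rows, amount_field):
    # Compute simple load-assurance metadata for a batch of records:
    # a row count, a control total over one numeric field, and a
    # checksum over the serialised payload. (Field names here are
    # hypothetical examples.)
    payload = "\n".join(
        "|".join(str(row[key]) for key in sorted(row)) for row in rows
    )
    return {
        "row_count": len(rows),
        "control_total": sum(row[amount_field] for row in rows),
        "checksum": hashlib.md5(payload.encode()).hexdigest(),
    }

def loads_match(source_rows, target_rows, amount_field="amount"):
    # Load assurance: trust the load only if counts, totals and
    # checksums agree between source and target extracts.
    return control_totals(source_rows, amount_field) == control_totals(
        target_rows, amount_field
    )
```

Note that the checksum here is order-sensitive, so in practice rows would typically be sorted on a stable key before serialisation; the row count and control total catch dropped or duplicated records even when the checksum cannot be compared directly.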