Greetings!
We have an urgent requirement for a "Big Data Engineer/Developer" in Sydney with one of our clients, for an extendable contract position.
If interested, please respond to meghana@ivega.com.au or call me at +61 280914535.
To excel in this role, you will be:
- Experienced in working closely with data architects and data modellers, and able to carry out data analysis and data modelling independently.
- Highly proficient in data preparation and integration, able to design, develop and modify data models and troubleshoot performance issues.
- Detail-focused and proud of your sense of technical design strategy.
- Excellent at stakeholder management across multiple levels of engagement.
- Proactive, with great communication skills.
- Good at downstream analysis and at understanding end-to-end data flow when multiple systems are involved.
- A natural collaborator with a learning mindset, happy to share knowledge and learn from others.
- Able to understand technical requirements and translate them into non-technical language, and vice versa.
- Experienced in working with Agile delivery.
- Tertiary-qualified, preferably in a quantitative subject such as computer science or engineering.
- Strong in SQL, with at least 3 to 4 years of hands-on experience.
- Familiar with data governance, data quality and control frameworks, which would certainly be useful in this role.
- Able to use data and insights creatively to uncover new opportunities, identify root causes and underlying risks, and recommend solutions to the business.
The ideal candidate will have the following core skills:
- Advanced data engineering skills, with strong experience using Spark, SQL, Python and Airflow.
- Strong knowledge of big data querying tools such as Presto or Trino.
- Experience with data warehousing platforms such as AWS Redshift.
- Experience with data architecture principles, including data access patterns and data modelling.
- Experience with data quality measurement and monitoring.
- Experience with metadata, including control totals and checksums for load assurance.
- Comfortable taking ownership of issues and driving them to resolution.
- Familiarity with cloud computing concepts (AWS), including any or all of EC2, S3, IAM, EKS and RDS, as well as Linux.
- Experience with DevOps approaches.
- Good understanding of data warehousing/ETL concepts.
- Experience with CI/CD tools.
- Optional but highly desirable: experience with event-based and message-driven distributed systems, e.g. Kafka or Solace.