Are you ready to be part of an organisation that values expertise, passion and diversity? At CFS we know that the foundation of our success lies in our exceptional people. We believe in celebrating individuality, have a passion for high performance and are committed to creating an environment where you can unleash your full potential. Our people enable us to make a difference and deliver exceptional experiences that help our customers achieve financial freedom.
The opportunity
As a Senior Data Engineer, you will work on our product database uplift. You will build ELT/ETL pipelines across medallion layers in Azure Databricks, using both DLT and DBT to ingest and transform data from multiple source systems for batch and real-time workloads.
This is a 6-month contingent role at CFS, with a view to extension.
Your Responsibilities
- Creating and deploying Delta Live Tables datasets (Streaming Tables, Materialised Views) in Azure Databricks using Azure DevOps.
- Converting managed tables to external tables and deploying them to external locations (specifically, integrating with ADLS Gen2).
- Creating and deploying foreign tables using Databricks Lakehouse Federation.
- Implementing hashing techniques in data layers and loading incremental changes from various upstream sources, such as SQL Server, AccessDB and Teradata, using Databricks Lakehouse Federation.
- Implementing and deploying encryption methods in Azure Functions using .NET/Python modules to encrypt/decrypt data in Azure.
- Integrating Databricks and Power BI using MS Data Gateway and implementing DAX queries to access Databricks Gold datasets.
Your Capability and Experience
- Strong hands-on experience with Microsoft cloud services such as Azure Databricks (including Unity Catalog), Azure Data Factory, Azure Functions, App Services, SHIR, Azure Storage, Azure Data Lake, Azure Key Vault, Azure DevOps and Power BI
- High proficiency in PySpark, Python and SQL is essential
- Strong understanding of data engineering principles such as data modelling, ETL processes, data warehousing and data governance.
- Demonstrable experience in designing and implementing data pipelines with medallion architecture from multiple sources using Azure Databricks and Azure DevOps/GitHub.
- Experience in cloud data migration to Databricks.
CFS Culture
At CFS, you'll be working among the very best in the wealth management industry. It's an inspiring environment that encourages development and celebrates success.
At CFS we are committed to creating a thriving environment where individuals can flourish. We believe that success is built upon strong teams, and we are dedicated to celebrating uniqueness, championing individuality and supporting a diverse and inclusive workforce. We believe that when you can truly be yourself, you can unlock your full potential.
Apply today and join us in helping Australians to achieve their financial freedom.
Please note, CFS requires all candidates to have full work rights in Australia.
This role is based in Gadigal land (Sydney).
Where we have preferred candidates, background checks (including Police, Employment, Bankruptcy, and ASIC banned and disqualified persons checks) will be completed prior to the final preferred candidate's employment being confirmed. The outcomes of the background checks do not automatically preclude the preferred candidate; rather, they will be assessed against the inherent requirements of the role.