Job Summary
Sydney | Contract | JN -012024-1953857 | Jan 12, 2024 | $800 - $1000 pd

Job Description
An exciting opportunity to develop and maintain efficient data systems for informed decision-making within the Financial Services sector.

Morgan McKinley is seeking an experienced Data Engineer to join a Financial Services organisation that thrives on innovation and harnessing the power of data to drive strategic decision-making. This is an opportunity to play a pivotal role in developing cutting-edge solutions in a collaborative and progressive work environment.

Responsibilities:
Apply domain knowledge to solve complex problems and contribute to engineering outputs at a production level of quality through the creation of code artifacts.
Develop and implement ETL/ELT processes using a variety of tools, including Azure Data Factory, Azure Purview, Azure Databricks, and others.
Emphasize data security within the platform, ensuring proper access control, encryption, and other necessary security measures.

Your Expertise and Experience:
Demonstrate a deep understanding of data engineering principles, covering data modeling, ETL processes, data warehousing, and data governance.
Implement medallion architecture data pipelines from multiple sources using Azure Databricks.
Hands-on experience with Autoloader, DLT, performance tuning, and code optimization within the Databricks environment.
Possess in-depth knowledge and practical experience in Azure Cloud services, including Data Factory, Function Apps, ADLS, Blob, Event Hub, etc.
Execute cloud data migrations to Azure seamlessly.
Excel in relational database design with a strong background in normalization.
Proficiency in programming languages such as PySpark and Python is essential; SQL is highly desirable.

This role is based in Sydney, and suitable candidates need to have full working rights in Australia. If this sounds like you, click on the link to apply!

Recruitment Consultant: Amy Turley