£470 per day
Inside IR35
Hybrid
London, UK
Summary: The Data Engineer role involves designing, building, and maintaining Azure Databricks data pipelines and ELT workflows for a consultancy's insurance client in London. The position is hybrid, requiring three days on-site work. Candidates should have a strong background in data engineering, particularly within large corporate environments, and experience with various data integration tools and Azure services. Relevant certifications and strong communication skills are highly desirable.
Key Responsibilities:
- Design, build, and maintain Azure Databricks data pipelines and ELT workflows.
- Work with Medallion architectures to deliver reliable, well-modelled data sets for analytics and reporting.
- Integrate and transform insurance domain data.
- Utilize data integration tools such as Informatica IICS and Azure Data Factory.
- Collaborate within agile delivery frameworks and utilize tools like Jira or Azure DevOps.
- Ensure data quality and manage master data effectively.
- Address data security considerations in Databricks.
Key Skills:
- Proven background as a Data Engineer in large corporate environments.
- Strong hands-on experience with SQL and Python.
- Experience with Delta Lake, data warehousing technologies, and Azure cloud services.
- Familiarity with both on-prem and cloud databases such as Oracle and SQL Server.
- Knowledge of mass ingestion patterns and cloud data processing.
- Strong communication and teamwork skills.
- Relevant Azure or Databricks certifications are highly desirable.
Salary (Rate): £470 per day
City: London
Country: UK
Working Arrangements: hybrid
IR35 Status: inside IR35
Seniority Level: undetermined
Industry: IT
Data Engineer - Azure Databricks - 6 month contract - London Hybrid (3 days onsite)
I am working with a well-known consultancy that is looking for an experienced Data Engineer to join an ongoing project in London. The role is hybrid, and candidates should be flexible to work three days on-site.
You will design, build and maintain Azure Databricks data pipelines and ELT workflows, working with Medallion architectures to deliver reliable, well-modelled data sets for analytics and reporting for their insurance client.
As such, I am keen to speak with candidates who have:
- Proven background working as a Data Engineer in large, corporate environments.
- Strong hands-on experience with SQL and Python.
- Background in data integration/ingestion using tools such as Informatica IICS, Azure Data Factory, notebooks and Databricks.
- Experience with Delta Lake, data warehousing technologies and Azure cloud services.
- Proven experience modelling, integrating and transforming insurance domain data.
- Experience with both on-prem and cloud databases such as Oracle and SQL Server.
- Familiarity with agile delivery frameworks (e.g. Scrum, SAFe) and tools such as Jira or Azure DevOps.
- Knowledge of mass ingestion patterns, cloud data processing, data quality and master data management.
- Understanding of data security considerations and tooling in platforms such as Databricks.
- Strong communication and teamwork skills; relevant Azure or Databricks certifications are highly desirable.
Interested? Apply now for immediate consideration!