Azure Databricks Engineer

Posted Today by RED Global

Negotiable
Undetermined
Hybrid
London Area, United Kingdom

Summary: The Azure Databricks Engineer role involves designing, building, and maintaining data pipelines and ELT workflows on the Databricks platform, with a focus on the Medallion architecture. The position requires collaboration with various stakeholders to deliver reliable data sets and document data processes. Candidates should possess strong hands-on skills in SQL, Python, and data integration, particularly within the insurance domain. The role is hybrid, requiring three days per week in the office in London.

Key Responsibilities:

  • Design, build, and maintain data pipelines and ELT workflows on the Databricks platform.
  • Analyze data requirements and apply data modeling and quality techniques.
  • Collaborate with data architects, analysts, product managers, and testers to deliver reliable data sets.
  • Document data flows, transformation logic, and processes for knowledge sharing.
  • Participate in agile teams, code reviews, solution design, and platform evolution.

Key Skills:

  • Extensive hands-on experience in SQL, Python, and data integration/ingestion.
  • Proficiency in ETL tooling such as Informatica IICS, ADF, and Databricks.
  • Experience in integrating and transforming data within the insurance domain.
  • Familiarity with on-prem and cloud databases like Oracle and SQL Server.
  • Knowledge of Agile delivery frameworks and tools such as Jira and Azure DevOps.
  • Understanding of data security challenges and tooling.
  • Advanced verbal and written communication skills.
  • Professional certifications in Databricks and Azure are highly desired.

Salary (Rate): undetermined

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Azure Databricks Engineer | 10-Month Contract | Hybrid Working, London (3 days per week in the office)

Role Description:

  • Strong hands-on skills that can be applied directly to deliverables and to ensuring the team works effectively.
  • Design, build, and maintain data pipelines and ELT workflows on the Databricks platform using the Medallion architecture.
  • Analyse data requirements, provide data analysis techniques, and apply data modelling (including Data Vault) and data quality techniques to establish, modify, or maintain data structures and their associated components in complex environments.
  • Partner with data architects, data analysts, product managers, and testers to deliver reliable data sets.
  • Document data flows, transformation logic, and processes for knowledge sharing and ongoing support.
  • Be passionate about solving problems, enjoy connecting the dots between data, strategy, and analytics, and focus on generating tangible benefits and high performance.
  • Collaborate in agile teams; participate in code reviews, solution design, and platform evolution.

Skills and Experience

  • Extensive hands-on experience in SQL, Python, data integration/ingestion and associated patterns; ETL tooling such as Informatica IICS, ADF, Notebooks, Databricks, Delta Lake, and warehousing technologies and their associated patterns; cloud platforms, with Azure preferred.
  • Proven experience integrating, modelling, and transforming insurance-domain data, ideally within the Lloyd's, specialty, or insurance/reinsurance market.
  • Experience with on-prem and cloud versions of databases such as Oracle and SQL Server.
  • Experience with Agile delivery frameworks/methodologies (e.g. Scrum, SAFe) and tools (e.g. Jira, Azure DevOps).
  • Experience with mass ingestion capabilities, cloud process flows, data quality, and master data management.
  • Understanding of data-related security challenges and tooling for specific technologies (e.g. Databricks).
  • In-depth knowledge of data delivery and its associated architecture principles, data modelling concepts, and all steps of the data production process.
  • Advanced verbal and written communication skills, active listening, and teamwork.
  • Professional certifications in public cloud and tooling (Databricks and Azure) are highly desired.