Azure Databricks Engineer - (ETL/Data Modeling/SQL/Python/Data Integration/Ingestion/Informatica IICS/Banking/Fintech)

Posted Today by GIOS Technology

Negotiable
Hybrid
London Area, United Kingdom

Summary: The Azure Databricks Engineer role involves designing, building, and maintaining scalable data pipelines and ELT workflows on Databricks, with a focus on the Medallion architecture. The position requires collaboration with a range of stakeholders to ensure data quality and compliance, as well as documenting processes for operational support. The engineer will participate in Agile teams, contributing to solution design and platform evolution, with a particular focus on the insurance domain.

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines and ELT workflows on Databricks with Medallion architecture (an illustrative sketch follows this list).
  • Apply data modeling techniques, including Data Vault, and implement data quality measures.
  • Collaborate with Data Architects, Analysts, Product Managers, and Testers to deliver accurate datasets.
  • Document data flows, transformation logic, and processes for knowledge sharing.
  • Participate in Agile teams, contributing to solution design, code reviews, and platform evolution.
  • Analyze, integrate, and transform insurance domain data, ensuring alignment with business and compliance requirements.
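
To give candidates a feel for what an ELT step on Databricks with the Medallion architecture typically looks like, here is a minimal, illustrative PySpark sketch of a bronze-to-silver refinement with a basic data quality gate. The table and column names (bronze.policy_raw, silver.policy, policy_id, and so on) are hypothetical placeholders and are not taken from this posting.

```python
# Illustrative bronze -> silver Medallion step on Databricks (PySpark + Delta Lake).
# All table and column names are hypothetical placeholders, not from the job posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

# Read the raw, append-only bronze data.
bronze = spark.read.table("bronze.policy_raw")

# Basic data quality gate: drop duplicates and records missing mandatory keys.
clean = (
    bronze
    .dropDuplicates(["policy_id", "ingest_ts"])
    .filter(F.col("policy_id").isNotNull() & F.col("effective_date").isNotNull())
    .withColumn("processed_ts", F.current_timestamp())
)

# Write the curated result to the silver layer as a Delta table.
(
    clean.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.policy")
)
```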

Key Skills:

  • SQL
  • Python
  • Databricks
  • Delta Lake
  • ADF
  • Informatica IICS
  • Azure
  • Data Modeling
  • ELT
  • Data Vault
  • Data Integration
  • Data Quality
  • Agile
  • Jira
  • Azure DevOps

Salary (Rate): undetermined

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

I am hiring for an Azure Databricks Engineer - (ETL/Data Modeling/SQL/Python/Data Integration/Ingestion/Informatica IICS/ADF/Delta Lake/Notebooks/Oracle/SQL Server/Banking/Fintech).

Location: London, UK - 3 days onsite/week

Job Description:

  • Design, build, and maintain scalable data pipelines and ELT workflows on Databricks with Medallion architecture.
  • Apply data modeling techniques, including Data Vault, and implement data quality measures for reliable data structures.
  • Collaborate with Data Architects, Analysts, Product Managers, and Testers to deliver accurate and actionable datasets.
  • Document data flows, transformation logic, and processes for knowledge sharing and ongoing operational support.
  • Participate in Agile teams, contributing to solution design, code reviews, and platform evolution.
  • Analyze, integrate, and transform insurance domain data, ensuring alignment with business and compliance requirements.
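
As a rough illustration of the Data Vault modelling mentioned in the description, the sketch below loads a hypothetical hub table keyed on a hashed business key. The names (vault.hub_policy, silver.policy, policy_id) are assumptions for illustration only, not part of the role specification.

```python
# Illustrative Data Vault hub load on Databricks (hypothetical hub_policy table).
# Table and column names are placeholders, not from the job posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical curated source feeding the vault layer.
staged = spark.read.table("silver.policy")

# Build hub rows: hashed business key, business key, and load metadata.
hub_rows = (
    staged
    .select("policy_id")
    .dropDuplicates()
    .withColumn("hub_policy_hk", F.sha2(F.col("policy_id").cast("string"), 256))
    .withColumn("load_ts", F.current_timestamp())
    .withColumn("record_source", F.lit("silver.policy"))
)

# Insert only business keys not already present in the hub (idempotent load).
existing = spark.read.table("vault.hub_policy").select("hub_policy_hk")
new_rows = hub_rows.join(existing, "hub_policy_hk", "left_anti")
new_rows.write.format("delta").mode("append").saveAsTable("vault.hub_policy")
```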

Key Skills: SQL, Python, Databricks, Delta Lake, ADF, Informatica IICS, Azure, Data Modeling, ELT, Data Vault, Data Integration, Data Quality, Agile, Jira, Azure DevOps