Data Engineer (Databricks/ETL/Informatica IICS/ADF/Delta Lake/Azure/Insurance/Reinsurance)

Posted 1 day ago by GIOS Technology

Summary: The Data Engineer role focuses on designing, building, and maintaining scalable data pipelines and ELT workflows in a cloud environment, using Databricks and Azure. The position involves collaborating with various teams to deliver high-quality datasets, particularly in the insurance domain. Ideal candidates bring strong engineering skills and a passion for data-centric problem-solving.

Key Responsibilities:

  • Design, build, and maintain ELT pipelines on Databricks using the Medallion architecture (see the first sketch after this list).
  • Perform data analysis and apply modeling techniques (including Data Vault; see the second sketch below) to support complex data structures.
  • Integrate datasets across on-prem and cloud systems, ensuring data reliability and quality.
  • Collaborate with architects, product managers, analysts, and testers in agile delivery squads.
  • Document data flows, transformations, and metadata for support and knowledge sharing.
  • Leverage SQL and Python to enable cloud-native, high-performance data solutions.
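
By way of illustration for the first bullet, here is a minimal sketch of a bronze-to-silver hop in a Medallion layout on Delta Lake, written in PySpark. It is a sketch under assumptions, not part of the role description: Delta is assumed to be available (as it is on Databricks), and the paths, table, and columns (policies, policy_id, premium) are hypothetical.

    # Minimal Medallion sketch: bronze (raw) -> silver (cleaned) on Delta Lake.
    # Assumes a Databricks-like environment; paths and columns are hypothetical.
    from pyspark.sql import SparkSession, functions as F

    spark = (
        SparkSession.builder
        .appName("medallion-sketch")
        # Delta Lake extensions; preconfigured on Databricks clusters.
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )

    # Bronze: raw policy records landed as-is from source systems.
    bronze = spark.read.format("delta").load("/mnt/lake/bronze/policies")

    # Silver: deduplicated, typed, quality-checked records.
    silver = (
        bronze
        .dropDuplicates(["policy_id"])                      # one row per policy
        .filter(F.col("policy_id").isNotNull())             # basic quality gate
        .withColumn("premium", F.col("premium").cast("decimal(18,2)"))
        .withColumn("_ingested_at", F.current_timestamp())  # lineage metadata
    )

    silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/policies")

A gold layer would follow the same pattern, aggregating silver tables into reporting-ready datasets.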

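The Data Vault technique named in the second bullet can be sketched the same way: a hub holds one row per business key, together with a deterministic hash key, a load timestamp, and a record source. This continues the hypothetical Spark session above; all names are again illustrative.

    # Minimal Data Vault sketch: load a hub table from the silver layer.
    # Reuses the `spark` session from the previous sketch; names are hypothetical.
    from pyspark.sql import functions as F

    silver = spark.read.format("delta").load("/mnt/lake/silver/policies")

    hub_policy = (
        silver
        .select("policy_id")                      # business key
        .dropDuplicates(["policy_id"])
        .withColumn("hub_policy_hk",
                    F.sha2(F.col("policy_id").cast("string"), 256))  # hash key
        .withColumn("load_dts", F.current_timestamp())
        .withColumn("record_source", F.lit("silver.policies"))
    )

    # Illustrative append; a production load would typically MERGE on
    # hub_policy_hk so already-seen business keys are not re-inserted.
    hub_policy.write.format("delta").mode("append").save("/mnt/lake/vault/hub_policy")

Link and satellite tables follow the same loading pattern, keyed on the same hash.
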
Key Skills:

  • SQL
  • Python
  • Databricks
  • Delta Lake
  • Azure
  • ETL
  • Data Integration
  • Data Modeling
  • Data Vault
  • Informatica IICS
  • ADF
  • Data Warehousing
  • Agile
  • Insurance

Salary (Rate): undetermined

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

I am hiring for a Data Engineer (Databricks/ETL/Informatica IICS/ADF/Delta Lake/Azure/Insurance/Reinsurance). Location: London (hybrid, 2-3 days onsite weekly).

Job Description

We are hiring a skilled Data Engineer to design, build, and maintain scalable data pipelines and ELT workflows in a cloud environment. The role involves collaborating with architects, analysts, and product teams to deliver high-quality, reliable datasets for enterprise use. The ideal candidate brings strong hands-on engineering capability, insurance domain exposure, and a passion for solving data-centric problems.
