Senior Data Engineer

Posted 1 week ago by IO Associates

£600 Per day
Undetermined
Hybrid
London

Summary: The Senior Data Engineer leads the design and implementation of a cloud-based digital operations system for major infrastructure programmes. The role centres on integrating operational technologies into a cohesive data platform while ensuring data governance and compliance. The engineer will work extensively with Azure technologies and collaborate with stakeholders on data visualisations. The position is contract-based and hybrid, based in London.

Key Responsibilities:

  • Architect and implement a scalable Azure data platform, transforming raw operational data into structured, reporting-ready layers.
  • Develop ingestion pipelines for high-frequency time-series and IoT-like data from essential infrastructure systems.
  • Build and maintain cloud data assets using Azure Databricks, Synapse and Data Lake technologies.
  • Deliver real-time data processing using Kafka, Azure Event Hubs and Spark Streaming.
  • Establish and maintain data governance, quality frameworks and PHI/PII compliance.
  • Collaborate with stakeholders to design and deploy a Power BI "Master Dashboard" with DAX and semantic modelling.

Key Skills:

  • Extensive experience with Azure Databricks, Azure Data Factory, Synapse and Unity Catalog
  • Strong proficiency in Python, PySpark and Spark SQL
  • Demonstrated experience with high-frequency time-series data
  • Expertise in Star Schema, Snowflake Schema and Lakehouse architectures
  • Experience with Great Expectations, Grafana or similar observability tools
  • Familiarity with CI/CD pipelines using Azure DevOps or GitHub Actions

Salary (Rate): £600 per day

City: London

Country: United Kingdom

Working Arrangements: Hybrid

IR35 Status: Undetermined

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Senior Data Engineer (Contract)
Location: London (Hybrid)

We are seeking an experienced Senior Data Engineer to lead the design and implementation of a cloud-based digital operations system supporting several major infrastructure programmes. This role is central to integrating critical operational technologies into a unified, resilient data platform.
