Senior Data Engineer

Posted Today by Primus Core

£650 Per day
Outside IR35
Remote
London Area, United Kingdom

Summary: The role of Senior Data Engineer / Data Platform Engineer focuses on developing and managing a Databricks Lakehouse platform within a Data & AI consultancy. The position requires hands-on expertise in data engineering and platform engineering, including designing data pipelines and optimising performance. The role is remote and classified as outside IR35, offering a competitive daily rate. Candidates should possess strong experience with Databricks and related technologies.

Key Responsibilities:

  • Designing and building end-to-end data pipelines using Databricks (PySpark / SQL / Delta Lake)
  • Developing and managing robust, scalable Lakehouse architectures
  • Implementing Delta Live Tables (DLT) for reliable, testable, production-grade pipelines
  • Optimising pipeline performance
  • Working with Unity Catalog for centralised governance
  • Leveraging Databricks Workflows / Jobs for orchestration
  • Building CI/CD pipelines for data & platform deployments (Terraform, GitHub Actions, Azure DevOps etc.)
  • Enabling real-time and streaming pipelines (Structured Streaming, Auto Loader)
  • Contributing to MLOps workflows, integrating models into production pipelines
  • Collaborating with Data Scientists, Analysts, and Platform teams to improve developer experience and platform usability

Key Skills:

  • Strong experience building pipelines with PySpark / Spark SQL
  • Deep understanding of Delta Lake
  • Databricks Platform Expertise
  • Hands-on experience with Unity Catalog (governance & security), Spark Declarative Pipelines, Databricks Workflows / Jobs
  • Strong understanding of Lakehouse architecture principles
  • Performance & Optimisation
  • Platform / DevOps Engineering, Infrastructure as Code, CI/CD pipelines
  • Strong experience with at least one cloud platform: Azure or AWS
  • Databricks certifications (Data Engineer Associate / Professional) (very nice to have)
  • Experience with Feature Store / MLflow (very nice to have)
  • Exposure to Databricks AI / Mosaic AI / Model Serving (very nice to have)

Salary (Rate): £650 daily

City: London Area

Country: United Kingdom

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Senior Data Engineer / Data Platform Engineer (Databricks) | Contract | Outside IR35 | Remote (UK/EU) | £550–£650 per day (DOE)

We’re partnering with a forward-thinking Data & AI consultancy that is scaling a Databricks Lakehouse platform. They are looking for a Senior Data Engineer / Data Platform Engineer who can operate across both Data Engineering and Platform Engineering in an end-to-end, hands-on role.

What You’ll Be Doing:

  • Designing and building end-to-end data pipelines using Databricks (PySpark / SQL / Delta Lake)
  • Developing and managing robust, scalable Lakehouse architectures
  • Implementing Delta Live Tables (DLT) for reliable, testable, production-grade pipelines
  • Optimising pipeline performance
  • Working with Unity Catalog to implement centralised governance
  • Leveraging Databricks Workflows / Jobs for orchestration
  • Building CI/CD pipelines for data & platform deployments (Terraform, GitHub Actions, Azure DevOps etc.)
  • Enabling real-time and streaming pipelines (Structured Streaming, Auto Loader)
  • Contributing to MLOps workflows, integrating models into production pipelines
  • Collaborating with Data Scientists, Analysts, and Platform teams to improve developer experience and platform usability

Key Skills & Experience:

  • Strong experience building pipelines with PySpark / Spark SQL
  • Deep understanding of Delta Lake
  • Databricks Platform Expertise
  • Hands-on experience with: Unity Catalog (governance & security), Spark Declarative Pipelines, Databricks Workflows / Jobs
  • Strong understanding of Lakehouse architecture principles
  • Performance & Optimisation
  • Platform / DevOps Engineering, Infrastructure as Code, CI/CD pipelines
  • Strong experience with at least one cloud platform: Azure or AWS

Very Nice to Have:

  • Databricks certifications (Data Engineer Associate / Professional)
  • Experience with Feature Store / MLflow
  • Exposure to Databricks AI / Mosaic AI / Model Serving

If you’re interested in building high-performance Databricks platforms at scale, send your CV to cv@primus-connect.com or click apply.