Data Analytics Engineer - Azure, Fabric, SQL, PySpark/Python

Posted 5 days ago by Lorien

£700 per day
Inside IR35
Remote
London, UK

Summary: This Data Analytics Engineer role focuses on establishing scalable analytics foundations within a modern data platform environment built on Microsoft Fabric. The position is initially a contract with the potential to convert to permanent, and involves collaborating with insight analysts and technology teams to create robust datasets and metrics. The role emphasises building semantic models and delivering high-quality insights across the business. Candidates should have a strong background in analytics engineering and modern data architectures.

Salary (Rate): £700 per day

City: London

Country: UK

Working Arrangements: Remote

IR35 Status: Inside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Data Analytics Engineer - Azure, Fabric, SQL, PySpark/Python

Contract (Temp to Perm)

Start date: ASAP
Contract length: 3 months initial (with potential to convert to permanent)
Location: Fully remote (occasional travel to Flintshire and London may be required)
Rate: Up to £700 per day (Umbrella)

Overview

We are looking for an experienced Data Analytics Engineer to join a growing Data Centre of Excellence on an initial contract, with the possibility of converting to a permanent role at the end of the contract period.

This is a founding-style position where you'll help establish scalable analytics foundations, enabling consistent, high-quality insight delivery across the business.

You'll work closely with insight analysts and technology teams to create reusable metrics, robust semantic models, and business-ready datasets within a modern data platform environment (Microsoft Fabric).

What the Data Analytics Engineer will be doing

  • Building and optimising semantic models (star schemas, measures, calculation patterns)
  • Creating and maintaining a reusable KPI and metrics library with clear definitions and ownership
  • Delivering business-ready "gold" datasets aligned to medallion architecture principles
  • Defining analytics engineering standards and ways of working (coding standards, documentation, testing, release practices)
  • Establishing data contracts, versioning, and change control
  • Partnering with data engineers to ensure data availability, reliability, and performance
  • Enabling analysts through templates, exemplars, and best-practice guidance

Skills & experience for the Data Analytics Engineer

  • Analytics or BI engineering background
  • Advanced SQL with proven semantic modelling experience
  • Understanding of dimensional modelling and star schemas
  • Experience working with modern data architectures (lakehouse/medallion concepts)
  • Comfortable operating in pre-production environments and setting standards in a greenfield context
  • Microsoft Fabric experience (Lakehouse/Warehouse, pipelines, notebooks, semantic models) and Azure
  • PySpark/Python exposure
  • Experience in commercial, supply chain, or manufacturing domains (healthcare a plus)
  • Previous experience with Snowflake is desirable

Guidant, Carbon60, Lorien & SRG - The Impellam Group Portfolio are acting as an Employment Business in relation to this vacancy.