Senior Data Engineer - Snowflake

Posted 2 days ago by Eden Smith Limited

Central London, UK

Summary: The Senior Data Engineer role focuses on designing and delivering a Snowflake-based Data Lakehouse to enhance trading, operational, and marketing intelligence. The position requires an experienced engineer to architect scalable data platforms, particularly in financial or trading environments. The role involves collaboration with various business units to ensure data governance and quality while implementing robust ETL/ELT pipelines. This is a high-impact opportunity for someone with a proven track record in data engineering and infrastructure development.

Salary (Rate): £550 daily

City: London

Country: UK

Working Arrangements: hybrid

IR35 Status: outside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Senior Data Engineer - Snowflake (3-month initial contract with extension)

Location: London, UK (hybrid)

Rate: £450-£550 per day, Outside IR35

Overview

A next-generation data platform is being built to power trading, operational, and marketing intelligence. This role will design and deliver a Snowflake-based Data Lakehouse, establishing a modern data ecosystem that supports advanced analytics, compliance, and decision-making across the business.

This is a high-impact opportunity for an experienced engineer with a proven track record of architecting and implementing scalable data platforms - ideally within financial or trading environments - and leading the build-out of data infrastructure from the ground up.

Key Responsibilities

Architecture & Development

  • Design and build a Snowflake-centric Data Lakehouse integrating structured, semi-structured, and unstructured data.

  • Develop robust ETL/ELT pipelines ingesting and transforming data from multiple internal systems (trading, CRM, finance, risk, etc.) and external APIs.

  • Implement data models, schemas, and transformation frameworks optimised for analytical and regulatory use cases.

  • Apply best practices in data versioning, orchestration, and automation using modern data engineering tools.

  • Ensure scalability, data lineage, and governance across the data lifecycle.
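
For illustration only, the kind of pipeline work described above can be sketched in plain Python. The payload shape and every field name (trade_id, instrument, qty, price) are invented for the example and do not come from the role description:

```python
import json

# Hypothetical sample standing in for a semi-structured external API response.
raw = json.loads("""
[
  {"trade_id": 1, "instrument": {"symbol": "ABC", "venue": "LSE"}, "qty": 100, "price": 9.5},
  {"trade_id": 2, "instrument": {"symbol": "XYZ", "venue": "LSE"}, "qty": 50,  "price": 20.0}
]
""")

def transform(records):
    """Flatten nested instrument details and derive a notional value per trade."""
    rows = []
    for r in records:
        rows.append({
            "trade_id": r["trade_id"],
            "symbol": r["instrument"]["symbol"],
            "venue": r["instrument"]["venue"],
            "notional": r["qty"] * r["price"],
        })
    return rows

rows = transform(raw)
print(rows[0]["notional"])  # 950.0
```

In a production pipeline the same flatten-and-derive step would typically run inside Snowflake (e.g. over VARIANT columns) or an orchestrated transformation framework rather than ad-hoc Python.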

Data Governance & Quality

  • Define and implement standards for data validation, cataloguing, and documentation.

  • Maintain high data integrity, privacy, and security aligned with FCA and GDPR requirements.

  • Monitor and optimise query performance and storage efficiency.

Cross-Functional Collaboration

  • Partner with business units (Trading, Finance, Marketing, Compliance) to capture data requirements and translate them into robust technical solutions.

  • Support regulatory, management, and operational reporting through structured data models.

Key Objectives

  • Define the core Snowflake-based architecture and data model.

  • Build foundational ETL/ELT pipelines from primary systems and third-party data sources.

  • Establish automated CI/CD workflows for data operations.

  • Deliver the first version of the Data Lakehouse to support BI and compliance analytics.

Skills & Experience

  • Proven experience designing and delivering a DWH/Delta Lakehouse using Snowflake.

  • Strong SQL and data modelling expertise (star/snowflake schemas, dimensional modelling).

  • Experience integrating SQL/NoSQL databases and external APIs.

  • Proficiency in Python or another scripting language.

  • Familiarity with orchestration and transformation frameworks.

  • Hands-on experience with, or a strong understanding of, data analysis, visualisation, and operational reporting tools is highly desirable.

  • Experience in financial services, trading, or fintech environments preferred.

  • Excellent communication skills with the ability to translate business requirements into scalable data architecture.
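
As a purely illustrative sketch of the dimensional modelling mentioned above, the snippet below joins a toy fact table to a dimension and aggregates notional value; the table and column names are invented for the example:

```python
# Toy star-schema shapes: one dimension keyed by surrogate key, one fact table.
dim_instrument = {
    1: {"symbol": "ABC", "asset_class": "Equity"},
    2: {"symbol": "XYZ", "asset_class": "FX"},
}

fact_trades = [
    {"instrument_key": 1, "qty": 100, "price": 9.5},
    {"instrument_key": 2, "qty": 50,  "price": 20.0},
    {"instrument_key": 1, "qty": 10,  "price": 10.0},
]

def notional_by_asset_class(facts, dim):
    """Join each fact row to its dimension row and sum notional per asset class."""
    totals = {}
    for f in facts:
        cls = dim[f["instrument_key"]]["asset_class"]
        totals[cls] = totals.get(cls, 0.0) + f["qty"] * f["price"]
    return totals

print(notional_by_asset_class(fact_trades, dim_instrument))
# {'Equity': 1050.0, 'FX': 1000.0}
```

In Snowflake the equivalent would be a SQL join between fact and dimension tables with a GROUP BY on the dimension attribute; the Python form is only meant to show the shape of the model.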

Qualifications

  • Degree in Computer Science, Data Engineering, or a related field.

  • 5-8 years of hands-on experience in data engineering or infrastructure development.