Summary: The Senior Data Engineer role is a hands-on contract position focused on designing and delivering a modern Snowflake-based data platform. The engineer will be responsible for building core architecture, establishing robust data pipelines, and delivering a scalable Data Lakehouse for BI and compliance analytics. This position requires quick adaptability and ownership of production-ready solutions. The contract is for 3 months with a hybrid working arrangement in London.
Salary (Rate): £500 per day
City: London
Country: United Kingdom
Working Arrangement: Hybrid
IR35 Status: Outside IR35
Seniority Level: Senior
Industry: IT
Senior Data Engineer
Contract: 3 months (rolling)
Start: ASAP
Location: London (Hybrid)
Rate: £500 per day (£400-£500), outside IR35

We are seeking an experienced Senior Data Engineer to help design and deliver a modern Snowflake-based data platform. This is a hands-on contract role focused on building core architecture, establishing robust data pipelines, and delivering the first iteration of a scalable Data Lakehouse to support BI and compliance analytics. The opportunity suits engineers who can move quickly, take ownership, and deliver production-ready solutions with minimal ramp-up time.
Key Responsibilities
- Define and implement the core Snowflake data architecture and data model
- Build and maintain ETL/ELT pipelines from internal systems and third-party data sources
- Design and establish CI/CD workflows for data engineering and analytics operations
- Deliver the initial version of a Data Lakehouse to support business intelligence and regulatory/compliance use cases
- Work closely with technical and business stakeholders to ensure scalable, reliable data solutions
- Apply best practices across performance, security, and data quality
Required Skills & Experience
- Strong hands-on experience with Snowflake / Data Warehousing / Lakehouse architectures (please clearly quantify years of experience and/or number of relevant projects)
- Advanced Python development skills for data engineering
- Experience with orchestration and transformation frameworks such as Airflow, dbt, or similar
- Proven ability to design and deliver data platforms end to end
- Comfortable working autonomously in fast-paced environments
Interviews can be arranged at short notice, and candidates who are available immediately or on short notice will be prioritised.