Summary: The Lead Data Engineer role at a Tier 1 insurance firm involves leading the transformation of data systems from legacy on-premises infrastructure to a cloud-based platform using Snowflake on Microsoft Azure. The position requires a seasoned professional with hands-on expertise in Snowflake and dbt, capable of guiding teams and influencing design decisions to create a scalable data ecosystem.
Key Responsibilities:
- Lead the design and implementation of scalable, repeatable data pipelines in Snowflake, ensuring efficient ingestion, transformation, and modeling of data.
- Build transformations, calculations, and aggregations using dbt, leveraging reusable and consistent engineering patterns.
- Establish coding standards, data governance practices, and CI/CD workflows for analytics engineering.
- Partner with architecture and platform teams to design robust data models that support downstream analytical and reporting use cases.
Key Skills:
- Proven experience designing and developing data solutions using Snowflake in a production environment.
- Strong expertise in dbt (data build tool) for transformation, modeling, and pipeline orchestration.
- Solid understanding of data warehousing, ETL/ELT design patterns, and analytics engineering principles.
- Hands-on experience with SQL and performance optimization in Snowflake.
Salary (Rate): £600 per day
City: London
Country: UK
Working Arrangements: Undetermined
IR35 Status: Outside IR35
Seniority Level: Undetermined
Industry: IT
A Tier 1 insurance firm is undertaking a strategic transformation of its data estate, migrating from legacy on-premises systems to a modern cloud-based platform built on Snowflake (hosted on Microsoft Azure).
We're seeking an experienced Lead Data Engineer to provide technical leadership, drive best practices, and establish engineering standards across agile delivery squads.
This role is ideal for a seasoned data professional who combines hands-on expertise in Snowflake and dbt with the ability to guide teams, influence design decisions, and shape a scalable, high-performance data ecosystem.
Key Responsibilities:
- Lead the design and implementation of scalable, repeatable data pipelines in Snowflake, ensuring efficient ingestion, transformation, and modeling of data.
- Build transformations, calculations, and aggregations using dbt, leveraging reusable and consistent engineering patterns.
- Establish coding standards, data governance practices, and CI/CD workflows for analytics engineering.
- Partner with architecture and platform teams to design robust data models that support downstream analytical and reporting use cases.
Essential:
- Proven experience designing and developing data solutions using Snowflake in a production environment.
- Strong expertise in dbt (data build tool) for transformation, modeling, and pipeline orchestration.
- Solid understanding of data warehousing, ETL/ELT design patterns, and analytics engineering principles.
- Hands-on experience with SQL and performance optimization in Snowflake.