Senior Snowflake Implementation Consultant - Contractor Position

Posted 1 day ago by Agent3 Group

Negotiable
Greater London, England, United Kingdom

Summary: The role of Senior Data Engineer focuses on leading the technical delivery of a Snowflake migration project, transitioning from an in-house data warehouse to Snowflake as the core infrastructure. The position requires extensive experience with Snowflake, MongoDB, BigQuery, AWS, and GCP, along with a strong background in ETL and data warehouse technologies. The engineer will collaborate with various stakeholders to design, implement, and document the migration strategy while ensuring a maintainable solution for ongoing support. This is a time-bound, delivery-focused project role within a marketing environment.

Key Responsibilities:

  • Design and implement a Snowflake environment in line with an agreed migration strategy.
  • Plan how to map nested JSON documents into Snowflake, documenting trade-offs and recommendations.
  • Execute the migration of historical marketing data from AWS/Mongo to Snowflake.
  • Build the connection between the Seer frontend and Snowflake, adapting the existing Seer Data API.
  • Implement a security model that maps Seer’s user management system to Snowflake’s security policies.
  • Review and migrate ELT pipelines to ingest data from 3rd-party marketing APIs into Snowflake.
  • Contribute to and implement development standards for CI/CD and version control.
  • Design, develop, and maintain scalable data pipelines and ETL processes.
  • Implement data quality checks and monitoring frameworks, ensuring knowledge transfer to the internal team.
  • Guide the project implementation phases, including discovery, migration, BI integration, and documentation.
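One of the responsibilities above, mapping nested JSON documents into Snowflake, involves a choice between keeping documents as VARIANT columns and normalising them into flat relational tables. As a hedged illustration of the normalised option only (the document shape below is hypothetical, not taken from the actual Seer data), a pre-load flattening step might look like:

```python
# Illustrative sketch: one way to normalise a nested JSON document into flat
# column/value pairs before loading into relational Snowflake tables.
# The "campaign" document shape is a made-up example, not real Seer data.

def flatten(doc, parent_key="", sep="_"):
    """Recursively flatten a nested dict into a single-level dict,
    joining nested keys with `sep` (e.g. metrics.spend.amount ->
    metrics_spend_amount)."""
    items = {}
    for key, value in doc.items():
        new_key = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            items.update(flatten(value, new_key, sep))
        else:
            items[new_key] = value
    return items

campaign = {
    "id": "c-001",
    "metrics": {"clicks": 120, "spend": {"amount": 54.2, "currency": "GBP"}},
}
row = flatten(campaign)
# row == {"id": "c-001", "metrics_clicks": 120,
#         "metrics_spend_amount": 54.2, "metrics_spend_currency": "GBP"}
```

The trade-off the posting asks the consultant to document is exactly this: flattened columns are fast to query and easy to cluster, while a VARIANT column preserves schema flexibility when source documents evolve.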

Key Skills:

  • Advanced Snowflake qualifications.
  • Extensive experience in data engineering or analytics engineering, preferably in marketing analytics.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or Statistics.
  • Strong programming skills in Python and SQL, with familiarity in Java or Scala.
  • Proficiency in ETL tools and frameworks like Apache Airflow and dbt.
  • Experience with data visualization tools such as Tableau and Power BI.
  • Familiarity with cloud platforms like AWS and Google Cloud.
  • Solid understanding of database design principles and query optimization techniques.
  • Knowledge of version control systems and experience with CI/CD pipelines.
  • Excellent problem-solving and communication skills.

Salary (Rate): undetermined

City: Greater London

Country: United Kingdom

Working Arrangements: undetermined

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

We are seeking a Senior Data Engineer with extensive infrastructure implementation experience, particularly with Snowflake and connected applications, to lead technical delivery of our Snowflake migration project: moving from an in-house developed data warehouse to Snowflake as our core data infrastructure, supporting multiple client and first-party (1P) integrations as well as our own reporting and visualisation solution, Seer. You will hold current advanced Snowflake qualifications, have a track record of successful implementations and migrations, and bring experience with MongoDB, BigQuery, AWS and GCP, as well as ETL and data warehouse technologies. Experience working with advertising and marketing platform data would be a considerable advantage.

In this role, you will leverage your knowledge of Snowflake and its application in a marketing environment to interpret, advise on, and plan infrastructure implementation requirements; provide guidance and recommendations; and develop a comprehensive, documented implementation plan that you will then work with the team to deliver. You will work with our Senior Director - Infrastructure and Delivery, the senior engineer who developed much of the existing infrastructure, and our team of data and software engineers to drive the migration and ensure the team is left with a well-documented, maintainable solution they can support and manage on an ongoing basis. This is a focused, delivery-oriented, time-bound project role; the long-term platform strategy, operating model, and product roadmap will remain with the Senior Director of Infrastructure & Delivery.

Key Responsibilities:

  • Architecture & Migration (MongoDB/AWS/BigQuery to Snowflake):
      ◦ Design & Strategy: Design and implement a Snowflake environment in line with an agreed migration strategy. Plan how to map nested JSON documents into Snowflake (VARIANT columns vs. normalized tables) to balance query performance with schema flexibility, documenting trade-offs and recommendations for internal review.
      ◦ Solution presentation and collaboration: During the planning stages, regularly present architectural decisions and trade-offs to key stakeholders, allowing for input and shared decision-making on key points.
      ◦ Infrastructure as Code: Set up Snowflake warehouses, roles, and security policies based on defined security and governance standards, using Terraform or similar tools.
      ◦ Data Migration: Execute the migration of historical marketing data (TB scale) from AWS/Mongo to Snowflake, ensuring high data parity so both platforms can run concurrently, with validation criteria agreed with stakeholders.
  • Application Integration (the "Seer" Connection):
      ◦ Backend Engineering: Build the "connective tissue" between the Seer frontend and Snowflake. Adapt and extend the existing Seer Data API (FastAPI) to support Snowflake as a backend, alongside or in place of MongoDB where required (with support from the API's developer); this includes updating query patterns, performance optimisation, and ensuring backward compatibility during migration. Optimise Snowflake query performance (clustering, materialized views) to ensure fast (cached) response times for interactive user dashboards.
      ◦ Security Bridge: In collaboration with the platform owner, implement a security model that maps Seer’s existing user management system to Snowflake’s Row-Level Security (RLS) or Dynamic Data Masking policies, ensuring clients only see their own data.
  • Data Engineering & Standards:
      ◦ Pipeline Development: Review and migrate ELT pipelines (using dbt, Airflow, or, where needed, Fivetran) to ingest data from 3rd-party marketing APIs (LinkedIn, Google Ads, CRM) into the new Snowflake backend. Assess and implement Snowflake Openflow or direct data-sharing options where available, documenting impacts on existing Fivetran and custom connector usage.
      ◦ Best Practices: Contribute to and implement agreed development standards for CI/CD and version control (Git). Orchestration will initially remain within the existing Seer backend unless otherwise agreed; evaluation of alternative orchestrators (e.g. Dagster) may be documented but is not a primary delivery objective.
      ◦ Integration consultancy, guidance, and establishing best practice: Design, develop, and maintain scalable data pipelines and ETL processes to ensure timely and accurate data availability. Implement data quality checks, validation, and monitoring frameworks, with handover to the internal team. Develop and optimise data storage to support analytics and reporting needs. Drive automation of the most common recurring data tasks and reporting processes to improve team productivity and minimise manual intervention. Implement security and lifecycle controls in line with existing compliance and governance frameworks. Support and train the team on key implementation specifics to ensure knowledge transfer, allowing them to support Snowflake and related infrastructure with confidence.
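The Security Bridge responsibility, mapping Seer's user management onto Snowflake row-level security, could be approached in several ways. One hedged sketch, assuming a simple one-role-per-client model (role and policy names below are hypothetical, and the real design may instead use Dynamic Data Masking or session tags), is to generate the row access policy DDL from the existing user-to-client mapping:

```python
# Hedged sketch: generate Snowflake ROW ACCESS POLICY DDL from a
# role-to-client mapping. Names are illustrative, not the actual Seer model.

def row_access_policy_ddl(policy_name, role_to_client):
    """Build CREATE ROW ACCESS POLICY DDL that lets each role see only
    rows whose client_id matches its assigned client."""
    cases = "\n    ".join(
        f"WHEN CURRENT_ROLE() = '{role}' THEN client_id = '{client}'"
        for role, client in sorted(role_to_client.items())
    )
    return (
        f"CREATE OR REPLACE ROW ACCESS POLICY {policy_name}\n"
        f"AS (client_id STRING) RETURNS BOOLEAN ->\n"
        f"  CASE\n    {cases}\n    ELSE FALSE\n  END;"
    )

ddl = row_access_policy_ddl(
    "seer_client_policy",
    {"CLIENT_A_ROLE": "client_a", "CLIENT_B_ROLE": "client_b"},
)
print(ddl)
```

Generating the policy from the source-of-truth mapping (rather than hand-writing it) keeps Seer's user model and Snowflake's enforcement in sync, which is the point of the "security bridge" in the responsibility above.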

Project plan overview (subject to confirmation): A key part of your role will be to guide us on the finer points of the project implementation; however, we envisage a 5-7 month engagement:

  • Months 1-2 (Discovery & Foundation): Auditing data, setting up Snowflake environments, building connection structures and workflow, including evaluation of Snowflake-native ETL (vs. custom connectors, Fivetran, etc.). Establishing naming conventions.
  • Months 3-4 (Migration & Modelling): Moving historical data and writing dbt models to clean and optimise data.
  • Month 5 (BI Integration): Connecting front-end tools, Seer and dashboards (Tableau/Power BI); testing data accuracy with stakeholders, validating with frontend users, and optimising.
  • Month 6 (Handoff & Documentation): Training internal teams, documenting the code, and monitoring and addressing any performance issues.
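The data-parity requirement in the migration and validation phases implies some mechanical check that source and target agree. A minimal sketch of such a check, comparing row counts and per-row checksums, is shown below; a real implementation would query MongoDB and Snowflake, whereas here both sides are stand-in lists of dicts:

```python
# Minimal parity-check sketch: compare per-row fingerprints between a source
# extract and a target extract. Stand-in data, not real pipeline code.

import hashlib
import json

def row_fingerprints(rows, key="id"):
    """Map each row's key to a stable hash of its sorted JSON serialisation."""
    return {
        r[key]: hashlib.sha256(
            json.dumps(r, sort_keys=True, default=str).encode()
        ).hexdigest()
        for r in rows
    }

def parity_report(source_rows, target_rows):
    """Report rows missing, extra, or changed between source and target."""
    src = row_fingerprints(source_rows)
    tgt = row_fingerprints(target_rows)
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k]),
    }

source = [{"id": 1, "clicks": 10}, {"id": 2, "clicks": 5}]
target = [{"id": 1, "clicks": 10}, {"id": 2, "clicks": 7}]
report = parity_report(source, target)
# report == {"missing_in_target": [], "extra_in_target": [], "mismatched": [2]}
```

A report like this, run over agreed sample windows, is one way to give stakeholders the concrete validation evidence the plan calls for while both platforms run concurrently.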

Qualifications:

  • Snowflake qualifications in relevant “Advanced” courses.
  • Extensive experience in a data engineering or analytics engineering role, with exposure to marketing analytics strongly preferred.
  • Bachelor’s or Master’s degree in Computer Science, Data Engineering, or Statistics.
  • Strong programming skills, particularly in Python and SQL, with familiarity with other languages like Java or Scala.
  • Proficiency in ETL tools and frameworks (e.g., Apache Airflow, dbt) and data integration techniques.
  • Experience with data visualisation tools (e.g., Tableau, Power BI) and an understanding of how to enable analysts through clean and usable data.
  • Familiarity with cloud-based platforms (e.g., AWS, Google Cloud, Azure) and data warehouse technologies (e.g., Snowflake, Redshift, BigQuery).
  • Solid understanding of database design principles and query optimisation techniques.
  • Knowledge of version control systems (e.g., Git) and experience with CI/CD pipelines.
  • Excellent problem-solving capabilities, with a focus on building scalable, reliable solutions.
  • Strong communication skills to effectively collaborate with technical and non-technical stakeholders.

Budget and application: This project will have a fixed budget and timeline, agreed with the consultant. We would welcome a cost estimate based on the details and assumptions set out in this document. Shortlisted applicants will have the opportunity to discuss the plan in more detail and to revise the plan, timescales, and budget.