Data Architect

Posted 1 week ago by Project Recruit

Negotiable
Undetermined
Hybrid
London, UK

Summary: The Data Architect role involves designing and evolving data architecture to meet business and engineering needs for a leading global IT services supplier. This hybrid position requires attendance at the London office 3-4 days a week and is a temporary contract lasting over 6 months. The role emphasizes cloud-native data solutions on AWS and collaboration with various stakeholders. Candidates should possess extensive experience in data architecture and engineering, particularly with Snowflake and AWS services.

Key Responsibilities:

  • Design and evolve data architecture aligned with business, analytical, and engineering needs.
  • Lead end-to-end architecture for cloud-native data solutions on AWS.
  • Define standards, patterns, and best practices for data modelling, data flows, and platform usage.
  • Develop scalable Snowflake-based data architectures, including warehouses, databases, schemas, roles, and secure data sharing.
  • Architect and optimize ETL/ELT pipelines using services such as dbt, AWS Glue, Lambda, Kinesis, S3, and EMR.
  • Design high-performance Snowflake data models (3NF, dimensional, Data Vault, etc.).
  • Optimize query performance, clustering, micro-partitioning, and resource consumption.
  • Work closely with Data Engineers, Analysts, Product Managers, and Business Stakeholders.
  • Provide architectural guidance and mentorship across data and engineering teams.
  • Translate business needs into scalable data solutions.

Key Skills:

  • 6+ years in Data Architecture, Data Engineering, or related fields, with a remit similar to that described above.
  • 2+ years' experience working as an independent contractor.
  • Strong grasp of data modelling, data warehousing concepts, and performance optimization techniques.
  • Hands-on expertise with Snowflake (architecture, performance tuning, security, Snowpipe, streams/tasks).
  • Strong experience with the AWS cloud data ecosystem, ideally including S3, Glue, Lambda, Redshift, EMR, Kinesis, IAM, and CloudWatch.
  • Strong SQL skills and proficiency in Python and dbt.
  • Understanding of data governance frameworks; experience with tools such as Collibra or Alation is a plus.
  • Hands-on exposure to cloud platforms, especially AWS.
  • Experience working in agile teams.

Salary (Rate): undetermined

City: London

Country: UK

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Data Architect

Our client, a leading global supplier of IT services, requires a Data Architect to be based at their client's office in London, UK.

This is a hybrid role: you can work remotely in the UK and will attend the London office 3-4 days per week.

This is a 6+ month temporary contract, starting ASAP.

Day rate: competitive market rate

Due to the volume of applications received, unfortunately we cannot respond to everyone.

If you do not hear back from us within 7 days of sending your application, please assume that you have not been successful on this occasion.