Senior GCP Data Engineer (Outside IR35 | Contract)

Posted Today by Casamentero Consulting LLP

Rate: Negotiable
IR35 Status: Outside
Working Arrangements: Undetermined
Location: London Area, United Kingdom

Summary: The Senior GCP Data Engineer will design and implement scalable data platforms on Google Cloud for a prominent banking and financial services client. The role centres on building batch and real-time data pipelines with tools such as BigQuery, Dataflow, and Kafka, ensuring data solutions are performant, secure, and cost-optimized, and collaborating on data modelling and architecture. The position is contract-based with potential for extension.

Key Responsibilities:

  • Design and build batch & real-time data pipelines on GCP
  • Work with BigQuery, Dataflow, Cloud Storage, Spanner
  • Develop real-time streaming solutions (Kafka / Apache Beam)
  • Implement Infrastructure as Code using Terraform
  • Deploy workloads on Kubernetes (GKE)
  • Build and manage CI/CD pipelines (Jenkins / Spinnaker)
  • Collaborate on data modelling and architecture
  • Ensure systems are secure, scalable, and cost-optimized

Key Skills:

  • 6+ years of overall experience in Data Engineering
  • 3+ years of hands-on experience in Google Cloud Platform (GCP)
  • Strong expertise in: BigQuery, Dataflow / Apache Beam, Kafka / Spark / Big Data tools
  • Proficiency in Python / Java / SQL
  • Experience with: Terraform, Kubernetes, CI/CD pipelines
  • Strong understanding of data modelling
  • Nice to Have: DBT experience, Banking/Financial Services domain experience, GCP Certification

Salary (Rate): 300-350 GBP/Day

City: London

Country: United Kingdom

Working Arrangements: Undetermined

IR35 Status: Outside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

We are hiring a Senior GCP Data Engineer to design and build scalable, high-performance data platforms on Google Cloud for a leading banking/financial services client. This role involves working on large-scale data systems, building both batch and real-time pipelines, and delivering enterprise-grade data solutions.

Additional Details

  • Notice Period: Immediate to 1 month
  • Location: London/Chester, UK
  • Employment type: Contract - 6 Months & Extendable
  • Client: Bank/Financial services (Client details will be shared when the interview is scheduled)
  • Rate: 300-350 GBP/Day (Outside IR35)
  • Visa: Valid UK work permit required

"If you're interested or available, please apply or share your updated CV."