GCP Data Engineer

Posted 1 day ago by Bodhi

£63 Per hour
Inside IR35
Remote
Surrey, England, United Kingdom

Summary: The GCP Data Engineer role involves designing, implementing, and optimizing data solutions within the Google Cloud Platform ecosystem. The position requires collaboration with cross-functional teams to ensure effective data integration and governance, while also focusing on performance tuning and cloud infrastructure management. The successful candidate will work within a newly established CDM Operations Team, supporting marketing activities across Europe. This is a 12-month contract position with a pay rate of £480–£510 per day, classified as inside IR35.

Key Responsibilities:

  • Design scalable, efficient data solutions using BigQuery and other GCP tools.
  • Build, maintain, and optimize ETL/ELT pipelines using Dataflow, Apache Beam, and Cloud Composer.
  • Optimize BigQuery queries and storage structures for performance and cost efficiency.
  • Configure and manage GCP services such as Cloud Storage, Pub/Sub, and IAM.
  • Implement data quality and governance frameworks.
  • Collaborate with data analysts, engineers, and business stakeholders.

Key Skills:

  • Exceptional English communication skills.
  • Strong proficiency in BigQuery and SQL.
  • Hands-on experience with GCP services like Cloud Storage and Dataflow.
  • Familiarity with data pipeline frameworks such as Apache Beam.
  • Strong programming skills in Python or Java.
  • Proven experience designing and managing cloud-based data solutions.
  • Excellent analytical and problem-solving abilities.
  • Google Cloud Certified: Professional Data Engineer or Associate Cloud Engineer.

Salary (Rate): £60.00/hr

City: Surrey

Country: United Kingdom

Working Arrangements: remote

IR35 Status: inside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

JOB TITLE: GCP Engineer

LOCATION: UK remote, with occasional office visits to the Surrey office

PAY RATE: £480–£510 per day, inside IR35, 12-month contract

SUBSIDIARY / DEPARTMENT OVERVIEW: The organisation is a globally recognised leader in technology and innovation, delivering advanced digital products and solutions used by millions of people worldwide. With a strong focus on cutting-edge technologies and continuous improvement, the company drives digital transformation across multiple markets. The global software solutions and IT services division plays a key role in delivering enterprise-scale digital capabilities. This position sits within a newly established CDM Operations Team, supporting marketing activities across more than 20 European countries.

PURPOSE OF THE JOB: The organisation is seeking a skilled Google Cloud Data Engineer to design, implement, and optimise data solutions within the Google Cloud Platform (GCP) ecosystem. The successful candidate will collaborate with cross-functional teams to ensure effective data integration, governance, and analytics capabilities.

KEY ACCOUNTABILITIES

  • Data Architecture & Solution Design
      • Design scalable, efficient data solutions using BigQuery and other GCP tools to support business intelligence and analytics requirements.
      • Work closely with stakeholders to gather data requirements and translate them into technical designs.
  • Data Integration & Pipelines
      • Build, maintain, and optimise ETL/ELT pipelines using tools such as Dataflow, Apache Beam, and Cloud Composer.
      • Integrate multiple data sources, including APIs, relational databases, and streaming platforms, into BigQuery.
  • BigQuery Optimisation & Performance Tuning
      • Optimise BigQuery queries and storage structures to ensure high performance and cost efficiency.
      • Implement partitioning and clustering strategies to enhance query performance.
  • Cloud Infrastructure Management
      • Configure and manage GCP services such as Cloud Storage, Pub/Sub, and IAM to ensure secure and reliable data operations.
      • Apply best practices in cloud security and compliance.
  • Data Governance & Quality
      • Implement data quality and governance frameworks to ensure accuracy, consistency, and availability of data.
      • Establish monitoring and alerting mechanisms for pipelines and systems to proactively prevent and resolve issues.
  • Collaboration & Support
      • Partner with data analysts, engineers, and business stakeholders to enable efficient data processing.
      • Provide technical guidance and support to team members.
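To illustrate the partitioning and clustering work described above, a minimal BigQuery DDL sketch follows. The dataset, table, and column names are hypothetical and chosen only to show the shape of the technique, not any actual schema used by the organisation:

```sql
-- Hypothetical marketing-events table: names are illustrative only.
CREATE TABLE marketing_ops.events
(
  event_ts    TIMESTAMP,
  country     STRING,
  campaign_id STRING,
  payload     JSON
)
PARTITION BY DATE(event_ts)       -- daily partitions let date filters prune scanned bytes
CLUSTER BY country, campaign_id;  -- co-locates rows on commonly filtered columns

-- A query like this only scans the partitions and blocks it needs:
SELECT campaign_id, COUNT(*) AS events
FROM marketing_ops.events
WHERE event_ts >= TIMESTAMP('2025-01-01')
  AND country = 'GB'
GROUP BY campaign_id;
```

Partitioning on the timestamp column plus clustering on high-selectivity filter columns is a standard combination for reducing both query latency and on-demand query cost in BigQuery.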

KEY LIAISONS

  • Data Engineering Team
  • Adobe Team
  • European Regional Office
  • Headquarters

DIMENSIONS

Maintain strong working relationships with all key stakeholders. Support and align activities with both marketing and operations teams.

SKILLS AND EXPERIENCE

Essential Language Skills

  • Exceptional English communication skills, as the role involves collaboration with global teams.

Technical Expertise

  • Strong proficiency in BigQuery and SQL, including data modelling and query optimisation.
  • Hands-on experience with GCP services such as Cloud Storage, Cloud Composer, Dataflow, and Pub/Sub.
  • Familiarity with data pipeline frameworks such as Apache Beam and Airflow.
  • Strong programming skills in Python or Java for data processing and scripting.
  • Knowledge of shell scripting and cloud automation.
  • Proven experience designing and managing cloud-based data solutions.
  • Strong background in developing and maintaining ETL/ELT pipelines.
  • Demonstrated ability to optimise BigQuery performance and manage cloud costs effectively.
  • Experience implementing partitioning, clustering, and materialised views.
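As a sketch of the materialised-view experience listed above, the following BigQuery DDL precomputes a daily aggregate; the view and table names are hypothetical and purely illustrative:

```sql
-- Hypothetical materialised view over an events table; names are illustrative only.
-- BigQuery maintains it incrementally and can transparently rewrite
-- matching aggregate queries to read from it instead of the base table.
CREATE MATERIALIZED VIEW marketing_ops.daily_campaign_totals AS
SELECT
  DATE(event_ts) AS event_date,
  campaign_id,
  COUNT(*)       AS events
FROM marketing_ops.events
GROUP BY event_date, campaign_id;
```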

Soft Skills

  • Excellent analytical and problem-solving abilities.
  • Strong communication skills, with the ability to explain technical concepts to non-technical stakeholders.
  • Ability to work collaboratively in a fast-paced, evolving environment.

Certifications

  • Google Cloud Certified: Professional Data Engineer or Associate Cloud Engineer.

Desired

  • Experience with Amazon Redshift for managing and optimising data warehouse solutions across multi-cloud environments.
  • Experience with Microsoft Azure tools, particularly Azure Data Factory (ADF).

CHALLENGE: The organisation operates within a fast-paced and evolving environment where processes and procedures frequently change. The successful candidate must stay up to date with technological developments and assess their potential business impact.

Note: This job description outlines the primary responsibilities of the role but does not represent an exhaustive list of duties. It is intended to clarify expectations between the Manager and the employee and may be amended in line with evolving business requirements.