Summary: The role of GCP Data Engineer involves designing, developing, and maintaining data pipelines using Google Cloud Platform services, particularly Cloud Composer and Airflow. The ideal candidate will have a strong background in data transformation using DBT, SQL, and Python, and will work collaboratively with data teams to ensure data quality and accessibility. This position is suited for individuals who excel in fast-paced environments and are passionate about building scalable data solutions.
Salary (Rate): Negotiable
City: Cambridge
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
About the Role: We are seeking a skilled and motivated Data Engineer with strong expertise in Google Cloud Platform (GCP) services, including Cloud Composer and Airflow, as well as proficiency in DBT (SQL & Python) and GitLab. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable data pipelines and solutions.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines using GCP services such as Cloud Composer and Airflow (an illustrative sketch follows this list).
- Implement data transformation workflows using DBT with SQL and Python.
- Collaborate with data analysts, data scientists, and other engineers to ensure data quality and accessibility.
- Manage version control and CI/CD pipelines using GitLab.
- Optimize data workflows for performance, scalability, and reliability.
- Monitor and troubleshoot data pipeline issues and ensure timely resolution.
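For illustration only, here is a minimal sketch of the kind of pipeline these responsibilities describe: an Airflow DAG, deployable to Cloud Composer, that runs DBT transformations and then tests them. The DAG id, schedule, and dbt project path are hypothetical placeholders, not details taken from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical example: orchestrating dbt from a Cloud Composer (Airflow) DAG.
# The dag_id, schedule, and --project-dir path are placeholders.
with DAG(
    dag_id="daily_dbt_transform",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Build the dbt models (Composer mounts the DAGs bucket under /home/airflow/gcs).
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt",
    )

    # Validate the output with dbt tests before downstream consumers read it.
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /home/airflow/gcs/dags/dbt",
    )

    dbt_run >> dbt_test
```

In practice the dbt steps might instead run via a containerized operator or a dedicated dbt image; the BashOperator version above is simply the smallest working illustration.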
Required Skills & Qualifications:
- 5–7 years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with GCP services, especially Cloud Composer and Airflow.
- Proficiency in DBT, SQL, and Python for data transformation and scripting (a brief sketch follows this list).
- Experience with GitLab for version control and CI/CD.
- Solid understanding of data architecture, ETL/ELT processes, and data warehousing concepts.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
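As a brief sketch of the "Python for scripting" side of these skills: dbt-core (version 1.5 or later) can be invoked programmatically from Python rather than only from the shell. The model selector and target name below are hypothetical placeholders.

```python
from dbt.cli.main import dbtRunner

# Hypothetical example: invoking dbt from a Python script (requires dbt-core >= 1.5).
# The --select and --target values are placeholders.
runner = dbtRunner()
result = runner.invoke(["run", "--select", "staging", "--target", "prod"])

# Surface failures to the calling process instead of silently continuing.
if not result.success:
    raise RuntimeError(f"dbt run failed: {result.exception}")
```

Many teams simply shell out to the dbt CLI instead; the programmatic runner is convenient when run results need to be inspected in the same process.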
Preferred Qualifications:
- GCP certification(s) in data engineering or related areas.
- Experience working in agile development environments.
- Familiarity with other cloud platforms or data tools is a plus.