Summary: We are hiring a Data Engineer based in Cambridge (hybrid, three days per week in the office) for a 4-month contract with the possibility of extension. The ideal candidate will have 5–7 years of experience and strong expertise in Google Cloud Platform services, particularly Cloud Composer and Airflow, along with proficiency in DBT, SQL, and Python. The role focuses on building scalable data pipelines and collaborating with analysts, scientists, and fellow engineers to ensure data quality and accessibility.
Salary (Rate): Negotiable
City: Cambridge
Country: United Kingdom
Working Arrangements: Hybrid
IR35 Status: Undetermined
Seniority Level: Undetermined
Industry: IT
Greetings! We are hiring.

Job Title: Data Engineer
Location: Cambridge (3 days per week in the office)
Experience: 5–7 years
Duration: 4-month contract with possible extension

About the Role: We are seeking a skilled and motivated Data Engineer with strong expertise in Google Cloud Platform (GCP) services, including Cloud Composer and Airflow, as well as proficiency in DBT (SQL & Python) and GitLab. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable data pipelines and solutions.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines using GCP services such as Cloud Composer and Airflow (a brief sketch of such a pipeline follows this list).
- Implement data transformation workflows using DBT with SQL and Python.
- Collaborate with data analysts, scientists, and other engineers to ensure data quality and accessibility.
- Manage version control and CI/CD pipelines using GitLab.
- Optimize data workflows for performance, scalability, and reliability.
- Monitor and troubleshoot data pipeline issues and ensure timely resolution.
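To give a concrete flavour of the first two responsibilities, below is a minimal sketch of an Airflow DAG of the kind that would run on Cloud Composer: a Python extract task followed by a DBT run. Every name in it (the DAG id, the extract callable, the project path) is a hypothetical placeholder rather than a detail of this client's stack.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders() -> None:
    # Hypothetical extract step: in a real pipeline this might pull from an
    # API or copy files into a GCS landing bucket / BigQuery staging table.
    print("extracting raw orders...")


with DAG(
    dag_id="orders_pipeline",          # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+ keyword argument
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_orders",
        python_callable=extract_orders,
    )

    # Trigger the DBT project that transforms the landed data.
    # The project path is a placeholder for wherever the repo is deployed.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /home/airflow/gcs/dags/dbt_project",
    )

    extract >> dbt_run
```

On Cloud Composer, deploying a DAG like this amounts to uploading the file to the environment's GCS dags/ folder, which Composer syncs automatically.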
Required Skills & Qualifications:
- 5–7 years of experience as a Data Engineer or in a similar role.
- Strong hands-on experience with GCP services, especially Cloud Composer and Airflow.
- Proficiency in DBT, SQL, and Python for data transformation and scripting (see the DBT example after this list).
- Experience with GitLab for version control and CI/CD.
- Solid understanding of data architecture, ETL/ELT processes, and data warehousing concepts.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
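To illustrate the "DBT with SQL and Python" requirement, here is a minimal sketch of a dbt Python model. The model, source, and column names are hypothetical, and it assumes an adapter that supports Python models (on BigQuery these run via Dataproc, so the DataFrame API is Spark's):

```python
# models/marts/completed_orders.py -- hypothetical model name

def model(dbt, session):
    # Materialise the result as a table in the warehouse.
    dbt.config(materialized="table")

    # dbt resolves the upstream dependency (a hypothetical staging model)
    # and hands it back as a DataFrame.
    orders = dbt.ref("stg_orders")

    # Assumed column: keep completed orders only.
    return orders.filter(orders.status == "completed")
```

In practice most transformations would live in plain SQL models, with Python models reserved for logic that is awkward to express in SQL.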
Preferred Qualifications:
- GCP certification(s) in data engineering or related areas.
- Experience working in agile development environments.
- Familiarity with other cloud platforms or data tools is a plus.