Summary: The Senior Data Engineer (GCP) role involves leading the design, development, and optimisation of cloud-based data infrastructure, with a focus on security, scalability, and reliability. The position requires mentoring junior engineers and collaborating with stakeholders across the business to enable data-driven decision-making. The successful candidate will take technical ownership of key data engineering initiatives and implement best practices in data architecture. The role is hybrid and classified as outside IR35.
Salary (Rate): Negotiable
City: London
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: outside IR35
Seniority Level: Senior
Industry: IT
Job Ref: 843 | Senior Data Engineer (GCP) | Hybrid | Outside IR35. Apply via LinkedIn or email your CV to: HR@AGITCONSULTANCY.CO.UK
About the Role
We are looking for an experienced Senior Data Engineer to lead the design, development, and optimisation of our cloud-based data infrastructure. This role will take technical ownership of key data engineering initiatives, mentor junior team members, and ensure the security, scalability, and reliability of our data platform. Working closely with engineering, analytics, and security stakeholders, you will play a critical role in enabling data-driven decision-making across the business.
Key Duties, Responsibilities & Accountabilities
- Data Architecture & Engineering Leadership
- Lead the design and implementation of robust, secure, and scalable data pipelines using ELT/ETL patterns.
- Develop and promote best practices around modular pipeline design, orchestration, and data transformations.
- Serve as a technical mentor to other data engineers and support peer reviews and architecture discussions.
- Advanced Pipeline Design & Optimisation
- Architect real-time, batch, and micro-batch data workflows for business-critical use cases.
- Leverage modern orchestration tools (e.g. Airflow, Azure Data Factory, dbt Cloud) for scheduling, observability, and lineage tracking.
- Monitor and fine-tune performance of pipelines, queries, and storage systems to ensure efficiency and cost control.
- Cloud Infrastructure & Platform Ownership
- Take ownership of data platform infrastructure across Azure (Data Lake, Synapse, Event Hubs, etc.) and/or hybrid environments.
- Implement CI/CD workflows, DevOps practices, and infrastructure-as-code (Terraform, Bicep) for data pipeline deployments.
- Drive initiatives for high availability, disaster recovery, and fault-tolerant design.
- Data Governance, Security & Privacy
- Embed data privacy and security-by-design into all engineering workflows, including data masking, encryption, and access controls.
- Collaborate with Information Security teams to ensure compliance with GDPR and internal data handling policies.
- Lead efforts in data lineage, classification, and audit logging for sensitive data assets.
- Integration & Interoperability
- Design and maintain integrations with ERP systems (e.g., Infor M3, ION), planning tools, and external data sources via APIs or SFTP.
- Define reusable data ingestion frameworks for structured and semi-structured formats (JSON, XML, CSV, Avro, Parquet).
- Data Quality, Testing & Observability
- Implement data validation, testing, and anomaly detection frameworks using tools like dbt tests, Great Expectations, or custom solutions.
- Ensure clear data lineage and documentation from source to consumption.
- Set up observability dashboards and alerts to ensure data pipeline reliability and transparency.
- Collaboration & Delivery
- Partner with analysts, scientists, and stakeholders to translate business needs into scalable data solutions.
- Participate in project planning, estimation, and agile delivery across the data roadmap.
- Document architecture decisions, data dictionaries, and engineering standards.
Knowledge, Skills and Experience
Essential Experience & Skills
- 7+ years in data engineering, including experience leading technical delivery in enterprise environments.
- Expert-level knowledge of SQL and Python for data transformation and automation.
- Strong experience with Azure data services (Data Factory, Synapse, Event Hubs, ADLS).
- Proven experience building and optimising data lakehouse architectures (e.g., Delta Lake, Databricks, Snowflake).
- Hands-on experience with orchestration tools (Airflow, ADF) and data modelling (dimensional/star/snowflake).
- Familiarity with REST/SOAP APIs and event streaming platforms (e.g., Kafka, Azure Event Hubs).
- Awareness of security, data protection, and compliance requirements (GDPR, data encryption, IAM).
- Experience with CI/CD pipelines, version control, and DevOps principles.
Desirable Experience & Skills
- Experience working with manufacturing or FMCG systems and data (ERP, MES, TPM).
- Familiarity with Microsoft Purview or data cataloguing solutions.
- Exposure to Power BI, Tableau or other reporting tools.
- Certifications in Azure Data Engineering (e.g., DP-203), dbt, or cloud architecture.
Equal Opportunity Statement
AGIT Consultancy is an equal opportunity employer and does not discriminate on the basis of race, colour, religion, sex, age, national origin, veteran status, disability, sexual orientation/gender identity, or any other characteristic protected by applicable law.