Summary: The role of Senior Data Engineer requires over 7 years of experience in data engineering, with a focus on architecting and optimising data pipelines on AWS. The position involves mentoring junior engineers and implementing best practices in a collaborative development environment. The candidate will work with a range of AWS services and tools to deliver scalable data solutions. This is a hybrid role based in Manchester, requiring two days of on-site work per week.
Salary (Rate): £358/day
City: Manchester
Country: UK
Working Arrangements: hybrid
IR35 Status: inside IR35
Seniority Level: Senior
Industry: IT
We are a global recruitment specialist supporting clients across EMEA, APAC, the US, and Canada. We have an excellent job opportunity for you.
Role Title: Senior Data Engineer
Location: Manchester (hybrid, 2 days per week on site)
Contract end date: 31/12/2025
Role Description:
Senior Data Engineer (7+ Years' Experience)
Join our data engineering team as a Senior Data Engineer, bringing 7+ years of experience to architect, deliver, and optimise modern data pipelines on AWS. You will be instrumental in developing scalable solutions, driving best practices, and mentoring junior engineers within a collaborative trunk-based development environment.
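To give a flavour of the day-to-day work, here is a minimal, illustrative PySpark sketch of the kind of pipeline step the role involves; the bucket, dataset, and column names are hypothetical placeholders, not references to our actual estate.

    # Illustrative only: conform a raw extract landed in S3 and publish a
    # curated dataset that a downstream Redshift COPY could load.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_load").getOrCreate()

    # Raw extract as landed by a replication tool (e.g. Qlik Replicate).
    raw = spark.read.parquet("s3://example-landing-bucket/orders/")

    # Light conformance: typed columns and de-duplication on the business key.
    conformed = (
        raw.withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .dropDuplicates(["order_id"])
    )

    # Publish for downstream consumption (Redshift COPY, Spectrum, etc.).
    conformed.write.mode("overwrite").parquet("s3://example-curated-bucket/orders/")

    spark.stop()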
Required Skills & Experience
7+ years hands-on experience in Python and PySpark for data engineering applications
Deep expertise in AWS services, especially Redshift, Glue, Lambda, Step Functions
Proficient in building CI/CD workflows using GitHub Actions
Orchestration experience with Airflow (or similar tools such as Control-M)
Strong command of SQL, including performance optimisation and data modelling
Solid understanding of infrastructure as code, preferably with Terraform
Experience with Docker for development and testing environments
Knowledge of Oracle and Qlik Replicate/Compose for source data systems
Familiarity with RBAC models and secrets management (GitHub Secret Manager)
Adherence to coding standards such as PEP8; proven record of clean, maintainable code
Unit testing experience using pytest
Experience in collaborative ways of working, including pair programming and peer reviews
Comfortable with Visual Studio Code as primary IDE
Ways of Working
Drive trunk-based development: collaborate in a single shared branch ('main') for frequent, small integrations
Champion short-lived feature branches and rapid merge cycles (1-2 days)
Lead or participate in pair programming sessions to share knowledge and improve code quality
Enforce daily integration, automated testing, and code reviews prior to merging (a minimal pytest sketch follows this list)
Promote disciplined use of automated CI/CD workflows and maintain high standards for code, documentation, and testing
Support manual gated approvals for Pre-Prod and Prod deployments until automated confidence is achieved
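As an illustration of the merge gate above, a unit test of this shape would run in CI (via pytest) before any change reaches 'main'; the function under test is a hypothetical example, not part of our codebase.

    # Hypothetical pure function and its pytest unit test; tests like this
    # gate merges to 'main' in the automated CI workflow.
    import pytest

    def normalise_postcode(raw: str) -> str:
        """Uppercase a UK postcode and collapse internal whitespace."""
        return " ".join(raw.upper().split())

    @pytest.mark.parametrize(
        "raw, expected",
        [
            ("m1  1ae", "M1 1AE"),
            (" sw1a 1aa ", "SW1A 1AA"),
        ],
    )
    def test_normalise_postcode(raw, expected):
        assert normalise_postcode(raw) == expected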
Responsibilities
Design, develop, and maintain data pipelines across diverse source systems (Oracle, MSO, Qlik Replicate/Compose) and target platforms (AWS Redshift)
Architect robust and scalable ETL/ELT solutions using AWS Glue, Lambda, SQS, and other AWS services (a sketch of this event-driven pattern follows the list)
Implement, automate, and optimise CI/CD pipelines via GitHub Actions, supporting trunk-based workflows
Manage infrastructure deployments through Terraform Blueprints and config YAML
Develop and execute unit, integration, and smoke tests using Docker and pytest
Monitor, troubleshoot, and resolve data pipeline issues, applying advanced debugging and error handling techniques
Review code for peers, mentor junior engineers, and uphold best practices in engineering and documentation
Ensure secure access and secrets management using RBAC and GitHub Secret Manager
Prepare and review release documentation, including Release Notes and checklists, with clear separation of features, bug fixes, and known issues (JIRA references)
Participate in manual and automated release and deployment processes across Dev, Test, Pre-Prod, and Prod environments
Champion continuous improvement in testing, deployment, and delivery processes
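For illustration, the Glue/Lambda/SQS pattern referenced above might look like the following sketch, in which a Lambda function consumes SQS messages and starts a Glue job run per newly landed extract; the job name and message shape are assumptions for the example.

    # Illustrative Lambda handler: each SQS record carries the S3 key of a
    # new extract, and we start a Glue job run to process it.
    import json
    import boto3

    glue = boto3.client("glue")

    def handler(event, context):
        for record in event["Records"]:
            body = json.loads(record["body"])
            glue.start_job_run(
                JobName="example-orders-load",  # hypothetical Glue job name
                Arguments={"--input_key": body["s3_key"]},
            )
        return {"started": len(event["Records"])}

Decoupling the trigger (SQS) from the heavy lifting (Glue) keeps the Lambda short-lived and lets failed messages be retried or dead-lettered.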
Preferred Qualifications
Bachelor's or Master's degree in Computer Science, Engineering, or related field
Certifications in AWS or related cloud technologies
Demonstrated success in trunk-based development and release management at scale
Strong communication and stakeholder management skills
What We Offer
Work with cutting-edge technologies in an agile, modern environment
Opportunity to lead and influence engineering culture and practices
Support for professional development and cloud certification
Collaborative, growth-oriented team culture
If you are interested in this position and would like to learn more, please send through your CV and we will get in touch with you as soon as possible. Please note that candidates are often shortlisted within 48 hours.