Summary: The AWS DevOps/Data Engineer role focuses on deploying and managing cloud infrastructure, specifically Astronomer Airflow and AccelData environments. The position requires collaboration with cross-functional teams to ensure high availability and performance of data systems while implementing cloud security best practices. The role is a contract inside IR35, requiring in-office presence three days a week in the Manchester area.
Key Skills:
- Experience with AWS cloud services and infrastructure management.
- Proficiency in deploying and managing Airflow and related data engineering tools.
- Strong understanding of cloud security, scalability, and performance best practices.
- Experience with source control systems like GitHub and GitLab.
- Knowledge of SQL technologies for data extraction, transformation, and loading.
- Ability to collaborate with cross-functional teams and stakeholders.
- Experience in monitoring, alerting, and logging for cloud environments (see the alerting sketch after this list).
- Capacity planning and resource optimization skills.
- Strong problem-solving and process improvement capabilities.
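
For illustration, the monitoring and alerting experience above typically means wiring pipeline failures into the cloud provider's metrics service. Below is a minimal sketch, assuming Airflow publishing custom AWS CloudWatch metrics via boto3; the metric namespace, region, and callback wiring are illustrative assumptions, not details from this posting.

    import boto3

    def report_task_failure(context):
        # Hypothetical Airflow on_failure_callback: publish one custom
        # CloudWatch metric per failed task so an alarm can alert on it.
        cloudwatch = boto3.client("cloudwatch", region_name="eu-west-2")  # assumed region
        cloudwatch.put_metric_data(
            Namespace="DataPlatform/Airflow",  # hypothetical namespace
            MetricData=[{
                "MetricName": "TaskFailures",
                "Dimensions": [
                    {"Name": "dag_id", "Value": context["dag"].dag_id},
                    {"Name": "task_id", "Value": context["task_instance"].task_id},
                ],
                "Value": 1.0,
                "Unit": "Count",
            }],
        )

A CloudWatch alarm on the TaskFailures metric (routed to SNS, for example) would then provide the alerting side; task duration and SLA metrics follow the same pattern.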
Salary (Rate): Up to £400 per day
City: Manchester Area
Country: United Kingdom
Working Arrangements: Hybrid
IR35 Status: Inside IR35
Seniority Level: Undetermined
Industry: IT
Job Title: AWS DevOps/Data Engineer - Airflow
Location: Manchester Area - 3 days per week in the office
Salary/Rate: Up to £400 per day INSIDE IR35
Start Date: 03/11/2025
Job Type: Contract
Company Introduction
We have an exciting opportunity with one of our sector-leading financial services clients! They are looking for a skilled AWS DevOps/Data Engineer to join their team on an initial contract running until the end of the year.
Job Responsibilities/Objectives
- Deploy comprehensive cloud infrastructure for various products, including Astronomer Airflow and AccelData environments.
- Facilitate cross-functional integration between vendor products and other systems, such as data lakes, storage, and compute services.
- Establish best practices for cloud security, scalability, and performance.
- Manage and configure vendor product deployments, including environment setup and maintenance.
- Ensure high availability, scalability, and fault tolerance of Airflow clusters.
- Implement monitoring, alerting, and logging for Airflow and related components.
- Perform upgrades and patches for platform-related components.
- Oversee capacity planning, resource allocation, and optimization of Airflow workers.
- Maintain and configure integrations with source control systems (e.g., GitHub, GitLab) for version control.
- Collaborate with cloud providers (e.g., AWS) for pipeline integration and scaling requirements.
- Identify, design, and implement internal process improvements, including re-designing infrastructure for greater scalability, optimising data delivery, and automating manual processes.
- Develop infrastructure for optimal extraction, transformation, and loading of data from various sources using AWS and SQL technologies (a minimal Airflow sketch of such a pipeline follows this list).
- Work with stakeholders, including design, product, and executive teams, to address platform-related technical issues.
- Build analytical tools to leverage the data pipeline, providing actionable insights into key business performance metrics, such as operational efficiency and customer acquisition.
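
As noted in the ETL bullet above, here is a minimal sketch of the kind of Airflow pipeline this role would build and operate, written against the Airflow 2.x TaskFlow API. The DAG name, schedule, and inline data are hypothetical stand-ins, not details from the posting; a real pipeline would read from and write to the client's actual AWS sources.

    from datetime import datetime

    from airflow.decorators import dag, task

    @dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
    def example_etl():  # hypothetical DAG name
        @task
        def extract():
            # Stand-in for reading from a real source (S3, a database, an API).
            return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

        @task
        def transform(rows):
            # Trivial transformation, purely for illustration.
            return [{**r, "value": r["value"] * 2} for r in rows]

        @task
        def load(rows):
            # Stand-in for a warehouse write, e.g. via an Airflow provider hook.
            print(f"loaded {len(rows)} rows")

        load(transform(extract()))

    example_etl()

TaskFlow passes data between tasks via XCom; on Airflow versions before 2.4 the schedule argument is schedule_interval.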