Summary: The ML Data Engineer role involves building and managing AWS-based machine learning pipelines and MLOps processes, with a focus on scalable data workflows and the full AI lifecycle in production. The position requires expertise in cloud-native deployment and close collaboration with data scientists and engineers. The role is hybrid, based in Knutsford, and offers a competitive daily rate. The ideal candidate will have a strong background in data engineering and machine learning technologies.
Salary (Rate): £410 per day
City: Knutsford
Country: United Kingdom
Working Arrangements: Hybrid
IR35 Status: Inside IR35
Seniority Level: Mid-Level
Industry: IT
Role Title: ML Data Engineer
Start Date: ASAP
End Date: EOY
Location: Knutsford (Hybrid)
Rate: £410 per day via umbrella company
Role Description:
Overview
We are seeking a highly capable Data & ML Engineer with strong experience in AWS-based machine learning pipelines, MLOps, and cloud-native deployment. This role focuses on building scalable data workflows, deploying ML models, and managing the full AI lifecycle in production environments.
Key Skills & Technologies
Primary Skills:
- AWS Data Engineering: ECS, SageMaker, cloud-native data pipelines
- ML Engineering & MLOps: MLflow, Airflow, Docker, Kubernetes
- CI/CD & DevOps: GitLab, Jenkins, automated deployment workflows
- AI Lifecycle Management: Model training, deployment, monitoring
- Front-End Development: HTML, Streamlit, Flask (for lightweight dashboards and interfaces)
- Cloud Model Deployment: Experience deploying and monitoring models in AWS
- Programming & Big Data: Python, PySpark, familiarity with big data ecosystems
Secondary Skills:
- RESTful APIs: Integration of backend services and model endpoints
Responsibilities
- Build and maintain robust data pipelines and ML workflows on AWS
- Develop and deploy machine learning models using SageMaker and MLOps tools
- Implement CI/CD pipelines for automated testing and deployment
- Create lightweight front-end interfaces for model interaction and visualization
- Monitor model performance and ensure reliability in production environments
- Collaborate with data scientists and engineers to streamline the AI lifecycle
