Data Engineer - (AWS/MLOps/Python/PySpark/SageMaker/ECS/GitLab/CI/CD/Banking/Fintech)

Posted 1 week ago by GIOS Technology

Negotiable
Undetermined
Hybrid
Knutsford, England, United Kingdom

Summary: The Data Engineer role focuses on developing and optimizing data pipelines to support AI/ML workloads, leveraging AWS and MLOps practices. The candidate will work closely with data scientists and engineers to enhance the machine learning lifecycle and ensure effective deployment and monitoring of models. Proficiency in big data ecosystems and CI/CD pipelines is essential for this position. The role is based in Knutsford and offers a hybrid working arrangement.

Key Responsibilities:

  • Design, develop, and optimize data pipelines to support AI/ML workloads.
  • Build and manage solutions using AWS services including ECS and SageMaker.
  • Implement MLOps practices with tools such as MLflow, Airflow, Docker, and Kubernetes.
  • Collaborate with data scientists and engineers to streamline the machine learning lifecycle.
  • Integrate backend services via RESTful APIs and support front-end frameworks (HTML, Streamlit, Flask).
  • Ensure CI/CD best practices using GitLab, Jenkins, and related tools.

Key Skills:

  • AWS Data Engineering
  • ML Engineering
  • MLOps
  • ECS
  • SageMaker
  • GitLab
  • Jenkins
  • CI/CD
  • AI Lifecycle
  • Front-end (HTML, Streamlit, Flask)
  • Cloud model deployment/monitoring
  • Python
  • PySpark
  • Big Data ecosystems
  • MLflow
  • Airflow
  • Docker
  • Kubernetes
  • RESTful APIs
  • Backend integration

Salary (Rate): undetermined

City: Knutsford

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Title: Data Engineer - (AWS/MLOps/Python/PySpark/SageMaker/ECS/GitLab/CI/CD/Banking/Fintech)

Location: Knutsford (Hybrid)

Job Description: We are seeking an experienced Data Engineer with strong expertise in AWS, MLOps, and data pipeline development. The ideal candidate will have hands-on experience in deploying, monitoring, and maintaining machine learning models in cloud environments, as well as proficiency in big data ecosystems and CI/CD pipelines.

Key Responsibilities:

  • Design, develop, and optimize data pipelines to support AI/ML workloads.
  • Build and manage solutions using AWS services including ECS and SageMaker.
  • Implement MLOps practices with tools such as MLflow, Airflow, Docker, and Kubernetes.
  • Collaborate with data scientists and engineers to streamline the machine learning lifecycle.
  • Integrate backend services via RESTful APIs and support front-end frameworks (HTML, Streamlit, Flask).
  • Ensure CI/CD best practices using GitLab, Jenkins, and related tools.

Key Skills:

  • Primary Skills: AWS Data Engineering, ML Engineering, MLOps, ECS, SageMaker, GitLab, Jenkins, CI/CD, AI Lifecycle, Front-end (HTML, Streamlit, Flask), Cloud model deployment/monitoring
  • Technical Skills: Python, PySpark, Big Data ecosystems
  • MLOps Tools: MLflow, Airflow, Docker, Kubernetes
  • Secondary Skills: RESTful APIs, Backend integration