Senior Data Engineer

Posted Today by ARC IT Recruitment

£500 per day | Outside IR35 | Remote | United Kingdom

Summary: A Senior Data Engineer is sought for a fast-growing HealthTech/Tech for Good start-up, offering a 3-month contract with potential for extension. The role involves designing, building, and maintaining data pipelines while modernizing the data lakehouse and managing secure AWS cloud infrastructure. The position emphasizes data quality, security, and technical excellence. This is a remote role based in the UK with a competitive daily rate.

Salary (Rate): £500/day

City: undetermined

Country: United Kingdom

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Senior Data Engineer

UK Remote, £500/day, Outside IR35

A Senior Data Engineer is required by a fast-growing HealthTech/Tech for Good start-up. This is an initial 3-month contract with scope for extension.

In this role, you will design, build, and maintain robust data pipelines using streaming and batch processing technologies, and help lead the modernization of the company's data lakehouse.

You will implement and manage secure cloud infrastructure using AWS, championing a culture of data quality, security, and technical excellence.

Key Responsibilities:

  • Architect and maintain data pipelines (streaming & batch)
  • Build and manage scalable data storage solutions (data lakehouse)
  • Design and maintain secure AWS cloud infrastructure
  • Implement and manage CI/CD and DevOps pipelines
  • Champion data quality, security, and best practices
  • Collaborate with cross-functional teams
  • Implement and manage MLOps capabilities

Essential Skills:

  • Advanced Python programming skills
  • Expertise in data engineering tools and frameworks (e.g., Apache Flink)
  • Hands-on AWS experience (Serverless, CloudFormation, CDK)
  • Strong understanding of containerization, CI/CD, and DevOps
  • Knowledge of modern data storage architectures (e.g., data lakehouse)

Sound like you? Please send your CV over ASAP.