Data Engineer (Spark/Kubernetes) (Financial Services)

Posted 4 days ago by Jobserve

Negotiable
London

Summary: The role of Data Engineer focuses on replacing a legacy ETL tool with a modern Apache Spark-based data platform within a financial services organisation. This hands-on position emphasises building and optimising Spark jobs, with responsibilities including development work, sprint delivery, and stakeholder engagement. The ideal candidate will have strong experience in Spark development and containerised environments, particularly using Kubernetes. This position is suited to a mid-level engineer with a focus on performance and reliability in data processing.

Key Responsibilities:

  • Build and support Spark jobs with a focus on performance optimization.
  • Run Spark workloads in containerized environments using Kubernetes.
  • Engage in development work, sprint delivery, and demos.
  • Document processes and engage with stakeholders.
  • Collaborate within a small Agile delivery team.

Key Skills:

  • Strong hands-on experience with Apache Spark and PySpark development.
  • Experience working with containerized environments using Kubernetes.
  • Proficiency in programming with Python or Scala.
  • Exposure to Big Data technologies and distributed data processing.
  • Some experience with Java/Java Spring Boot for development.
  • Experience in an Ops way of working, including deployment of solutions.
  • Experience with OpenShift is highly desirable.
  • Familiarity with Agile methodologies (Scrum, sprints, demos).
  • Experience in financial services or professional services.

Salary (Rate): £700.00 Daily

City: London

Country: United Kingdom

Working Arrangements: Undetermined

IR35 Status: Undetermined

Seniority Level: Mid-Level

Industry: IT

Detailed Description From Employer:

Your new company
Working for a renowned financial services organisation.

Your new role
We are seeking a Data Engineer to support the replacement of a legacy ETL tool with a modern Apache Spark-based data platform. This is a hands-on engineering role focused on building and supporting Spark jobs, with a strong emphasis on performance optimisation, reliability, and scalability. You will run Spark workloads in containerised environments using Kubernetes, and programming skills in Python, Scala, or Java are also required.

The role sits within a small Agile delivery team of four engineers (two onshore and two in Shenzhen), working closely with a Senior Data Engineer. You will be responsible for development work, sprint delivery, demos, documentation, and stakeholder engagement. This position suits a mid-level engineer with strong Spark development experience rather than design, infrastructure, or management responsibilities.

What you'll need to succeed

  • Strong hands-on experience with Apache Spark, including writing and tuning Spark jobs and PySpark development.
  • Strong experience working with containerised environments using Kubernetes.
  • Experience programming in Python or Scala.
  • Exposure to Big Data technologies and distributed data processing.
  • Some experience with Java/Java Spring Boot for development.
  • Experience in an Ops way of working, not pure development only - you know how to deploy solutions.
  • Experience with OpenShift would be highly desirable.
  • Experience with Agile ways of working (Scrum, sprints, demos).
  • Financial services or professional services experience required.

What you'll get in return
Flexible working options available.

What you need to do now
If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.

Hays Specialist Recruitment Limited acts as an employment agency for permanent recruitment and an employment business for the supply of temporary workers. By applying for this job you accept the T&Cs, Privacy Policy and Disclaimers, which can be found on our website.