Hadoop Developer

Posted 1 day ago by N Consulting Global

Negotiable
Undetermined
Hybrid
London Area, United Kingdom

Summary: The role of Apache Flink Engineer involves designing and implementing stateful stream-processing jobs using Apache Flink, primarily in Java, to handle user interaction events and produce near real-time outputs. The engineer will integrate Flink with various services such as Kafka and AWS, ensuring production readiness and supporting CI/CD practices. Additionally, the role includes mentoring team members and managing non-functional requirements for streaming applications. This position is based in London and operates in a hybrid work mode.

Key Responsibilities:

  • Design and implement stateful stream-processing jobs in Apache Flink (primarily Java).
  • Integrate Flink with Kafka/AWS MSK, S3, and the chosen online store (MongoDB/MongoDB Atlas).
  • Own streaming non-functional requirements: correctness, fault tolerance, checkpointing/recovery, backpressure handling, and performance tuning.
  • Define and implement production readiness: observability (metrics/logs/tracing), alerting, dashboards, and runbooks.
  • Support CI/CD and infrastructure-as-code practices to deploy and operate Flink workloads safely in AWS.
  • Enable the team via knowledge transfer (pairing, design/code reviews, documentation, and handover).

Key Skills:

  • Strong experience building and operating production Apache Flink applications.
  • Hands-on AWS experience operating streaming/data workloads.
  • Experience integrating Flink with Kafka (including AWS MSK) and cloud services/storage.
  • Highly proficient in Java and Python (production experience required).
  • Strong software engineering fundamentals: clean, maintainable code; pragmatic testing; effective code review.
  • Comfortable mentoring and pairing with engineers to upskill the team.

Salary (Rate): undetermined

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Role: Apache Flink Engineer

Location: London

Work Mode: Hybrid

Contract Role Responsibilities

  • Design and implement stateful stream-processing jobs in Apache Flink (primarily Java) to process user interaction events and produce near real-time outputs for downstream services.
  • Integrate Flink with Kafka/AWS MSK, S3 and the chosen online store (MongoDB/MongoDB Atlas).
  • Own streaming non-functional requirements: correctness, fault tolerance, checkpointing/recovery, backpressure handling, and performance tuning.
  • Define and implement production readiness: observability (metrics/logs/tracing), alerting, dashboards and runbooks.
  • Support CI/CD and infrastructure-as-code practices to deploy and operate Flink workloads safely in AWS.
  • Enable the team via knowledge transfer (pairing, design/code reviews, documentation and handover).
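As context for the first responsibility, the following is a minimal, Flink-independent Java sketch of what "stateful stream processing" means in practice: events are routed by key and update per-key state, which is conceptually what a Flink `keyBy` plus keyed-state pipeline maintains. The event shape (`userId`, `action`) is an illustrative assumption, not taken from the posting.

```java
import java.util.HashMap;
import java.util.Map;

// Toy model of keyed stateful stream processing: each incoming event is routed
// by its key (as Flink's keyBy would do) and updates per-key state (as Flink
// keyed state would). Field names here are hypothetical examples.
public class KeyedStateSketch {
    private final Map<String, Long> countsPerUser = new HashMap<>();

    /** Process one user-interaction event; returns the updated count for that user. */
    public long onEvent(String userId, String action) {
        return countsPerUser.merge(userId, 1L, Long::sum);
    }

    /** Current state for a key (0 if the key has never been seen). */
    public long stateFor(String userId) {
        return countsPerUser.getOrDefault(userId, 0L);
    }

    public static void main(String[] args) {
        KeyedStateSketch job = new KeyedStateSketch();
        job.onEvent("alice", "click");
        job.onEvent("alice", "scroll");
        job.onEvent("bob", "click");
        System.out.println(job.stateFor("alice")); // 2
        System.out.println(job.stateFor("bob"));   // 1
    }
}
```

In a real Flink job this state would live in Flink's managed keyed state so that checkpointing and recovery (the non-functional requirements listed above) apply to it automatically.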

Skills and experience

Essential

  • Strong experience building and operating production Apache Flink applications (event-time concepts, windows/watermarks, state, checkpointing and fault tolerance).
  • Hands-on AWS experience operating streaming/data workloads (IAM, networking/VPC basics, deployment, observability and incident troubleshooting).
  • Experience integrating Flink with Kafka (including AWS MSK) and cloud services/storage (e.g., S3).
  • Highly proficient in Java and Python (production experience required).
  • Strong software engineering fundamentals: clean, maintainable code; pragmatic testing; effective code review.
  • Comfortable mentoring and pairing with engineers to upskill the team.
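To make the event-time terminology in the first essential item concrete, here is a minimal pure-Java sketch (no Flink dependency; the 5-second out-of-orderness bound and timestamps in the usage example are illustrative assumptions) of how a bounded-out-of-orderness watermark decides when a tumbling event-time window may close:

```java
// Illustrative event-time logic: the watermark trails the maximum timestamp seen
// by a fixed out-of-orderness bound, and a tumbling window [start, start + size)
// may fire once the watermark passes its end. This mirrors the semantics of
// Flink's bounded-out-of-orderness watermark strategy, without using Flink.
public class WatermarkSketch {
    private final long outOfOrdernessMs;
    private long maxTimestampSeen = Long.MIN_VALUE;

    public WatermarkSketch(long outOfOrdernessMs) {
        this.outOfOrdernessMs = outOfOrdernessMs;
    }

    /** Observe one event's timestamp; events may arrive out of order. */
    public void onEvent(long eventTimeMs) {
        maxTimestampSeen = Math.max(maxTimestampSeen, eventTimeMs);
    }

    /** Current watermark: no event earlier than this is expected any more. */
    public long watermark() {
        return maxTimestampSeen - outOfOrdernessMs;
    }

    /** A tumbling window [windowStart, windowStart + size) can fire once the watermark reaches its end. */
    public boolean windowCanFire(long windowStartMs, long windowSizeMs) {
        return watermark() >= windowStartMs + windowSizeMs;
    }
}
```

For example, with a 5-second bound, an event at t=12s advances the watermark only to t=7s, so a window covering [0s, 10s) stays open for late events until some event at t≥15s arrives.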

Desirable

  • Experience with Amazon Managed Service for Apache Flink and/or running Flink on EKS/ECS.
  • Experience with MongoDB or MongoDB Atlas for low-latency, online-serving use cases.
  • Infrastructure-as-code (CloudFormation) and modern observability tooling (CloudWatch, Grafana, OpenTelemetry, Splunk).

Indicative technology stack

  • Java, Python
  • Apache Flink; Kafka / AWS MSK
  • AWS: IAM, VPC, S3, CloudWatch; plus a Flink runtime (managed service and/or EKS/ECS depending on solution)
  • MongoDB / MongoDB Atlas (online store)
  • CI/CD and IaC: CircleCI, GitHub, CloudFormation