Data Streaming Engineer

Posted 2 weeks ago

£625 per day
Undetermined
Remote
London

Summary: The Data Streaming Engineer designs and delivers scalable, event-driven solutions using Kafka and the Confluent ecosystem. This contract position calls for real-time data streaming expertise and close collaboration with data engineering, architecture, and DevOps teams. The engineer will build real-time data pipelines and deploy and optimize streaming platforms on AWS, GCP, or Azure. Strong programming skills in Java, Python, or Scala and hands-on Kafka experience are essential.

Salary (Rate): £625 per day

City: London

Country: United Kingdom

Working Arrangements: Remote

IR35 Status: Undetermined

Seniority Level: Undetermined

Industry: IT

Detailed Description From Employer:

Data Streaming Engineer – Real-Time Data Streaming

Contract / £550 - £625 per day

Location: London (Work from Home)
Type: Contract

Are you passionate about real-time data streaming and solving complex technical challenges?

We’re looking for a Data Streaming Engineer to help design and deliver scalable, event-driven solutions using Kafka and the Confluent ecosystem.


What You’ll Do

  • Build and maintain real-time data pipelines for enterprise clients (see the producer sketch after this list)
  • Design scalable event-driven architectures
  • Collaborate with data engineers, architects, and DevOps teams to deliver robust solutions
  • Deploy and optimize platforms on AWS, GCP, or Azure
  • Drive best practices for performance, security, and scalability
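
To give a flavour of this pipeline work, below is a minimal sketch of a Kafka producer in Java using the standard org.apache.kafka clients API. The broker address, topic name, and payload are hypothetical placeholders, not details of this engagement.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class OrderProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical broker address; in practice this would point at a
            // managed cluster on AWS, GCP, or Azure.
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            // acks=all waits for all in-sync replicas, trading latency for durability.
            props.put("acks", "all");

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // Hypothetical topic and event: one order, keyed by order ID.
                producer.send(new ProducerRecord<>("orders", "order-123", "{\"amount\": 42}"));
                producer.flush();
            }
        }
    }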

What You’ll Bring

  • 5+ years of hands-on experience with Apache Kafka (Confluent, AWS MSK, Cloudera, etc.)
  • Strong programming skills in Java, Python, or Scala
  • Experience deploying Kafka on cloud platforms (AWS, GCP, Azure)
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines
  • Excellent problem-solving and communication skills

Bonus Points If You Have

  • Experience with Kafka Connect, Kafka Streams, KSQL, Schema Registry, or REST Proxy (see the Streams sketch after this list)
  • Hands-on work with Confluent Cloud and tools like Flink or ksqlDB
  • Confluent certifications (Developer, Administrator, Flink Developer)
  • Knowledge of stream governance, data mesh architectures, or multi-cloud deployments
  • Experience with monitoring tools (Prometheus, Grafana, Splunk)
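
As one illustration of the Kafka Streams work mentioned above, here is a minimal sketch of a stateless stream processor in Java. The application ID, topic names, and filter predicate are hypothetical, assuming string-serialised JSON events.

    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;

    public class PriorityOrderFilter {
        public static void main(String[] args) {
            Properties props = new Properties();
            // Hypothetical application ID and broker address.
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "priority-order-filter");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            // Read from a hypothetical "orders" topic, keep only priority orders,
            // and route them to a downstream topic.
            KStream<String, String> orders = builder.stream("orders");
            orders.filter((key, value) -> value.contains("\"priority\":true"))
                  .to("priority-orders");

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }

The same transformation could equally be expressed as a ksqlDB query; the Streams API is shown here because Java leads the skills list above.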