Data Streaming Engineer

Posted Today by Norton Blake

£625 per day
Undetermined
Remote
London, UK

Summary: The Data Streaming Engineer designs and delivers scalable, event-driven solutions using Kafka and the Confluent ecosystem. The role requires expertise in real-time data streaming and close collaboration with data engineering, architecture, and DevOps teams to build robust pipelines, as well as deploying and optimizing streaming platforms on cloud services. The position is a fully remote contract.

Salary (Rate): £625 per day

City: London

Country: UK

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Data Streaming Engineer - Real-Time Data Streaming

Location: Work from Home
Type: Contract

Are you passionate about real-time data streaming and solving complex technical challenges?

We're looking for a Data Streaming Engineer to help design and deliver scalable, event-driven solutions using Kafka and the Confluent ecosystem.

What You'll Do

  • Build and maintain real-time data pipelines for enterprise clients (see the producer sketch after this list)
  • Design scalable event-driven architectures
  • Collaborate with data engineers, architects, and DevOps teams to deliver robust solutions
  • Deploy and optimize platforms on AWS, GCP, or Azure
  • Drive best practices for performance, security, and scalability
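
To give a flavour of the day-to-day work, here is a minimal producer sketch in Java, the Kafka client's native language. The broker address localhost:9092 and the topic name "events" are placeholder assumptions for illustration, not details from this role:

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class EventProducer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            props.put("acks", "all"); // wait for all in-sync replicas: durability over latency

            // try-with-resources closes the producer and releases its buffers
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                // One example event; a real pipeline would publish from a live source
                producer.send(new ProducerRecord<>("events", "order-42", "{\"status\":\"created\"}"));
                producer.flush(); // block until the broker has acknowledged the send
            }
        }
    }

The acks=all setting is the usual enterprise posture: it trades a little produce latency for durability across the in-sync replica set.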

What You'll Bring

  • 5+ years of hands-on experience with Apache Kafka (Confluent, AWS MSK, Cloudera, etc.)
  • Strong programming skills in Java, Python, or Scala
  • Experience deploying Kafka on cloud platforms (AWS, GCP, Azure; see the consumer sketch after this list)
  • Familiarity with Docker, Kubernetes, and CI/CD pipelines
  • Excellent problem-solving and communication skills
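
To make the cloud-deployment point concrete, below is a consumer sketch configured for a managed Kafka service over SASL_SSL. The endpoint, group id, topic, and credentials are placeholders, and the exact settings differ between Confluent Cloud, Amazon MSK, and self-managed clusters:

    import java.time.Duration;
    import java.util.List;
    import java.util.Properties;

    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class EventConsumer {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "broker.example.cloud:9092"); // placeholder endpoint
            props.put("group.id", "event-consumer");                     // consumer group for offset tracking
            props.put("key.deserializer", StringDeserializer.class.getName());
            props.put("value.deserializer", StringDeserializer.class.getName());
            props.put("security.protocol", "SASL_SSL");                  // typical for managed Kafka
            props.put("sasl.mechanism", "PLAIN");
            props.put("sasl.jaas.config",
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"API_KEY\" password=\"API_SECRET\";");      // placeholder credentials

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("events")); // placeholder topic
                while (true) {
                    // Poll in a loop; each batch of records advances the group's offsets
                    ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                    for (ConsumerRecord<String, String> record : records) {
                        System.out.printf("%s => %s%n", record.key(), record.value());
                    }
                }
            }
        }
    }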

Bonus Points If You Have

  • Experience with Kafka Connect, Kafka Streams, KSQL, Schema Registry, or REST Proxy (see the topology sketch after this list)
  • Hands-on work with Confluent Cloud and tools like Flink or ksqlDB
  • Confluent certifications (Developer, Administrator, Flink Developer)
  • Knowledge of stream governance, data mesh architectures, or multi-cloud deployments
  • Experience with monitoring tools (Prometheus, Grafana, Splunk)
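
And for the Kafka Streams item, a minimal topology sketch: it reads a hypothetical page-views topic, keeps a running count per key, and emits the totals to a hypothetical views-per-user topic. The application id and topic names are illustrative assumptions:

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;

    public class ViewCounter {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "view-counter");      // assumed app id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> views = builder.stream("page-views");
            views.groupByKey()
                 .count()                     // running count per key, kept in local state
                 .toStream()
                 .mapValues(Object::toString) // Long -> String to match the String serde
                 .to("views-per-user", Produced.with(Serdes.String(), Serdes.String()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close)); // clean shutdown
        }
    }

Because the count lives in a changelog-backed local state store, the same pattern scales out simply by running more instances under the same application.id.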