Senior Data Platform Engineer (Remote)

Posted 1 week ago

Rate: Negotiable
Location: UK (Remote)

Summary: A highly skilled Data Platform Engineer is required for a full-time, six-month, fully remote contract to support a distributed streaming platform for real-time data. The ideal candidate will have expertise in Kafka, Kubernetes, and cloud platforms such as Google Cloud Platform (GCP) or Microsoft Azure. The role involves designing, deploying, and optimizing scalable data pipelines in a cloud-native environment.

Key Responsibilities:

  • Design, deploy, and manage Kafka clusters in GCP or Azure environments
  • Operate and maintain Kafka on Kubernetes using Helm, Operators, or custom configurations
  • Collaborate with cross-functional teams to build data pipelines, Kafka integrations, and microservices
  • Monitor, troubleshoot, and optimize Kafka performance and reliability
  • Automate infrastructure and deployment processes using Infrastructure-as-Code tools
  • Ensure security, compliance, and high availability of Kafka systems

Key Skills:

  • Strong hands-on experience with Apache Kafka in production environments
  • Proficiency with Kubernetes (GKE, AKS, or self-managed clusters)
  • Solid understanding of cloud infrastructure in GCP and/or Azure
  • Experience with CI/CD pipelines and DevOps practices
  • Strong programming skills in Python, Java, or similar languages
  • Familiarity with monitoring tools (e.g., Prometheus, Grafana) and logging systems
  • Excellent communication and documentation skills

Salary (Rate): undetermined

City: undetermined

Country: UK

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

A highly skilled Data Platform Engineer is required for a full-time, six-month, fully remote contract to support our distributed streaming platform for real-time data. This role is ideal for someone with deep expertise in Kafka, Kubernetes, and microservices, as well as cloud platforms such as Google Cloud Platform (GCP) or Microsoft Azure. You’ll play a key role in designing, deploying, and optimizing scalable, real-time data pipelines in a cloud-native environment.

Preferred Qualifications:

* Experience with Confluent Platform or Confluent Cloud
* Knowledge of Terraform, Helm, or other IaC tools
* Background in stream processing (Kafka Streams, RabbitMQ Streams, Flink, RocksDB, etc.)
* Certifications in GCP, Azure, or Kubernetes (CKA/CKAD)
* Experience working in agile, cross-functional teams

To arrange a Teams-based interview, please email your CV, in the first instance, to jt@waconsultants.com

WA Consultants is an Employment Business and an Employment Agency as described within The Conduct of Employment Agencies and Employment Businesses Regulations 2003.