Kafka Architect - Cloudera

Posted 4 days ago by Vallum Associates

£75 Per hour
Undetermined
Hybrid
Leeds, England, United Kingdom

Summary: The Kafka Architect role focuses on designing and implementing scalable data streaming solutions using Apache Kafka on Cloudera. The position requires expertise in distributed systems and real-time data processing, with responsibilities including the setup and optimization of Kafka clusters. The ideal candidate will also provide technical leadership and collaborate with various teams to meet real-time data needs. This role is based in Leeds and involves a hybrid working arrangement.

Key Responsibilities:

  • Design and implement scalable Kafka-based architectures using open-source Kafka on Cloudera on-premises infrastructure.
  • Lead the setup, configuration, and optimization of Kafka clusters.
  • Define standards and best practices for Kafka producers, consumers, and streaming applications.
  • Integrate Kafka with various data sources, storage systems, and enterprise applications.
  • Monitor Kafka performance and ensure high availability, fault tolerance, and data security.
  • Collaborate with DevOps, Data Engineering, and Application teams to support real-time data needs.
  • Automate deployment and configuration using tools like Ansible, Terraform, or Cloudera Manager.
  • Provide technical leadership and mentorship to junior team members.

Key Skills:

  • Strong hands-on experience with Apache Kafka (including Kafka Connect, Kafka Streams).
  • Experience with the Cloudera distribution of Kafka in on-premises environments.
  • Proficiency in designing high-volume, low-latency data pipelines.
  • Solid knowledge of Kafka internals – topics, partitions, consumer groups, offset management, etc.
  • Experience with data serialization formats like Avro, JSON, Protobuf.
  • Proficient in Java, Scala, or Python for Kafka-based development.
  • Familiarity with monitoring tools (Prometheus, Grafana, Confluent Control Center, etc.).
  • Understanding of networking, security (SSL/SASL), and data governance.
  • Experience with CI/CD pipelines and containerization (Docker, Kubernetes) is a plus.

Salary (Rate): £75.00/hr

City: Leeds

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Title: Kafka Architect - Cloudera

Location: Leeds (2 days/week onsite)

Duration: 6+ months

Job Summary: We are looking for an experienced Kafka Architect to design and implement scalable, high-throughput data streaming solutions using Apache Kafka on Cloudera (on-premises). The ideal candidate will have a strong background in distributed systems, data pipelines, and real-time data processing.

Kind Regards,

Priyanka Sharma

Senior Delivery Consultant

Office: 02033759240

Email: psharma@vallumassociates.com