Data Engineer - Telecom

Posted 1 week ago by Response Informatics

Negotiable
London Area, United Kingdom

Summary: The Data Engineer role in the telecom sector focuses on designing and implementing real-time data pipelines, ensuring data quality and observability, and optimizing streaming applications. The position involves collaborating with backend teams to integrate event-driven microservices, using a range of data engineering tools and frameworks. Candidates should have strong programming skills and experience with cloud platforms. The role is central to keeping data flows and infrastructure efficient and reliable in a fast-moving environment.

Key Responsibilities:

  • Design and implement real-time data pipelines using tools like Apache Kafka, Apache Flink, or Spark Streaming.
  • Develop and maintain event schemas using Avro, Protobuf, or JSON Schema.
  • Collaborate with backend teams to integrate event-driven microservices.
  • Ensure data quality, lineage, and observability across streaming systems.
  • Optimize performance and scalability of streaming applications.
  • Implement CI/CD pipelines for data infrastructure.
  • Monitor and troubleshoot production data flows and streaming jobs.

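To give candidates a flavour of the schema work mentioned above, here is a minimal sketch of an Avro-style event schema for a hypothetical telecom call-detail-record (CDR) event, with a basic structural check in plain Python. The event name, fields, and validation logic are illustrative assumptions, not taken from the posting; production pipelines would typically use a library such as fastavro together with a schema registry.

```python
import json

# Hypothetical Avro schema for a telecom call-detail-record (CDR) event.
# The record name and fields are illustrative, not from the posting.
CDR_SCHEMA = {
    "type": "record",
    "name": "CallDetailRecord",
    "namespace": "telecom.events",
    "fields": [
        {"name": "call_id", "type": "string"},
        {"name": "caller", "type": "string"},
        {"name": "callee", "type": "string"},
        {"name": "duration_seconds", "type": "int"},
        # A nullable field with a default supports backward-compatible
        # schema evolution: old events without it still deserialize.
        {"name": "cell_tower_id", "type": ["null", "string"], "default": None},
    ],
}

def validate(event: dict) -> bool:
    """Minimal structural check: every non-nullable field must be present."""
    for field in CDR_SCHEMA["fields"]:
        nullable = isinstance(field["type"], list) and "null" in field["type"]
        if not nullable and field["name"] not in event:
            return False
    return True

event = {"call_id": "c-1", "caller": "+447700900001",
         "callee": "+447700900002", "duration_seconds": 42}
print(validate(event))           # True: all non-nullable fields present
print(json.dumps(CDR_SCHEMA)[:40])  # Avro schemas are plain JSON on the wire
```

Because Avro schemas are themselves JSON documents, they can be versioned, diffed, and reviewed like any other source artifact, which is what "maintain event schemas" amounts to in practice.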
Key Skills:

  • 3+ years of experience in data engineering or backend development.
  • Strong programming skills in Python, Java, or Scala.
  • Hands-on experience with Kafka, Kinesis, or similar messaging systems.
  • Familiarity with stream processing frameworks like Flink, Kafka Streams, or Spark Structured Streaming.
  • Solid understanding of event-driven design patterns (e.g., event sourcing, CQRS).
  • Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools.
  • Knowledge of data modeling, schema evolution, and serialization formats.

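The event-driven patterns listed above (event sourcing, CQRS) can be sketched in a few lines of Python: state is never stored directly, but derived by folding over an append-only log of immutable events, with a separate read-model projection. The event types and field names below are invented for illustration only.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hedged sketch of event sourcing: current state is derived by replaying
# an append-only event log. Event names and fields are illustrative.

@dataclass(frozen=True)
class Event:
    kind: str          # e.g. "data_used"
    subscriber: str
    megabytes: int

@dataclass
class EventStore:
    log: List[Event] = field(default_factory=list)

    def append(self, event: Event) -> None:
        # Events are immutable facts: they are only ever appended.
        self.log.append(event)

def usage_per_subscriber(log: List[Event]) -> Dict[str, int]:
    """CQRS-style read model: a projection folded from the event log."""
    totals: Dict[str, int] = {}
    for e in log:
        if e.kind == "data_used":
            totals[e.subscriber] = totals.get(e.subscriber, 0) + e.megabytes
    return totals

store = EventStore()
store.append(Event("data_used", "sub-1", 100))
store.append(Event("data_used", "sub-1", 50))
store.append(Event("data_used", "sub-2", 10))
print(usage_per_subscriber(store.log))  # {'sub-1': 150, 'sub-2': 10}
```

In a real deployment the in-memory list would be a durable log such as a Kafka topic, and the projection would be a continuously updated materialized view; the separation of the write path (append) from the read path (projection) is the essence of CQRS.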
Salary (Rate): Negotiable

City: London Area

Country: United Kingdom

Working Arrangements: undetermined

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT
