Kafka Engineer

Posted 1 week ago

Negotiable
Outside IR35
Remote
USA

Summary: The role of Kafka Streaming Engineer involves designing, implementing, and maintaining real-time streaming solutions using Confluent Kafka, Terraform, and DevOps pipelines. The engineer will work closely with DevOps and infrastructure teams to stream data from SQL Server to Snowflake, ensuring efficient data delivery and infrastructure management. Strong experience with Azure DevOps and infrastructure as code is essential for this position. The role offers flexibility with remote working arrangements.

Key Responsibilities:

  • Build and support real-time data pipelines using Confluent Kafka and Kafka Connect.
  • Configure SQL Server CDC and stream change data to Kafka.
  • Design and deploy Kafka source and sink connectors, especially for SQL Server and Snowflake.
  • Use Terraform to provision and manage Kafka infrastructure, including:
      • Topics
      • Connectors
      • ACLs
      • Schema Registry configurations
  • Automate deployments and topic creation using Azure DevOps (TFS) pipelines.
  • Collaborate with DevOps and data engineering teams to ensure end-to-end data delivery.
  • Monitor pipeline performance and troubleshoot issues as needed.
  • Document architecture, configuration, and deployment workflows.
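As an illustration of the CDC-to-Kafka responsibility above, registering a Debezium SQL Server source connector with Kafka Connect might look like the sketch below. This is a minimal example, not the employer's actual configuration: hostnames, credentials, database, and table names are placeholders, and it assumes CDC has already been enabled on the source database and table (via sys.sp_cdc_enable_db and sys.sp_cdc_enable_table).

```json
{
  "name": "sqlserver-cdc-source",
  "config": {
    "connector.class": "io.debezium.connector.sqlserver.SqlServerConnector",
    "database.hostname": "sqlserver.example.internal",
    "database.port": "1433",
    "database.user": "cdc_user",
    "database.password": "${file:/secrets/db.properties:password}",
    "database.names": "SalesDB",
    "topic.prefix": "sqlserver",
    "table.include.list": "dbo.Orders",
    "schema.history.internal.kafka.bootstrap.servers": "broker:9092",
    "schema.history.internal.kafka.topic": "schema-history.sqlserver"
  }
}
```

Posting this JSON to the Kafka Connect REST API (POST /connectors) would create a connector that streams change events for dbo.Orders into topics prefixed with "sqlserver".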

Key Skills:

  • Strong hands-on experience with the Apache Kafka / Confluent Kafka ecosystem.
  • Expertise in Kafka Connect, Kafka Streams, or ksqlDB.
  • Practical experience in streaming data from SQL Server using CDC.
  • Experience integrating Kafka with Snowflake using connectors or Snowpipe Streaming.
  • Proficiency in Terraform to manage Kafka infrastructure.
  • Working knowledge of CI/CD pipelines using Azure DevOps (TFS).
  • Familiarity with Confluent Schema Registry and serialization formats (Avro, JSON).
  • Good scripting/programming experience with Python, Java, or Scala.
  • Experience in cross-team collaboration (Data/DevOps/Cloud).
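To make the CI/CD expectation concrete, an Azure DevOps pipeline that applies Terraform-managed Kafka infrastructure could be sketched as below. This is an illustrative fragment only: the repository layout (infra/kafka), variable names, and the assumption that Terraform is available on the build agent are all hypothetical.

```yaml
trigger:
  branches:
    include: [ main ]

pool:
  vmImage: ubuntu-latest

steps:
  - script: |
      terraform init -input=false
      terraform plan -out=tfplan
      terraform apply -input=false tfplan
    displayName: Provision Kafka topics and connectors
    workingDirectory: infra/kafka
    env:
      # Credentials injected from pipeline secrets (names are placeholders)
      CONFLUENT_CLOUD_API_KEY: $(confluentApiKey)
      CONFLUENT_CLOUD_API_SECRET: $(confluentApiSecret)
```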

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

We are looking for a hands-on Kafka Streaming Engineer with expertise in Confluent Kafka, Terraform, and DevOps pipelines to design, implement, and maintain real-time streaming solutions. The ideal candidate will help stream data from SQL Server (using CDC) to Snowflake, working closely with DevOps and infrastructure teams. The role offers flexibility with remote working arrangements and requires strong experience with TFS (Azure DevOps) and infrastructure as code (IaC) using Terraform.

Key Responsibilities

  • Build and support real-time data pipelines using Confluent Kafka and Kafka Connect.
  • Configure SQL Server CDC and stream change data to Kafka.
  • Design and deploy Kafka source and sink connectors, especially for SQL Server and Snowflake.
  • Use Terraform to provision and manage Kafka infrastructure, including:
      • Topics
      • Connectors
      • ACLs
      • Schema Registry configurations
  • Automate deployments and topic creation using Azure DevOps (TFS) pipelines.
  • Collaborate with DevOps and data engineering teams to ensure end-to-end data delivery.
  • Monitor pipeline performance and troubleshoot issues as needed.
  • Document architecture, configuration, and deployment workflows.
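The Terraform responsibility above can be sketched with the Confluent Terraform provider, which exposes resources such as confluent_kafka_topic. The cluster ID, endpoint, credentials, and topic settings below are placeholder variables, not values from this posting.

```hcl
terraform {
  required_providers {
    confluent = {
      source  = "confluentinc/confluent"
      version = "~> 1.0"
    }
  }
}

# Topic for SQL Server CDC events; name and settings are illustrative.
resource "confluent_kafka_topic" "orders_cdc" {
  kafka_cluster {
    id = var.kafka_cluster_id
  }
  topic_name       = "sqlserver.dbo.Orders"
  partitions_count = 6
  rest_endpoint    = var.kafka_rest_endpoint

  credentials {
    key    = var.kafka_api_key
    secret = var.kafka_api_secret
  }

  config = {
    "cleanup.policy" = "delete"
    "retention.ms"   = "604800000" # 7 days
  }
}
```

Keeping topics, connectors, and ACLs in Terraform lets the Azure DevOps pipeline review and apply infrastructure changes the same way as application code.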

Required Skills & Qualifications

  • Strong hands-on experience with the Apache Kafka / Confluent Kafka ecosystem.
  • Expertise in Kafka Connect, Kafka Streams, or ksqlDB.
  • Practical experience in streaming data from SQL Server using CDC.
  • Experience integrating Kafka with Snowflake using connectors or Snowpipe Streaming.
  • Proficiency in Terraform to manage Kafka infrastructure.
  • Working knowledge of CI/CD pipelines using Azure DevOps (TFS).
  • Familiarity with Confluent Schema Registry and serialization formats (Avro, JSON).
  • Good scripting/programming experience with Python, Java, or Scala.
  • Experience in cross-team collaboration (Data/DevOps/Cloud).
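For the Kafka-to-Snowflake integration mentioned above, a Snowflake sink connector configuration might look like this sketch. Account URL, user, database, schema, and topic names are placeholders; the example assumes key-pair authentication has been set up for the connector user.

```json
{
  "name": "snowflake-sink",
  "config": {
    "connector.class": "com.snowflake.kafka.connector.SnowflakeSinkConnector",
    "topics": "sqlserver.dbo.Orders",
    "snowflake.url.name": "myaccount.snowflakecomputing.com:443",
    "snowflake.user.name": "KAFKA_CONNECTOR",
    "snowflake.private.key": "${file:/secrets/sf.properties:private_key}",
    "snowflake.database.name": "RAW",
    "snowflake.schema.name": "KAFKA",
    "buffer.count.records": "10000",
    "buffer.flush.time": "60",
    "value.converter": "com.snowflake.kafka.connector.records.SnowflakeAvroConverter",
    "value.converter.schema.registry.url": "http://schema-registry:8081"
  }
}
```

For lower-latency delivery, the connector also supports setting snowflake.ingestion.method to SNOWPIPE_STREAMING instead of the default Snowpipe file-based ingestion.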

Preferred Qualifications

  • Confluent Certified Developer or Terraform Associate certification.
  • Familiarity with Snowpipe Streaming, Kafka REST Proxy, or custom connector development.
  • Understanding of Kafka security: SSL, SASL, RBAC.
  • Exposure to cloud-native Kafka solutions (e.g., Confluent Cloud on AWS/Azure).
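On the Kafka security point above, a typical client configuration for a SASL/SSL-secured cluster (for example, Confluent Cloud) looks like the fragment below; the API key and secret are placeholders.

```properties
# Encrypt traffic and authenticate with SASL credentials
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="<api-key>" password="<api-secret>";
# Verify the broker certificate matches its hostname
ssl.endpoint.identification.algorithm=https
```

Authorization on top of this (who may read or write which topics) is then handled by ACLs or, on Confluent platforms, RBAC role bindings.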