Python Engineer - Data Pipelines (UK Remote)

Posted 7 days ago by Robson Bale Ltd

Summary: This Python Engineer role focuses on developing and optimizing cloud-based data pipelines within a collaborative team. Working closely with the client's Product Owner and Technical Lead, the engineer will ensure data systems perform and scale well; key responsibilities include designing asynchronous workflows and automating data processes. The position requires extensive experience in Python development and data engineering.

Salary (Rate): Negotiable

City: London

Country: UK

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

We're looking for an experienced Python Engineer to join a newly formed team working alongside a client's Product Owner, Technical Lead, and development team. You'll design, build, and optimize scalable, cloud-based data pipelines and ensure top performance in production environments.

Key Responsibilities:

  • Develop and maintain data ingestion and ETL pipelines using Python and tools like Airflow, Dagster, or Prefect (see the first sketch after this list)
  • Design asynchronous workflows and event-driven architectures with Kafka, RabbitMQ, or similar (see the second sketch after this list)
  • Optimize performance and reliability of large-scale data systems
  • Automate and monitor data workflows for scalability and observability
  • Collaborate with business and product teams to deliver robust, compliant data solutions
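
Purely illustrative, not part of the employer's brief: a minimal sketch of the kind of Airflow ETL pipeline the first responsibility describes, using the TaskFlow API (assumes Airflow 2.4+; the DAG name, schedule, and data are hypothetical placeholders).

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def listings_etl():
    """Toy daily ETL: extract raw records, normalize them, load downstream."""

    @task
    def extract() -> list[dict]:
        # Stand-in for pulling from an API, queue, or landing bucket.
        return [{"listing_id": 1, "price": "450000"}]

    @task
    def transform(records: list[dict]) -> list[dict]:
        # Normalize types so every downstream consumer sees consistent data.
        return [{**r, "price": int(r["price"])} for r in records]

    @task
    def load(records: list[dict]) -> None:
        # Production code would write to a warehouse; the sketch just logs.
        print(f"loaded {len(records)} records")

    load(transform(extract()))


listings_etl()
```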
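
And a similarly hypothetical sketch of the asynchronous, event-driven consumption the second responsibility mentions, here using the aiokafka client (the topic, broker address, and group id are placeholders, not details from the posting).

```python
import asyncio
import json

from aiokafka import AIOKafkaConsumer


async def consume_listing_events() -> None:
    consumer = AIOKafkaConsumer(
        "listing-updates",                   # hypothetical topic name
        bootstrap_servers="localhost:9092",  # placeholder broker address
        group_id="pipeline-workers",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    await consumer.start()
    try:
        # Each message triggers downstream pipeline work without blocking
        # the event loop, so many partitions can be serviced concurrently.
        async for msg in consumer:
            print(f"offset={msg.offset} payload={msg.value}")
    finally:
        await consumer.stop()


if __name__ == "__main__":
    asyncio.run(consume_listing_events())
```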

Requirements:

  • 6+ years of Python development experience focused on data pipelines
  • Expertise with SQL/NoSQL, Airflow/Dagster/Prefect, and message brokers
  • Hands-on experience with Docker, Kubernetes, and AWS/GCP/Azure
  • Strong knowledge of OOP and Agile practices
  • Excellent communication and problem-solving skills
  • Experience in real estate or mortgage data (preferred)
  • Familiarity with ML integration and data privacy/compliance (a plus)