AWS Data Engineer (DBT, Python, Airflow)

Posted 1 week ago by Robert Walters on JobServe

£500 per day
Outside IR35
Onsite
Leeds, UK
Contract AWS Data Engineer (DBT, Python, Airflow)

£450 - £500 per day

Outside IR35

Leeds (onsite 2-3 days per week)

6-month contract

As a Contract AWS Data Engineer, you'll play a vital role in our data engineering initiatives. Your mastery of DBT, Airflow, and Python will be pivotal in constructing efficient data pipelines that enable data-driven decision-making. Experience working on greenfield projects is highly desirable.

AWS Data Engineer (DBT, Python, Airflow) Role and Responsibilities:

  • Design, build, and maintain data pipelines using DBT, Airflow, and Python (a minimal illustration follows this list).
  • Build and optimise AWS-based data platforms.
  • Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
  • Develop ETL processes to clean, transform, and load data from diverse sources.
  • Ensure data quality, reliability, and performance across the pipeline.
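For context only, the sketch below is not part of the advert: it is a minimal example of the kind of DBT/Airflow/Python orchestration described above, with an Airflow DAG that runs dbt models and then dbt tests. The DAG name, schedule, and project/profiles paths are illustrative placeholders, not details taken from this role.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Minimal, illustrative DAG: run dbt models, then validate them with dbt tests.
# dag_id, schedule, and directory paths below are assumed placeholders.
with DAG(
    dag_id="example_dbt_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
    )
    test_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/project --profiles-dir /opt/dbt",
    )
    # Transform first, then test the transformed data.
    run_models >> test_models

In practice the dbt commands would point at the client's own project and warehouse profile; the snippet is only meant to show the orchestration pattern, not a definitive implementation.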

Desired Skills and Experience

The ideal AWS Data Engineer will have a background in the following:

  • Expertise in DBT (Data Build Tool) for transforming and modelling data.
  • Familiarity with Apache Airflow for orchestrating complex data workflows.
  • Proven experience as an AWS Data Engineer, with a strong portfolio of successful data pipeline projects.
  • Proficiency in Python programming for scripting and data manipulation.
  • Strong understanding of database concepts, ETL processes, and data warehousing.

If you are interested in this AWS Data Engineer (DBT, Python, Airflow) role, please apply with your latest CV for immediate consideration.

No sponsorship is offered, and applicants must be able to work from the Leeds office 2-3 days per week.

Robert Walters Operations Limited is an employment business and employment agency and welcomes applications from all candidates.