AWS Data Engineer

Posted 1 day ago by Randstad Digital

£245 per day
Inside IR35
Hybrid
London Area, United Kingdom

Summary: The AWS Data Engineer role in London involves designing data models and building scalable ETL pipelines to drive data integration across complex systems. Candidates should have over two years of experience in data engineering and a relevant degree. The position emphasizes collaboration with analysts and developers to turn raw data into business value. This is a six-month contract position with a flexible working arrangement of 2-3 days in the office.

Key Responsibilities:

  • Design & implement data models and scalable ETL/ELT pipelines
  • Map data sources, codify business logic, and build data flows
  • Develop data quality solutions & explore new technologies
  • Collaborate with analysts, developers, and business stakeholders

Key Skills:

  • 2+ years in data engineering or related roles
  • Bachelor’s in CS, Engineering, Mathematics, Finance, etc.
  • Proficiency in Python, SQL, and one or more of R, Java, or Scala
  • Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB)
  • Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi)
  • Bonus: experience with BI tools, API integrations, and graph databases

Salary (Rate): £245 daily

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: inside IR35

Seniority Level: Mid-Level

Industry: IT

Detailed Description From Employer:

We're Hiring: AWS Data Engineer

Location: London (2-3 days in office)

Experience: 2+ years

Degree: STEM/Business

Rate: £225 to £245 per day (Umbrella)

Duration: 6-month contract

We're looking for a Data Engineer to help power our innovation engine. You’ll design data models, build scalable ETL pipelines, codify business logic, and drive data integration across complex systems—structured and unstructured alike. This is your chance to turn raw data into real business value using cutting-edge tech in a collaborative, forward-thinking team.

What You’ll Do:

  • Design & implement data models and scalable ETL/ELT pipelines
  • Map data sources, codify business logic, and build data flows
  • Develop data quality solutions & explore new technologies
  • Collaborate with analysts, developers, and business stakeholders

What You Bring:

  • 2+ years in data engineering or related roles
  • Bachelor’s in CS, Engineering, Mathematics, Finance, etc.
  • Proficiency in Python, SQL, and one or more of R, Java, or Scala
  • Experience with relational/NoSQL databases (e.g., PostgreSQL, MongoDB)
  • Familiarity with big data tools (Hadoop, Spark, Kafka), cloud platforms (Azure, AWS, GCP), and workflow tools (Airflow, Luigi)
  • Bonus: experience with BI tools, API integrations, and graph databases

Why Join Us?

  • Work with large-scale, high-impact data
  • Solve real-world problems with a top-tier team
  • Flexible, fast-paced, and tech-forward environment

Apply now and help us build smarter, data-driven solutions.

#TechCareers #Innovation #Python #SQL #Spark #Kafka #Hadoop #DataEngineer #ETLDeveloper #BigDataEngineer #DataEngineering #AnalyticsJobs #HiringNow #JobOpening #Careers