Summary: The Data Engineer role involves designing, building, and maintaining ETL/ELT pipelines and workflows, with a focus on integrating data from various sources into Snowflake. The position requires strong collaboration with analysts and product managers to support data-driven decisions and ensure data quality. This is a remote position for candidates based in the UK, with an initial contract duration of 6 months. The role is classified as outside IR35.
Salary (Rate): £530 daily
City: London
Country: United Kingdom
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Data Engineer – HIRING ASAP – Outside IR35
Start date: ASAP
Duration: 6 months initially with a view to extend
Location: Remote (must be UK-based)
Rate: £400 - £530 per day (outside IR35)
Responsibilities
- Design, build, and maintain ETL/ELT pipelines and batch/streaming workflows.
- Integrate data from external APIs and internal systems into Snowflake and downstream tools.
- Use web scraping / browser automation to pull data from platforms that only expose UI-based data extracts (no APIs).
- Own critical parts of our Airflow-based orchestration layer and Kafka-based event streams (see the sketch after this list).
- Ensure data quality, reliability, and observability across our pipelines and platforms.
- Build shared data tools and frameworks to support analytics and reporting use cases.
- Partner closely with analysts, product managers, and other engineers to support data-driven decisions.
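
For a flavour of the pipeline work above, a minimal API-to-Snowflake batch job using Airflow's TaskFlow API might look like the sketch below (assuming Airflow 2.4+ and the snowflake-connector-python package); the endpoint, table, and credentials are illustrative placeholders, not details from the client's stack.

    from datetime import datetime

    import requests
    import snowflake.connector
    from airflow.decorators import dag, task


    @dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
    def api_to_snowflake():
        @task
        def extract() -> list[dict]:
            # Pull a batch of records from a hypothetical external API.
            resp = requests.get("https://api.example.com/v1/orders", timeout=30)
            resp.raise_for_status()
            return resp.json()["results"]

        @task
        def load(rows: list[dict]) -> None:
            # Load into a Snowflake staging table; in practice credentials
            # would come from an Airflow connection or a secrets backend.
            conn = snowflake.connector.connect(
                account="my_account", user="etl_user", password="change-me",
                warehouse="ETL_WH", database="RAW", schema="STAGING",
            )
            try:
                with conn.cursor() as cur:
                    cur.executemany(
                        "INSERT INTO orders (id, amount) VALUES (%(id)s, %(amount)s)",
                        rows,
                    )
            finally:
                conn.close()

        load(extract())


    api_to_snowflake()

In a real setup each task would be idempotent and parameterised by the run's logical date so that backfills and retries are safe.
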
Key Skills
- 3+ years of experience as a Data Engineer working on data infrastructure.
- Strong Python skills and hands-on experience with SQL.
- Experience with modern orchestration tools like Airflow.
- Experience with APIs and extracting data from APIs.
- Understanding of data modelling, governance, and performance tuning in warehouse environments.
- Comfort operating in a cloud-native environment like AWS.
- Terraform experience.
- Nice to have: Snowflake.
- Nice to have: web scraping via browser automation (e.g., Playwright, Selenium, Puppeteer; see the sketch below).
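
For the browser-automation nice-to-have, here is a minimal sketch of a UI-only export using Playwright's sync API, assuming `pip install playwright` and `playwright install chromium` have been run; the portal URL and the button selector are hypothetical stand-ins for a real vendor site.

    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto("https://portal.example.com/reports")
        # Trigger the UI's export button and capture the file it downloads.
        with page.expect_download() as download_info:
            page.click("text=Export CSV")
        download_info.value.save_as("daily_report.csv")
        browser.close()
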