Summary: The Data Scientist role involves driving data quality and supporting cross-functional decision-making through the development of robust data pipelines and self-serve data products. The position requires expertise in analytics and data engineering, with a focus on metrics related to platform health and ML/AI adoption. This is a fully remote, six-month contract with an urgent hiring timeline. Candidates will be responsible for end-to-end data modelling and mentoring junior team members.
Key Responsibilities:
- Own end-to-end analytical data modelling in BigQuery using dbt.
- Build reliable data pipelines utilizing SQL, Python, and distributed processing frameworks (Apache Spark, Scio, Beam, or Flink).
- Develop clear, story-driven dashboards using tools like Looker or Tableau.
- Champion data quality, implement CI/CD for dbt models, and mentor junior team members.
Key Skills:
- 5+ years of experience in analytics or data engineering, with deep expertise in SQL.
- Extensive experience with dbt, a cloud data warehouse, and workflow orchestrators (Airflow, Dagster, Prefect, or Flyte).
- Proficiency in Python for data analysis and automation.
- Bonus: Experience with experimentation, ML/AI metrics, or platform productivity.
Salary (Rate): £699 daily
City: London
Country: UK
Working Arrangements: remote
IR35 Status: inside IR35
Seniority Level: undetermined
Industry: IT
Role: Data Scientist
Type: Contract (6 months)
Location: London, UK (REMOTE)
Working Model: Fully Remote
Payrate:
£412 - £512 per day on PAYE
£487 - £587 per day on rolled-up PAYE
£599 - £699 per day Inside IR35 via Umbrella
The Role: Join our team to drive data quality and empower cross-functional decision-making by building robust pipelines and self-serve data products. You will be essential in defining metrics for platform health, developer productivity, and ML/AI adoption.
This is an urgent vacancy: the hiring manager is shortlisting candidates for interview immediately. Please apply with a copy of your CV.
Randstad Technologies is acting as an Employment Business in relation to this vacancy.