Salary (Rate): £500-£550 per day
City: London
Country: UK
Working Arrangements: Remote
IR35 Status: Outside IR35
Seniority Level: Mid-Level
Industry: Finance
Data Scientist required for a fast-growing FinTech organisation.
Location: Fully Remote (UK-based)
Sector: FinTech
Day Rate: £500-£550 p/d - Outside IR35
Duration: 6 months+
The Opportunity:
Are you a skilled Data Scientist with a passion for cloud technologies and big data? Join a high-performing consultancy delivering cutting-edge data solutions to clients in the FinTech sector.
Your Role:
As a Data Scientist, you'll work on high-impact projects supporting critical systems and helping clients meet evolving regulatory demands. You'll apply advanced analytics and data engineering techniques to transform complex datasets into actionable insights, using the latest tools across cloud and big data ecosystems.
What You'll Bring:
- 5+ years' experience in PySpark
- 4+ years of building scalable data solutions
- 2+ years working in a cloud environment
Technical Expertise:
- ETL/ELT tools: experience with at least two of the following - Apache Spark, Airflow, Python Pandas, Google Dataproc, Google Cloud Composer, AWS EMR, AWS Glue
- Streaming technologies: one or more - Apache Kafka, AWS Kinesis, GCP Pub/Sub
- Data warehousing/lakehouse: one or more - Snowflake, Starburst, Databricks, AWS Redshift/Athena/Glue, GCP BigQuery
- Serverless frameworks: AWS Lambda/Step Functions, GCP Cloud Functions, or Azure Functions
- Strong coding skills in Python and SQL
- Bonus points for experience with Apache Beam
If this role is of interest to you, please send your CV to Richard Burton.
