Summary: The Azure Databricks Specialist role is a 6-month contract focused on modernising and integrating data environments for a leading organisation undergoing a significant data transformation. The position requires expertise in Azure Databricks, Salesforce integrations, and data pipeline optimisation. The role is fully remote and aims to enhance data workflows and replace legacy systems. Candidates should have a strong background in enterprise-level data solutions and integration challenges.
Salary (Rate): £60.00/hr
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Role: Azure Databricks Specialist (Contract)
Day Rate: £400-£450 (Outside IR35)
Duration: 6 months
Location: Fully Remote (UK-based candidates only)
Start Date: ASAP
We’re working with a leading organisation undergoing a major data transformation programme. They’re looking for an Azure Databricks Specialist to join on a 6-month fully remote contract, helping to modernise and integrate their data environment. The role focuses on Azure Databricks development, revamping Salesforce integrations, modernising data pipelines, and building a new integration layer to replace legacy systems.
Key Responsibilities:
- Lead development and optimisation of Azure Databricks pipelines and workflows.
- Work with Azure Data Factory and Azure Data Streaming to ingest, transform, and deliver data in real time.
- Collaborate with developers to make C# modifications as part of integration improvements.
- Rebuild and enhance data flows between Salesforce and other systems.
- Design and implement a new integration layer to replace old estate connections.
Key Skills & Experience:
- Proven experience as a Databricks Specialist in enterprise environments.
- Strong knowledge of Azure Data Factory and Azure Data Streaming.
- Solid C# skills for making application-level changes where required.
- Experience with Salesforce integrations (revamp, rebuild, optimisation).
- Previous involvement in building integration layers and migrating from legacy systems.
If you’re a Databricks expert who thrives on complex integration challenges and wants to work on a high-impact transformation project, apply now and we’ll be in touch if you are shortlisted.