£60 per hour
Outside IR35
Remote
United Kingdom
Summary: The role of Data Engineer (Azure Databricks) involves working on a major data transformation programme for a leading organisation. The position is fully remote and focuses on modernising data pipelines, revamping Salesforce integrations, and building a new integration layer. The contract runs for 6 months, with an emphasis on Azure Databricks and related technologies. The ideal candidate will have proven Databricks expertise and experience with enterprise-level data integration challenges.
Salary (Rate): £60.00/hr
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Role: Data Engineer (Azure Databricks) (Contract)
Day Rate: £400-£450 (Outside IR35)
Duration: 6 months
Location: Fully Remote (UK-based candidates only)
Start Date: ASAP
We’re working with a leading organisation undergoing a major data transformation programme. They’re looking for a Data Engineer who is a Databricks specialist to join on a 6-month, fully remote contract, helping to modernise and integrate their data environment. The role focuses on Azure Databricks development, revamping Salesforce integrations, modernising data pipelines, and building a new integration layer to replace legacy systems.
Key Responsibilities:
- Lead development and optimisation of Azure Databricks pipelines and workflows.
- Work with Azure Data Factory and Azure Data Streaming to ingest, transform, and deliver data in real time.
- Collaborate with developers to make C# modifications as part of integration improvements.
- Rebuild and enhance data flows between Salesforce and other systems.
- Design and implement a new integration layer to replace old estate connections.
Key Skills & Experience:
- Proven experience as a Databricks Specialist in enterprise environments.
- Strong knowledge of Azure Data Factory and Azure Data Streaming.
- Solid C# skills for making application-level changes where required.
- Experience with Salesforce integrations (revamp, rebuild, optimisation).
- Previous involvement in building integration layers and migrating from legacy systems.
If you’re a Databricks expert who thrives on complex integration challenges and wants to work on a high-impact transformation project, apply now. We’ll be in touch if you’re shortlisted.