£700 Per day
Undetermined
Remote
England, United Kingdom
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
I'm working with my client on a UK-based, fully remote contract opportunity focused on Platform Excellence within Databricks. This role is all about promoting best-practice use of Databricks across the organisation by building reusable patterns, Python SDKs/libraries, and pragmatic frameworks that teams can adopt quickly and consistently.
The opportunity
You'll sit at the intersection of software engineering, data engineering, and MLOps, helping teams standardise how they build, run, and govern workloads on Databricks - improving developer experience, reliability, scalability, and performance.
Key responsibilities
- Design and build reusable Python packages/SDKs to accelerate delivery across teams
- Create best-practice frameworks and reference implementations (patterns, templates, guardrails)
- Drive improvements across the Databricks estate (e.g., jobs/workflows, cluster optimisation, Unity Catalog, MLflow)
- Partner closely with stakeholders to understand pain points and turn them into practical platform solutions
- Produce clear documentation so engineering and data science teams can self-serve and adopt consistently
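To give a flavour of the "patterns, templates, guardrails" responsibility above, here is a minimal sketch of the kind of reusable helper a platform package might expose. All names and policy values (the runtime allow-list, the worker cap) are hypothetical illustrations, not the client's actual standards:

```python
# Illustrative guardrail pattern: a frozen config class that validates
# cluster settings against (hypothetical) platform policy at creation time,
# so every team gets safe defaults without reading a policy document.
from dataclasses import dataclass

# Hypothetical allow-list a platform team might maintain centrally
ALLOWED_RUNTIMES = {"14.3.x-scala2.12", "15.4.x-scala2.12"}


@dataclass(frozen=True)
class JobClusterSpec:
    runtime: str
    max_workers: int = 8            # conservative default
    autotermination_minutes: int = 30  # avoid idle-cluster cost by default

    def __post_init__(self) -> None:
        # Guardrails: fail fast on out-of-policy configuration
        if self.runtime not in ALLOWED_RUNTIMES:
            raise ValueError(f"runtime {self.runtime!r} is not on the approved list")
        if self.max_workers > 32:
            raise ValueError("max_workers exceeds the platform cap of 32")


spec = JobClusterSpec(runtime="15.4.x-scala2.12", max_workers=16)
print(spec.autotermination_minutes)  # prints 30 - the default applied automatically
```

A package like this gives teams a single, documented entry point, which is the "adopt quickly and consistently" goal the role describes.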
Priority skills / experience (what my client is looking for)
- Advanced Python (packaging, modular design, dependency management)
- Proven experience building reusable libraries / internal SDKs
- Strong Databricks experience (jobs/workflows, MLflow, Unity Catalog, optimisation)
- Strong software & platform engineering mindset (API/system design, scalability, performance)
- Strong MLOps knowledge (lifecycle management, reproducibility, monitoring)
- Solid data engineering fundamentals (Spark, distributed computing, pipelines)
- Strong documentation and developer enablement focus
- Nice to have: Machine Learning / Deep Learning / GenAI exposure
What makes someone successful here
- Excellent communication and stakeholder engagement
- Self-starter who works well with minimal direction
- Pragmatic delivery mindset (balancing best practice with business reality)
- Patient, coaching-oriented approach and strong problem-solving skills