£700 per day
Outside IR35
Remote
United Kingdom
Summary: A leading Databricks Partner is seeking multiple Resident Solutions Architects for fully remote contracts outside IR35. The role involves leading the design, build, and deployment of scalable data and AI solutions on the Databricks platform, focusing on impactful customer engagements. Candidates should have extensive experience in Data Engineering and a strong background in Databricks implementations. The position requires technical leadership and collaboration with various stakeholders to ensure successful project delivery.
Salary (Rate): £700 daily
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Contract | Outside IR35 | Fully Remote | £600–£700 per day
A leading Databricks Partner is seeking multiple Resident Solutions Architects (and Senior Data Engineers) for immediate outside-IR35, fully remote contracts. You will work on cutting-edge customer engagements, solving big data challenges on the Databricks platform. As a trusted consultant, you'll lead the design, build, and deployment of scalable data and AI solutions, ensuring clients unlock maximum value from their data.
Key Responsibilities:
- Deliver impactful customer technical projects including reference architectures, solution accelerators, and production-ready deployments.
- Lead end-to-end design, implementation, and optimisation of Databricks-based data and AI platforms.
- Scope professional services engagements in collaboration with engagement managers and client stakeholders.
- Provide technical leadership in Databricks implementations, migrations, and integrations with client systems.
- Offer escalated technical support for operational issues, ensuring rapid resolution.
- Collaborate closely with Databricks engineering, project managers, and client teams to meet delivery objectives.
- Produce high-quality documentation, run workshops, and deliver technical training.
Required Skills & Experience:
- Databricks Certified at Professional or Champion level.
- Proven track record delivering multiple Databricks projects in client-facing environments.
- 7–10+ years’ experience in Data Engineering, Data Platforms, and Analytics Consulting.
- Deep expertise in Apache Spark™ and PySpark (knowledge of Spark runtime internals desirable).
- Strong background in performance tuning and optimisation for scalability.
- CI/CD for production deployments; working knowledge of MLOps.
- Proficiency in Python or Scala.
- Experience across at least one major cloud ecosystem (AWS, Azure, GCP), with working knowledge of two or more.
- Ability to design and deploy performant end-to-end data architectures.
- Strong consulting skills – confident working with clients, managing scope, and handling conflicts.
- Excellent documentation and whiteboarding skills.