Lead Azure Databricks Data Engineer (Databricks, ADF, Python) – Reinsurance / Lloyd’s Market
Posted 1 day ago by GIOS Technology
Negotiable
Undetermined
Hybrid
Manchester, England, United Kingdom
Summary: The Lead Azure Databricks Engineer will be responsible for designing, developing, and optimising enterprise-scale data platforms using Azure Data Services and Databricks. This role involves building scalable data pipelines and collaborating with various teams to ensure compliance with Lloyd’s of London regulatory requirements. The engineer will also implement data governance and best practices while providing technical leadership and mentoring to other engineers.
Key Responsibilities:
- Lead the design, development, and optimisation of enterprise-scale data platforms using Azure Data Services and Databricks.
- Build and manage scalable data pipelines using PySpark, Spark SQL, Delta Lake, and Lakehouse architecture.
- Collaborate with underwriting, actuarial, finance, risk, and reinsurance teams to deliver solutions aligned to Lloyd’s of London regulatory requirements.
- Implement data governance, lineage, security controls, and cloud best practices across the data platform.
- Drive CI/CD, automation, performance tuning, and cost optimisation using Azure DevOps and cloud-native engineering patterns.
- Provide hands-on technical leadership, mentor engineers, and resolve complex data engineering challenges end-to-end.
Key Skills:
- Azure
- Databricks
- PySpark
- Spark SQL
- Python
- Delta Lake
- Azure Data Factory
- Azure Data Lake
- Azure Functions
- Key Vault
- Unity Catalog
- Lakehouse
- Medallion Architecture
- Data Governance
- CI/CD
- Azure DevOps
- Data Modelling
- Cloud Security
- FinOps
- Lloyd’s Market
Salary (Rate): Undetermined
City: Manchester
Country: United Kingdom
Working Arrangements: Hybrid
IR35 Status: Undetermined
Seniority Level: Undetermined
Industry: IT
I am hiring for a Lead Azure Databricks Engineer (ADF/Data Lake/Key Vault/Azure Functions/Databricks/Python/PySpark/Reinsurance). Location: Manchester, UK (hybrid, 2 days onsite per week).