Data Engineer (ETL, Azure, SQL, Python, Data Integration/Ingestion) - London and remote - 11 months+
Posted 3 days ago by Octopus Computer Associates
£350 Per day
Inside
Hybrid
London and remote, UK
Summary: The Data Engineer role focuses on designing, building, and maintaining data pipelines and workflows using the Databricks platform, with a strong emphasis on data integration and ingestion. The position requires a hands-on approach to data analysis and collaboration with various stakeholders to ensure reliable data sets. Candidates must be eligible for BPSS and work through PAYE via an umbrella company, with a hybrid working arrangement of 2-3 days onsite in London. The contract duration is expected to be over 11 months, with a daily rate of £350 inside IR35.
Key Responsibilities:
- Design, build and maintain data pipelines and ELT workflows on the Databricks platform using the Medallion architecture.
- Analyze data requirements and apply data modeling and quality techniques.
- Collaborate with data architects, analysts, product managers, and testers to deliver reliable data sets.
- Document data flows, transformation logic, and processes for knowledge sharing.
- Participate in agile teams, code reviews, solution design, and platform evolution.
Key Skills:
- Extensive hands-on experience in SQL, Python, and ETL tooling (Informatica IICS, ADF, Databricks).
- Proven experience in integrating and transforming data within the insurance/reinsurance market.
- Experience with both on-prem and cloud databases (Oracle, SQL Server).
- Familiarity with Agile methodologies and tools (Jira, Azure DevOps).
- Understanding of data security challenges and relevant technologies.
- Advanced communication skills and teamwork abilities.
- Professional certifications in Databricks and Azure are highly desired.
Salary (Rate): £350 per day
City: London
Country: UK
Working Arrangements: hybrid
IR35 Status: inside IR35
Seniority Level: undetermined
Industry: IT
Data Engineer (ETL, Azure, SQL, Python, Data Integration/Ingestion) - London and remote - 11 months+/rate: £350 per day inside IR35
One of our Blue Chip Clients is urgently looking for a Data Engineer.
For this role you will need to be onsite in London 2-3 days per week.
Please find some details below:
CONTRACTOR MUST BE ELIGIBLE FOR BPSS
MUST BE PAYE THROUGH UMBRELLA
Role Description:
Responsibilities
Strong hands-on skills that can be applied directly to deliverables and/or to ensuring the team is working effectively.
Design, build and maintain data pipelines and ELT workflows on the Databricks platform using the Medallion architecture
Analyse data requirements, apply data analysis techniques, and use data modelling (including data vault) and data quality techniques to establish, modify or maintain data structures and their associated components in complex environments
Partner with data architects, data analysts, product managers, and testers to deliver reliable data sets.
Document data flows, transformation logic and processes for knowledge sharing and ongoing support.
Passionate about solving problems; enjoys connecting the dots between data, strategy and analytics; obsessed with generating tangible benefits and high performance.
Collaborate in agile teams, participating in code reviews, solution design and platform evolution
Skills and Experience
Extensive hands-on experience in SQL, Python, and data integration/ingestion and their associated patterns; ETL tooling (Informatica IICS, ADF, notebooks, Databricks, Delta Lake); warehousing technologies and associated patterns; cloud platforms, Azure preferred.
Proven experience in integrating, modelling and transforming insurance domain data, ideally within the insurance/reinsurance market.
Experience with on-prem and cloud versions of databases such as Oracle and SQL Server.
Experience with Agile delivery frameworks/methodologies (eg Scrum, SAFe) and tools (eg Jira, Azure DevOps).
Experience with mass ingestion capabilities and cloud process flows, data quality, and master data management.
Understanding of data-related security challenges and tooling for specific technologies (eg Databricks).
Experience and in-depth knowledge of data delivery and associated architecture principles, data modelling concepts, and all steps of the data production process.
Advanced verbal and written communication skills, active listening, and teamwork.
Professional certifications in public cloud and tooling (Databricks and Azure) are highly desired.
Please send CV for full details and immediate interviews. We are a preferred supplier to the client.