Senior Data Engineer/Developer - Python, SQL, Azure - Investment Management
Posted Today by Strike IT Services
£850 Per day
Inside IR35
Hybrid
London/Hybrid, UK
Summary: The role of Senior Data Engineer involves working with a leading investment management organisation on a proprietary data integration product. The position requires hands-on expertise in Python, SQL, Azure, and Databricks, with a focus on building and optimising ETL pipelines and data integration solutions. The role is hybrid (three days per week in London) and is classified as inside IR35. Candidates should have extensive experience in enterprise data integration within the asset and investment management sector.
Key Responsibilities:
- Design, develop and optimise ETL pipelines using Python and SQL
- Build and maintain scalable data pipelines leveraging Azure and Databricks (Spark-based processing)
- Build new integrations into enterprise investment platforms (Aladdin, CRD, SimCorp Dimension, etc.)
- Build and maintain data pipelines within Azure-based infrastructure
- Enhance and refactor existing data ingestion and transformation logic
- Perform detailed data mapping across complex investment data sets
- Transform and normalise portfolio, holdings, transaction, and reference data
- Troubleshoot cross-system data inconsistencies
- Improve performance, scalability and reliability of data workflows
- Work closely with Data BAs and investment stakeholders to interpret data models and requirements
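As a purely illustrative sketch of the transform-and-normalise work described above (the field names such as `PortfolioCode` and `market_value` are hypothetical examples, not the platform's actual schema):

```python
from datetime import datetime
from decimal import Decimal

def normalise_holding(raw: dict) -> dict:
    """Normalise one raw holdings record into a standard shape.

    Field names here are hypothetical examples only, not the
    actual platform schema.
    """
    return {
        "portfolio_id": raw["PortfolioCode"].strip().upper(),
        "isin": raw["ISIN"].strip().upper(),
        # Source systems often disagree on date formats; standardise to ISO 8601.
        "as_of_date": datetime.strptime(raw["AsOfDate"], "%d/%m/%Y").date().isoformat(),
        # Decimal avoids binary-float rounding on monetary values.
        "market_value": Decimal(raw["MarketValue"].replace(",", "")),
        "currency": raw["Ccy"].upper(),
    }

raw = {
    "PortfolioCode": " eq-glbl-01 ",
    "ISIN": "gb00b03mlx29",
    "AsOfDate": "31/12/2024",
    "MarketValue": "1,250,000.50",
    "Ccy": "gbp",
}
print(normalise_holding(raw))
```

In practice this kind of record-level logic would typically run inside a Spark/Databricks job rather than plain Python, but the shape of the mapping is the same.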
Key Skills:
- Strong hands-on development experience in Python
- Advanced SQL development skills, including performance tuning and complex query optimisation
- Proven experience building and maintaining ETL/data integration pipelines
- Strong experience with Databricks, including Spark-based data processing and optimisation
- Experience developing within Microsoft Azure environments (Data Factory, storage, orchestration, or similar services)
- Strong understanding of data modelling and relational database design
- Experience working with investment management data sets such as:
  - Portfolio & holdings data
  - Transactions & cash movements
  - Security master & instrument reference data
  - Positions & valuations
  - Benchmarks
  - Client & mandate hierarchies
- Experience integrating with enterprise investment platforms (OMS, PMS, IBOR, ABOR or similar)
- Ability to interpret and map complex financial data structures
- Strong analytical and problem-solving skills
- Experience of working within Asset & Investment Management
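For example, the cross-system troubleshooting and IBOR/ABOR integration skills above often come down to reconciling positions between books of record. A minimal sketch, assuming hypothetical security identifiers and an illustrative tolerance:

```python
def reconcile_positions(ibor: dict[str, float], abor: dict[str, float],
                        tolerance: float = 0.01) -> list[tuple[str, float, float, float]]:
    """Report position breaks between an IBOR and an ABOR snapshot.

    Keys are hypothetical security identifiers; values are quantities.
    Returns one (security_id, ibor_qty, abor_qty, difference) tuple per break.
    """
    breaks = []
    # The union of identifiers catches positions present in only one system.
    for sec_id in sorted(ibor.keys() | abor.keys()):
        ibor_qty = ibor.get(sec_id, 0.0)
        abor_qty = abor.get(sec_id, 0.0)
        if abs(ibor_qty - abor_qty) > tolerance:
            breaks.append((sec_id, ibor_qty, abor_qty, ibor_qty - abor_qty))
    return breaks

ibor = {"GB00B03MLX29": 1000.0, "US0378331005": 250.0}
abor = {"GB00B03MLX29": 1000.0, "US0378331005": 200.0, "DE0007164600": 50.0}
print(reconcile_positions(ibor, abor))
```

This flags both quantity mismatches and positions held in only one system, which is the typical starting point for diagnosing cross-system data inconsistencies.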
Salary (Rate): £850 per day
City: London
Country: UK
Working Arrangements: hybrid
IR35 Status: inside IR35
Seniority Level: Senior
Industry: IT
We are supporting a leading organisation in the investment management sector that is looking for a Senior Data Engineer to work on a proprietary investment data integration product.
The platform is an enterprise ETL and integration layer that connects to leading front-to-back investment management systems and delivers standardised, high-quality investment data into a modern cloud environment.
This role requires a highly hands-on engineer with strong Python, SQL, Azure and Databricks experience, alongside deep enterprise data integration expertise within Asset & Investment Management.
INSIDE IR35
HYBRID WORKING 3 DAYS IN LONDON