Summary: The role of Senior Software Engineer – Data Engineer involves designing and building Data Vault 2.0 architecture on Snowflake, focusing on creating scalable data pipelines and data marts for analytics. The position requires collaboration with various stakeholders to deliver business-ready data products while optimizing performance and maintaining data governance. This hybrid role is ideal for a passionate Data Engineer with hands-on experience in relevant technologies.
Salary (Rate): Negotiable
City: London
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: undetermined
Seniority Level: Senior
Industry: IT
We’re Hiring: Senior Software Engineer – Data Engineer
Location: London (Hybrid)
Contract Type: Contract
Are you a passionate Data Engineer with hands-on experience in Data Vault 2.0 and Snowflake? Ready to work on a high-impact project shaping modern data platforms? Join our growing team to build out a robust Data Vault 2.0 architecture on Snowflake, enabling scalable, reliable data products that power smarter decisions for internal teams and clients.
What You’ll Be Doing:
- Design and build Data Vault 2.0 architecture on Snowflake (see the illustrative sketch after this list)
- Develop scalable data pipelines (ETL/ELT) integrating diverse sources (APIs, databases, streaming)
- Create data marts that power critical reporting and analytics
- Collaborate with analysts, engineers, and stakeholders to deliver business-ready data products
- Optimise pipelines and queries for performance and cost efficiency
- Maintain solid documentation and uphold data governance best practices
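If Data Vault 2.0 is new to you, here is a rough flavour of the modelling style: a minimal, illustrative Python sketch of how a raw source row might be split into a hub record and a satellite record using MD5 hash keys. All names here (customer_id, hub_customer_hk, dv_hash, to_hub_and_satellite, crm_api) are assumptions for illustration only, not part of our actual Snowflake model.

    # Illustrative sketch only: Data Vault 2.0-style hash keys for a hypothetical
    # "customer" feed. Table, column, and source names are examples, not our model.
    import hashlib
    from datetime import datetime, timezone

    def dv_hash(*parts: str) -> str:
        # MD5 over pipe-delimited, trimmed, upper-cased business-key parts
        # (a common Data Vault 2.0 convention; teams may standardise differently).
        normalized = "|".join(p.strip().upper() for p in parts)
        return hashlib.md5(normalized.encode("utf-8")).hexdigest()

    def to_hub_and_satellite(source_row: dict, record_source: str) -> tuple[dict, dict]:
        # Split one source row into a hub record (business key only) and a
        # satellite record (descriptive attributes plus a change-detection hash).
        load_ts = datetime.now(timezone.utc).isoformat()
        hub_key = dv_hash(source_row["customer_id"])
        hub = {
            "hub_customer_hk": hub_key,
            "customer_id": source_row["customer_id"],
            "load_date": load_ts,
            "record_source": record_source,
        }
        descriptive = {k: v for k, v in source_row.items() if k != "customer_id"}
        satellite = {
            "hub_customer_hk": hub_key,
            # hash_diff over sorted attribute values lets the load detect changes
            "hash_diff": dv_hash(*(str(descriptive[k]) for k in sorted(descriptive))),
            "load_date": load_ts,
            "record_source": record_source,
            **descriptive,
        }
        return hub, satellite

    if __name__ == "__main__":
        row = {"customer_id": "C-1001", "name": "Acme Ltd", "country": "GB"}
        hub, sat = to_hub_and_satellite(row, record_source="crm_api")
        print(hub)
        print(sat)

In the real role this logic would typically live in Snowflake SQL or dbt models rather than application code; the sketch is only meant to show the hub/satellite split and hash-key idea.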
What We’re Looking For:
- Proven experience with Data Vault 2.0 (essential)
- Hands-on expertise with Snowflake
- Strong SQL skills and pipeline development
- Programming in Python, Java, or similar
- Experience with data streaming (Kafka, Kinesis) a bonus
- dbt experience is highly desirable
- A collaborative self-starter, comfortable in a dynamic, small-team environment
Bonus Points For:
- Financial Services or Asset Management data experience
- Strong understanding of modern data architecture principles
Why Join Us?
- Work on a major transformation project with real impact
- Opportunity to shape data architecture from the ground up
- Small, collaborative, high-performing team culture
- Hybrid working with flexibility
Interested? Let’s chat. Apply now or reach out directly!
#DataEngineer #Snowflake #DataVault #DataEngineering #ETL #DBT #DataPlatform #Hiring #TechJobs #LondonJobs #HybridJobs