Data Modeller with Insurance and Snowflake
Posted 1 week ago by DELTACLASS TECHNOLOGY SOLUTIONS LIMITED
Negotiable
Greater London, England, United Kingdom
Summary: This Data Modeller role supports an insurance organisation's transition to a data-driven operating model by designing scalable data models and enterprise-scale, cloud-hosted analytics capabilities. The role involves close collaboration with data engineers, analysts, and business teams to deliver high-quality, secure data for analytics and BI while meeting governance and compliance standards, and includes optimising data models and contributing to the overall data architecture. Strong insurance-domain expertise and advanced data modelling skills are essential.
Key Responsibilities:
- Support the organisation’s shift to a data-driven operating model by translating business needs into scalable data models.
- Build enterprise-scale, cloud-hosted analytics capabilities.
- Enable trusted, secure, and high-quality data for analytics/BI.
- Work across insurance data domains (policy, claims, underwriting, finance).
- Partner closely with data engineers, analysts, and business teams.
- Ensure data platforms meet performance, governance, and compliance needs.
- Operate within an Agile, collaborative delivery environment.
- Design and maintain conceptual, logical, and physical data models.
- Apply star and snowflake schemas for analytical workloads.
- Analyse business requirements and translate them into data models.
- Contribute to enterprise data architecture and platform design.
- Work with data engineers to implement models using dbt.
- Optimise data models and tables for query performance.
- Collaborate across teams using Agile tools and ways of working.
- Ensure data models comply with governance and GDPR standards.
- Maintain data documentation, dictionaries, and lineage.
Key Skills:
- Strong experience in the insurance data domain.
- Advanced data modelling expertise across multiple schemas.
- Hands-on experience with data warehousing and data lakes.
- Proficiency in SQL, including performance tuning.
- Experience with AWS data services (Athena, Redshift).
- Understanding of analytical and transactional databases.
- Experience aligning data architecture with GDPR requirements.
- Strong communication skills with technical and non-technical stakeholders.
- Knowledge of dbt and Airflow for data pipelines is a bonus.
Salary (Rate): Negotiable
City: Greater London
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT