Negotiable
Undetermined
Hybrid
London Area, United Kingdom
Summary: The Senior Data Tester role focuses on manual data testing within the insurance sector, specifically leveraging Azure and Databricks technologies. The position requires strong analytical skills to validate data and a deep understanding of the London Market insurance landscape. Candidates should possess advanced SQL and Python skills, along with experience in data validation frameworks. The role is hybrid, requiring 2-3 days per week in the London office.
Key Responsibilities:
- Conduct manual data testing and validation independently.
- Analyze and validate data with a strong understanding of London Market insurance.
- Utilize advanced SQL for complex data queries and performance tuning.
- Work hands-on with the Azure ecosystem and Databricks.
- Employ PySpark and Spark SQL for data validation tasks.
- Implement Azure Data Services for data management and processing.
- Develop automation and reusable validation frameworks using Python/scripting.
- Ensure data quality dimensions are met, including completeness and accuracy.
- Understand policy and claim identifiers in reconciliation contexts.
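As a sketch of what a reusable validation framework covering these data-quality dimensions might look like (function and field names here are purely illustrative, not part of the role description):

```python
# Minimal, illustrative sketch of reusable data-quality checks.
# All names (check_completeness, policy_id, premium) are hypothetical examples.

def check_completeness(rows, field):
    """Fraction of rows where `field` is present and non-empty."""
    if not rows:
        return 1.0
    filled = sum(1 for r in rows if r.get(field) not in (None, ""))
    return filled / len(rows)

def check_uniqueness(rows, field):
    """True if every value of `field` is distinct (e.g. a policy identifier)."""
    values = [r.get(field) for r in rows]
    return len(values) == len(set(values))

policies = [
    {"policy_id": "P001", "premium": 1200.0},
    {"policy_id": "P002", "premium": None},   # incomplete record
    {"policy_id": "P002", "premium": 900.0},  # duplicate identifier
]

print(check_completeness(policies, "premium"))  # 2 of 3 rows filled
print(check_uniqueness(policies, "policy_id"))  # False: P002 repeats
```

Small composable checks like these can be reused across datasets and wired into an automated suite, which is the kind of framework the role asks candidates to build.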
Key Skills:
- Data Testing
- London Market insurance
- Policy and Claims knowledge
- Data validation expertise
- Experience with Databricks
- Advanced SQL skills
- Proficiency in Python/scripting
- Understanding of data quality dimensions
- Experience with Azure Data Services
Salary (Rate): undetermined
City: London
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
I am hiring for a Senior Data Tester – Insurance (Azure & Databricks).
Location: London - Hybrid / 2-3 days per week in office.
Requirements:
- Strong manual data testing expertise with the ability to independently analyze and validate data.
- Strong understanding of London Market insurance (syndicates, brokers, coverholders, bordereaux).
- Advanced SQL skills (complex joins, CTEs, window functions, performance tuning).
- Hands-on experience with the Azure ecosystem and Databricks.
- Proficiency in PySpark and Spark SQL for data validation.
- Experience with Azure Data Services (ADLS, Azure Data Factory, Synapse/Databricks patterns).
- Strong Python/scripting skills for automation and reusable validation frameworks.
- Solid understanding of data quality dimensions (completeness, accuracy, consistency, uniqueness, timeliness).
- Understanding of policy/claim identifiers, endorsements, and adjustments in reconciliation contexts.
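To illustrate the kind of CTE and window-function query the role calls for, here is a small runnable sketch using Python's built-in sqlite3 (the schema and column names are hypothetical, invented only for this example): it picks the latest endorsement per policy, a typical step when reconciling policy records.

```python
import sqlite3

# Hypothetical schema for illustration only: each endorsement carries a
# sequence number per policy; reconciliation wants the latest per policy_id.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE endorsements (policy_id TEXT, seq INTEGER, premium REAL);
    INSERT INTO endorsements VALUES
        ('P001', 1, 1000.0),
        ('P001', 2, 1100.0),
        ('P002', 1, 500.0);
""")

# CTE + ROW_NUMBER window function: rank endorsements within each policy
# by descending sequence, then keep only the top-ranked (latest) row.
latest = conn.execute("""
    WITH ranked AS (
        SELECT policy_id, premium,
               ROW_NUMBER() OVER (PARTITION BY policy_id ORDER BY seq DESC) AS rn
        FROM endorsements
    )
    SELECT policy_id, premium FROM ranked WHERE rn = 1 ORDER BY policy_id
""").fetchall()

print(latest)  # [('P001', 1100.0), ('P002', 500.0)]
```

The same PARTITION BY / ROW_NUMBER pattern translates directly to Spark SQL on Databricks.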
Key Skills: Data Testing / London Market insurance / Policy / Claims / Data validation / Databricks