Summary: The Data Specialist role within the Markets Program Execution & Transformation team focuses on delivering robust data solutions and managing changes in response to regulatory requirements. This position involves developing PySpark and SQL queries, contributing to architecture discussions, and leading high-impact projects. The role requires collaboration with various stakeholders to ensure efficient and scalable data processes. The ideal candidate will have a strong technical background in data analysis and a proven ability to communicate complex issues effectively.
Salary (Rate): undetermined
City: Belfast
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: Other
Markets Program Execution & Transformation – Data Acquisition Team
The Markets Program Execution & Transformation team partners with all Global Markets businesses and various functions—including Legal, Compliance, Finance, and Operations & Technology—to identify, mobilize, and deliver regulatory and cross-business transformation initiatives. The team's core mission is to design and implement integrated solutions that are efficient, scalable, and client-focused. This role sits within the Data Acquisition Team, playing a critical part in supporting multiple projects and workstreams. The focus is on assessing and delivering robust data solutions and managing changes that impact diverse stakeholder groups in response to regulatory rulemaking, supervisory requirements, and discretionary transformation programs.
Key Responsibilities:
- Develop PySpark and SQL queries to analyze, reconcile, and interrogate data.
- Provide actionable recommendations to improve reporting processes—e.g., enhancing data quality, streamlining workflows, and optimizing query performance.
- Contribute to architecture and design discussions in a Hadoop-based environment.
- Translate high-level architecture and requirements into detailed design and code.
- Lead and guide complex, high-impact projects across all stages of development and implementation while ensuring adherence to key processes.
- Champion continuous improvement in areas such as code quality, testability, and system reliability.
- Act as a subject matter expert (SME) to senior stakeholders and cross-functional teams.
- Produce key project documentation, including Business Requirements Documents (BRDs), Functional Requirements Documents (FRDs), UAT plans, test scenarios, and project plans for technical deliverables.
- Manage day-to-day project activities, including setting milestones, tracking tasks, coordinating deliverables, and ensuring timely, high-quality execution.
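To illustrate the kind of reconciliation work described above, here is a minimal sketch of a SQL break report comparing a source feed against a reporting table. It uses SQLite for self-containment; the table and column names (`trades_source`, `trades_report`, `notional`) are hypothetical, and in the role itself the same query pattern would typically run via Spark SQL over Hive/Impala tables.

```python
import sqlite3

# Hypothetical source and reporting tables loaded with sample rows.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE trades_source (trade_id TEXT PRIMARY KEY, notional REAL);
CREATE TABLE trades_report (trade_id TEXT PRIMARY KEY, notional REAL);
INSERT INTO trades_source VALUES ('T1', 100.0), ('T2', 250.0), ('T3', 75.0);
INSERT INTO trades_report VALUES ('T1', 100.0), ('T2', 260.0);
""")

# Emulate a full outer join with LEFT JOIN + UNION to flag breaks:
# rows missing on either side, or present on both with mismatched notionals.
breaks = cur.execute("""
SELECT s.trade_id, s.notional AS source_notional, r.notional AS report_notional
FROM trades_source s LEFT JOIN trades_report r USING (trade_id)
WHERE r.trade_id IS NULL OR s.notional <> r.notional
UNION
SELECT r.trade_id, s.notional, r.notional
FROM trades_report r LEFT JOIN trades_source s USING (trade_id)
WHERE s.trade_id IS NULL
""").fetchall()
```

The output lists one row per break: `T2` disagrees on notional and `T3` is missing from the reporting side.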
Required Skills & Experience:
- Proficiency in SQL, Python, and Spark.
- Minimum 5 years of hands-on technical data analysis experience.
- Familiarity with Hadoop/Big Data environments.
- Understanding of Data Warehouse/ETL design and development methodologies.
- Ability to perform under pressure and adapt to changing priorities or requirements.
- Strong communication skills—capable of producing detailed documentation and translating complex technical issues for non-technical audiences.
- Self-motivated and able to work independently.
Preferred Qualifications:
- Background in investment banking or financial services.
- Hands-on experience with Hive, Impala, and the wider Hadoop/Spark ecosystem (e.g., HDFS, Apache Spark, Spark SQL, UDFs, Sqoop).
- Proven experience building and optimizing big data pipelines, architectures, and data sets.
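As a rough illustration of the pipeline work mentioned above, the sketch below shows the extract-transform-load pattern in plain Python. The file contents and column names are hypothetical; in a Hadoop/Spark environment each stage would typically be a DataFrame read, transformation, and write rather than these stdlib stand-ins.

```python
import csv
import io

# Hypothetical raw feed; T2 has a missing notional and should be filtered out.
RAW = "trade_id,notional,currency\nT1,100,USD\nT2,,USD\nT3,75,GBP\n"

def extract(text):
    """Parse CSV rows into dicts (stand-in for spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows with missing notionals and cast types (a data-quality step)."""
    return [
        {"trade_id": r["trade_id"], "notional": float(r["notional"]),
         "currency": r["currency"]}
        for r in rows if r["notional"]
    ]

def load(rows):
    """Aggregate notionals per currency (stand-in for a warehouse write)."""
    totals = {}
    for r in rows:
        totals[r["currency"]] = totals.get(r["currency"], 0.0) + r["notional"]
    return totals

result = load(transform(extract(RAW)))
```

Keeping each stage a pure function over row collections mirrors how Spark pipelines are usually structured, which makes individual stages easy to unit-test.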