Salary: Negotiable
Location: Warwick, England, United Kingdom
Summary: The Senior Data Engineer role involves designing, developing, and maintaining scalable data pipelines using tools like Databricks and Azure Data Factory. The position requires ensuring data quality and consistency while collaborating with Data Architects and Analysts. Responsibilities also include troubleshooting data issues and mentoring team members to enhance their capabilities within the data engineering function.
Key Responsibilities:
- Build & Deliver: Design, develop, and maintain scalable data pipelines using Databricks, Azure Data Factory, and Python.
- Ensure Quality: Maintain data quality, consistency, and lineage across ingestion, transformation, and delivery layers.
- Orchestrate & Monitor: Implement orchestration, scheduling, and monitoring to ensure reliable data operations.
- Collaborate & Align: Work with Data Architects and Analysts to align pipelines with data models and target architecture.
- Troubleshoot & Optimise: Resolve data issues across development and production environments to maintain platform stability.
- Document & Share: Maintain clear technical documentation and contribute to shared engineering knowledge.
- Support & Mentor: Coach and support team members, helping to raise capability across the data engineering function.
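As a rough sketch of the quality and lineage responsibilities above, the split between clean and rejected records might look like this in plain Python (column names and rules are illustrative; a production pipeline would typically use PySpark on Databricks with Azure Data Factory handling orchestration):

```python
# Hypothetical raw records as they might arrive in an ingestion layer.
RAW_ROWS = [
    {"customer_id": "C001", "amount": "120.50", "currency": "GBP"},
    {"customer_id": "C002", "amount": "not-a-number", "currency": "GBP"},
    {"customer_id": None,   "amount": "75.00", "currency": "EUR"},
]

def transform(rows):
    """Cast amounts to float; route bad rows to a reject set for lineage."""
    clean, rejected = [], []
    for row in rows:
        try:
            if row["customer_id"] is None:
                raise ValueError("missing customer_id")
            clean.append({**row, "amount": float(row["amount"])})
        except (ValueError, TypeError) as exc:
            rejected.append({**row, "reject_reason": str(exc)})
    return clean, rejected

clean, rejected = transform(RAW_ROWS)
print(len(clean), len(rejected))  # → 1 2
```

Keeping rejected rows with a reject reason, rather than silently dropping them, is what makes data quality auditable across the transformation and delivery layers.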
Key Skills:
- Experience with Databricks and Azure Data Factory.
- Proficiency in Python programming.
- Strong understanding of data quality and consistency principles.
- Ability to implement orchestration and monitoring solutions.
- Collaboration skills to work with Data Architects and Analysts.
- Problem-solving skills for troubleshooting data issues.
- Experience in technical documentation and knowledge sharing.
- Mentoring and coaching abilities.
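To illustrate the orchestration and monitoring skills listed, here is a minimal retry-and-log task runner in plain Python (task names and retry counts are made up for the example; real deployments would lean on Azure Data Factory triggers or Databricks Workflows rather than hand-rolled loops):

```python
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("pipeline")

def run_with_retries(task, name, attempts=3, base_delay=0.01):
    """Run a task, logging failures and retrying with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            result = task()
            log.info("%s succeeded on attempt %d", name, attempt)
            return result
        except Exception as exc:
            log.warning("%s failed on attempt %d: %s", name, attempt, exc)
            if attempt == attempts:
                raise
            time.sleep(base_delay * 2 ** (attempt - 1))

# Illustrative flaky task: fails once with a transient error, then succeeds.
calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source timeout")
    return 42

print(run_with_retries(flaky_extract, "extract"))  # → 42
```

Logging each attempt before retrying is the monitoring half of the job: the warnings give operators a trail to follow when a pipeline degrades rather than failing outright.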
Salary (Rate): Negotiable
City: Warwick
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: Senior
Industry: IT