Principal Data Engineer

Posted 1 week ago by JIM - Jobs In Manchester

£87,000 Per year
Salford, England, United Kingdom

Summary: The Principal Data Engineer will support the Product Data Domain teams at the BBC by building ETL pipelines to ingest and transform data for key value use cases. This role involves collaboration within an agile multi-disciplinary team to ensure maximum value delivery. The position is permanent and full-time, based in Salford, England, with a focus on enhancing data quality and governance. The role is critical in driving engagement across the BBC's diverse digital product portfolio.

Key Responsibilities:

  • Leads and architects the development of robust, scalable, and complex data pipelines to ingest, transform, and analyse large volumes of structured and unstructured data from diverse sources.
  • Ensures pipelines are optimised for performance, reliability, and scalability in line with the BBC’s scale.
  • Lead initiatives to enhance data quality, governance and security across the organisation, ensuring compliance with BBC guidelines and industry best practices.
  • Prioritises stakeholder requirements and identifies the best solution for timely delivery.
  • Leads on building automation workflows including monitoring and alerting.
  • Encourages and mentors team members, in partnership with other disciplines, to create value with data across the wider organisation.
  • Helps set standards for coding, testing and other engineering practices.
  • Leads on the building and testing of business continuity & disaster recovery procedures per requirements.
  • Proactively evaluates and provides feedback on future technologies and new releases/upgrades based on deep understanding of the domain.

Key Skills:

  • Extensive (5+ years) experience in a data engineering or analytics engineering role, preferably in digital products, building ETL pipelines and ingesting data from a diverse set of data sources (including event streams and various forms of batch processing).
  • Excellent SQL and Python skills, with experience in deploying and scheduling code bases in a data development environment using technologies such as Airflow.
  • Good working knowledge of cloud-based Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake).
  • Demonstrable experience of working alongside cross-functional teams, interacting with Product Managers, Infrastructure Engineers, Data Scientists, and Data Analysts.
  • Strong stakeholder management skills, with the ability to prioritise, a structured approach, and the ability to bring others on the journey.

Salary (Rate): £87,000 yearly

City: Salford

Country: United Kingdom

Working Arrangements: undetermined

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

THE ROLE

JOB BAND: D

CONTRACT TYPE: Permanent, Full-time

DEPARTMENT: Product Group

LOCATION: Salford

PROPOSED SALARY RANGE: up to £87,000 depending on relevant skills, knowledge and experience. The expected salary range for this role reflects internal benchmarking and external market insights.

We’re happy to discuss flexible working. If you’d like to, please indicate your preference in the application – though there’s no obligation to do so now. Flexible working will be part of the discussion at offer stage.

PURPOSE OF THE ROLE

The Principal Data Engineer will support the Product Data Domain teams. You will help to build ETL pipelines to ingest and transform data, developing the data products that will power key value use cases across the BBC. You will work in an agile multi-disciplinary team alongside product analytics developers, product data managers, data modellers and data operations managers, ensuring that all work delivers maximum value to the BBC.

WHY JOIN THE TEAM

Product Group is responsible for the design, development, and delivery of the BBC’s portfolio of digital products. Including iPlayer, Sounds, Bitesize, and the BBC News and BBC Sport apps and website, our portfolio is diverse and contains some of the largest and highest-profile properties on the UK internet. We’re a huge streaming media destination, a news source trusted across the world, a provider of educational and entertaining content to children of all ages, a sports results, analysis, and commentary service, and much more besides. It’s an unparalleled portfolio of products, and our strength is our range and breadth. Working with the BBC’s content divisions, our focus now is on driving engagement across our portfolio so that the BBC online becomes a valued daily habit for all audiences, just as television and radio have been over the last century. Data is fundamental to our future: both in helping us prioritise and shape our work, and in creating richer, more personalised experiences for our audiences. And our portfolio means that we’ve got one of the widest, most diverse, and most exciting datasets to work with in the UK.

Your Key Responsibilities And Impact

  • Leads and architects the development of robust, scalable, and complex data pipelines to ingest, transform, and analyse large volumes of structured and unstructured data from diverse sources.
  • Ensures pipelines are optimised for performance, reliability, and scalability in line with the BBC’s scale.
  • Leads initiatives to enhance data quality, governance and security across the organisation, ensuring compliance with BBC guidelines and industry best practices.
  • Prioritises stakeholder requirements and identifies the best solution for timely delivery.
  • Leads on building automation workflows, including monitoring and alerting.
  • Encourages and mentors team members, in partnership with other disciplines, to create value with data across the wider organisation.
  • Helps set standards for coding, testing and other engineering practices.
  • Leads on the building and testing of business continuity & disaster recovery procedures per requirements.
  • Proactively evaluates and provides feedback on future technologies and new releases/upgrades based on deep understanding of the domain.

YOUR SKILLS AND EXPERIENCE

Essential Skills

  • Extensive (5+ years) experience in a data engineering or analytics engineering role, preferably in digital products, building ETL pipelines and ingesting data from a diverse set of data sources (including event streams and various forms of batch processing).
  • Excellent SQL and Python skills, with experience in deploying and scheduling code bases in a data development environment using technologies such as Airflow.
  • Good working knowledge of cloud-based Data Warehousing technologies (such as AWS Redshift, GCP BigQuery or Snowflake).
  • Demonstrable experience of working alongside cross-functional teams, interacting with Product Managers, Infrastructure Engineers, Data Scientists, and Data Analysts.
  • Strong stakeholder management skills, with the ability to prioritise, a structured approach, and the ability to bring others on the journey.

Desirable Skills

  • Ability to listen to others’ ideas and build on them.
  • Ability to clearly communicate to both technical and non-technical audiences.
  • Ability to collaborate effectively, working alongside other team members towards the team’s goals, and enabling others to succeed where possible.
  • Strong attention to detail.

DISCLAIMER This job description is a written statement of the essential characteristics of the job, with its principal accountabilities, incorporating a note of the skills, knowledge and experience required for a satisfactory level of performance. This is not intended to be a complete, detailed account of all aspects of the duties involved.