Senior Data Engineer

Posted Today by Pontoon

£550 Per day
Inside IR35
Hybrid
Warwick, Warwickshire

Summary: The Senior Data Engineer role at Pontoon involves enhancing the Interconnectors Data Platform (ICDP) for a client in the Utilities sector. The position requires a blend of technical expertise in data architecture and leadership skills to drive innovation in a hybrid working environment. This temporary role spans three months with potential for extension, focusing on modernizing data ingestion and transformation processes. Candidates should possess strong experience in Azure and data pipeline development.

Salary (Rate): £550/day

City: Warwick

Country: United Kingdom

Working Arrangements: Hybrid

IR35 Status: inside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Pontoon is an employment consultancy. We put expertise, energy, and enthusiasm into improving everyone's chance of being part of the workplace. We respect and appreciate people of all ethnicities, generations, religious beliefs, sexual orientations, gender identities, and more. We do this by showcasing their talents, skills, and unique experience in an inclusive environment that helps them thrive.

Join Our Team as a Senior Data Engineer!

Are you a passionate Data Engineer with a flair for innovation? Do you thrive in a dynamic environment where your skills can shape the future of data architecture? If so, we have the perfect opportunity for you! Our client, a leader in the Utilities sector, is seeking a Senior Data Engineer for a temporary role of 3 months.

Role: Senior Data Engineer

Duration: 3 Months (extension options)

Location: Warwick (Hybrid - 1 day on site)

Rate: £500-£550 per day (umbrella)

Role Overview: As a Senior Data Engineer, you will play a pivotal role in enhancing the Interconnectors Data Platform (ICDP), a cloud-based data warehouse that underpins commercial, financial-modeling, and operational decision-making. As the platform evolves towards a modernized Medallion Architecture and Azure-native ingestion patterns, you will provide architectural direction and technical leadership.

Key Responsibilities:

Data Architecture & Platform Engineering:

  • Lead the design and implementation of scalable data architectures using Bronze/Silver/Gold layered models.
  • Shape the platform's architectural roadmap, ensuring alignment with cutting-edge engineering practices.
  • Develop secure and observable ingestion and transformation pipelines.

Pipeline Development & Operations:

  • Spearhead the migration from legacy ETL tools to modern Azure-based pipelines, using Azure Functions, Azure Data Factory (ADF), and event-driven frameworks.
  • Build and maintain high-performance SQL transformations, curated layers, and reusable data models.
  • Embed CI/CD, testing, version control, and observability into workflows.

Data Quality & Governance:

  • Ensure robust data validation, reconciliation, profiling, and auditability across platform layers.
  • Collaborate with business stakeholders to ensure analytical and operational needs are met.

Leadership:

  • Mentor fellow data engineers, fostering technical growth within the ICDP team.
  • Collaborate with Product teams, IT&D, and external partners to achieve high-quality outcomes.
  • Serve as a technical authority on engineering approaches, patterns, and standards.

Required Skills & Experience:

Essential Technical Skills:

  • Python: Strong hands-on experience in building production-grade data pipelines and orchestration.
  • Advanced SQL: Expert-level skills in analytical SQL, query optimization, and data modeling.
  • Azure Cloud: Familiarity with Azure Functions, Azure Data Factory, Azure Storage, and cloud security fundamentals.
  • Data Warehousing: In-depth understanding of data architecture principles and scalable enterprise data design.
  • Version Control: Proficient in Git, CI/CD, automated testing, and modern engineering practices.
  • Pipeline Design: Experience with API ingestion, SFTP ingestion, and resilient pipeline design.

Soft Skills:

  • Exceptional problem-solving and architectural thinking abilities.
  • Strong communication and stakeholder collaboration skills.
  • Capability to lead and provide clarity in complex technical environments.

Desirable Experience:

  • Involvement in data-platform re-architecture programs.
  • Exposure to Medallion/Lakehouse patterns or Databricks-style ecosystems.
  • Experience in regulated or high-assurance data environments.

Why Join Us?

This is your chance to be part of a transformative journey in the Utilities industry! Not only will you be enhancing your skills, but you will also contribute to a vital platform that impacts decision-making at every level.

If you're ready to take on this exciting challenge and make a significant impact, we want to hear from you! Apply now and become a key player in our client's innovative team!

Candidates should ideally demonstrate evidence of the above in their CV to be considered.

Please be advised that if you have not heard from us within 48 hours, your application has unfortunately not been successful on this occasion. We may, however, keep your details on file for any suitable future vacancies and contact you accordingly.

We use generative AI tools to support our candidate screening process. This helps us ensure a fair, consistent, and efficient experience for all applicants. Rest assured, all final decisions are made by our hiring team, and your application will be reviewed with care and attention.