Data Engineer

Posted 5 days ago by Square One Resources

£393.75 per day
Inside IR35
Hybrid
Birmingham, UK

Summary: The Data Engineer role involves working within a cross-functional delivery team to design, implement, and maintain robust data pipelines and infrastructure for a financial services project. The position requires strong DevOps knowledge and hands-on problem-solving skills, with a focus on cloud-based and containerized environments. The contract is set to last until October 31, 2025, with a start date of July 31, 2025. The role is based in Birmingham and offers a hybrid working arrangement.

Key Responsibilities:

  • Design, implement, and maintain CI/CD pipelines and related automation tooling.
  • Develop and manage Python-based applications and APIs for data operations.
  • Build and support data engineering workflows including ETL/ELT pipelines.
  • Operate within cloud-based and containerised environments with monitoring and logging tools.
  • Contribute to infrastructure management, including secrets management and networking configuration.
  • Provide operational support for large-scale big data platforms and services.

Key Skills:

  • Strong DevOps engineering skills including GitHub, Jenkins, GitHub Actions, Nexus, and SonarQube.
  • Proficiency in Linux environments with scripting knowledge (Groovy, Bash, Python).
  • Solid Python development experience, particularly with Flask, Dash, Pandas, and NumPy.
  • Expertise in data engineering tools and frameworks such as Spark, Airflow, SQL/NoSQL, and Delta Lake.
  • Experience with cloud infrastructure (GCP or internal cloud), Docker, Kubernetes, and Argo CD.
  • Familiarity with secrets management tools like Vault and networking protocols (TCP/IP, SSH, TLS).

Salary (Rate): £393.75 per day

City: Birmingham

Country: UK

Working Arrangements: Hybrid

IR35 Status: Inside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Title: Data Engineer
Location: Birmingham (Hybrid/Onsite as required)
Salary/Rate: £393.75 per day (Inside IR35)
Start Date: 31/07/2025
Job Type: Contract (until 31/10/2025)

Company Introduction

We have an exciting opportunity available with one of our leading consultancy partners working on a financial services project. They are currently seeking an experienced Data Engineer to join their growing DevOps and Data Platforms team on a contract basis.

Job Responsibilities/Objectives

As a Data Engineer, you will be working within a cross-functional delivery team to build, manage and support robust data pipelines and infrastructure. The role requires strong DevOps knowledge and a hands-on approach to problem-solving.

1. Design, implement, and maintain CI/CD pipelines and related automation tooling.
2. Develop and manage Python-based applications and APIs for data operations.
3. Build and support data engineering workflows including ETL/ELT pipelines.
4. Operate within cloud-based and containerised environments with monitoring and logging tools.
5. Contribute to infrastructure management, including secrets management and networking configuration.
6. Provide operational support for large-scale big data platforms and services.

Required Skills/Experience

1. Strong DevOps engineering skills including GitHub, Jenkins, GitHub Actions, Nexus, and SonarQube.
2. Proficiency in Linux environments with scripting knowledge (Groovy, Bash, Python).
3. Solid Python development experience, particularly with Flask, Dash, Pandas, and NumPy.
4. Expertise in data engineering tools and frameworks such as Spark, Airflow, SQL/NoSQL, and Delta Lake.
5. Experience with cloud infrastructure (GCP or internal cloud), Docker, Kubernetes, and Argo CD.
6. Familiarity with secrets management tools like Vault and networking protocols (TCP/IP, SSH, TLS).

Desirable Skills/Experience

1. Experience with MLflow, Starburst, S3 buckets, and Postgres databases.
2. Working knowledge of data formats like Parquet and Avro, and optimisation practices like partitioning.
3. Familiarity with service mesh technologies (e.g., Istio) and monitoring stacks (e.g., ELK).
4. Exposure to managing Hadoop or Spark clusters and experience with JVM/JDK and Cloudera.

If you are interested in this opportunity, please apply now with your updated CV in Microsoft Word/PDF format.

Disclaimer
Notwithstanding any guidelines given as to the level of experience sought, we will consider candidates from outside this range if they can demonstrate the necessary competencies.

Square One is acting as both an employment agency and an employment business, and is an equal opportunities recruitment business. Square One embraces diversity and will treat everyone equally. Please see our website for our full diversity statement.