DataWorks DevOps Engineer

Posted 4 days ago by Stealth IT Consulting

£445 per day
Inside IR35
Remote, UK

Summary: We are looking for a senior DataWorks DevOps Engineer to support public sector data and analytics projects. The role involves building, deploying, and managing large-scale data pipelines and infrastructure, primarily in AWS cloud environments. Candidates must hold SC security clearance and be prepared for additional vetting. The position is fully remote, with a maximum day rate of £445 inside IR35.

Key Responsibilities:

  • Build, deploy, and manage CI/CD pipelines using GitLab CI or Concourse.
  • Design and implement infrastructure as code using Terraform.
  • Support big data tooling: Hadoop, Hive, Spark, and ideally Amazon EMR or equivalents.
  • Manage Linux-based systems and containerised workloads (Kubernetes/EKS optional).
  • Write scripts and automation tools using Python.
  • Ensure secure, reliable, and scalable deployment and monitoring of data systems.

Key Skills:

  • AWS: VPC, EC2, S3, Lambda, EMR, Athena, Redshift, Security Groups, ALB/ELB, KMS, and other relevant services
  • CI/CD pipelines: GitLab CI, Concourse
  • Infrastructure as code: Terraform
  • Big Data: Hadoop, Hive, Spark, Amazon EMR
  • Linux administration and shell scripting
  • Containerisation: Kubernetes/EKS (helpful but not mandatory)
  • Python scripting (advantageous)

Day Rate: £445

City: undetermined

Country: UK

Working Arrangements: remote

IR35 Status: inside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

DataWorks DevOps Engineer - Public Sector

Max Day Rate: £445, inside IR35
Location: Remote
Duration: 6 months
Security Clearance: SC required (additional vetting required; onboarding may take longer than usual)
Sector: Public

Job Description:

We are seeking a senior DataWorks DevOps Engineer to support public sector data and analytics initiatives. This role focuses on building, deploying, and managing large-scale data pipelines and infrastructure in cloud environments, primarily AWS.

Desirable/Additional Skills:

  • Network and connectivity management, Kong
  • Code and security analysis (SonarQube, Tenable, Vault)
  • Monitoring & logging: Prometheus, Grafana, Splunk, Stackdriver
  • Databases: DynamoDB, MySQL, Redshift
  • Azure infrastructure experience (VM, Blob, PaaS, networking)

Notes:

  • SC clearance is mandatory; additional vetting required.
  • Interview process: Technical (Level IV).
  • Fully remote role.

If this matches your skills and experience, please apply today!