£430 per day
Outside
Hybrid
Bristol
Summary: The role of DV Cleared Data Engineer focuses on designing and deploying secure, high-performance data solutions using the Elastic Stack and Apache NiFi. This contract position requires a technical expert to build robust, real-time data pipelines in a security-focused environment, particularly within regulated sectors. The ideal candidate will have a strong track record in data engineering and a hands-on approach to managing data flows. This opportunity is suited for professionals looking to make a significant impact in their field.
Salary (Rate): £430.00 daily
City: Bristol
Country: United Kingdom
Working Arrangements: Hybrid
IR35 Status: Outside IR35
Seniority Level: Mid-Level
Industry: IT
Detailed Description From Employer:
DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi
Location: Bristol | Rate: £430.00 per day (Outside IR35) | Working Pattern: Hybrid (3-4 days on-site)
Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi to take the lead in building robust, real-time data pipelines in a security-focused environment.
This is a hands-on contract opportunity to make a real impact, ideal for professionals with a strong track record in regulated sectors.
What You'll Be Doing
Designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi
Handling real-time data ingestion and transformation with an emphasis on integrity and availability (a short ingestion sketch follows this list)
Collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs
Monitoring and optimising high-throughput data flows across on-prem and cloud environments
Building insightful Kibana dashboards to support business intelligence and operational decision-making
Maintaining documentation of data flows, architecture, and security procedures to ensure audit-readiness
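To give a flavour of the ingestion work described above, here is a minimal sketch using the official Elasticsearch Python client. The endpoint, index name, and document shape are illustrative assumptions, not details of the actual environment; in the role itself, NiFi flows would typically handle this stage.

```python
# Minimal real-time-style ingestion sketch using the official
# Elasticsearch Python client. Endpoint, index name and document
# shape are illustrative assumptions only.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch, helpers

# Hypothetical secure cluster endpoint; a real deployment would use
# TLS certificates and credentials issued by the environment.
es = Elasticsearch("https://localhost:9200", verify_certs=True)

def event_stream():
    """Stand-in for a real-time source (e.g. a NiFi-fed queue)."""
    for i in range(3):
        yield {"event_id": i, "status": "ok"}

def to_actions(events, index="events-demo"):
    """Wrap raw events as bulk actions, stamping ingest time."""
    for event in events:
        yield {
            "_index": index,
            "_source": {**event, "@timestamp": datetime.now(timezone.utc).isoformat()},
        }

# Bulk-index the batch; helpers.bulk returns (success_count, errors).
ok, errors = helpers.bulk(es, to_actions(event_stream()))
print(f"indexed={ok} errors={errors}")
```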
Your Experience
Must-Have:
Minimum 3 years' experience as a Data Engineer in sensitive or regulated industries
Proficiency in the full Elastic Stack for data processing, analytics, and visualisation
Hands-on expertise designing sophisticated data workflows in Apache NiFi
Solid scripting capabilities using Python, Bash, or similar
Familiarity with best practices in data protection (encryption, anonymisation, access control); a brief anonymisation sketch follows this list
Experience managing large-scale, real-time data pipelines
Working knowledge of cloud services (AWS, Azure, GCP), especially around secure deployment
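As a brief illustration of the anonymisation point above, the following sketch pseudonymises PII fields with a keyed hash before records are indexed. The field names and key handling are assumptions for illustration only; real controls would follow the programme's security policy.

```python
# Illustrative field-level pseudonymisation before indexing.
# Field names and the keyed-hash approach are assumptions only.
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-secret"  # e.g. issued via a vault/KMS
PII_FIELDS = {"username", "email"}           # hypothetical PII fields

def pseudonymise(record: dict) -> dict:
    """Replace PII values with keyed HMAC-SHA256 digests so records
    remain joinable without exposing the raw values."""
    safe = dict(record)
    for field in PII_FIELDS & safe.keys():
        digest = hmac.new(SECRET_KEY, str(safe[field]).encode(), hashlib.sha256)
        safe[field] = digest.hexdigest()
    return safe

print(pseudonymise({"username": "jdoe", "email": "jdoe@example.com", "status": "ok"}))
```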
Nice-to-Have:
Background in government, defence, or highly regulated sectors
Exposure to big data tools like Kafka, Spark, or Hadoop (a Kafka-based sketch follows this list)
Understanding of containerisation and orchestration (e.g. Docker, Kubernetes)
Familiarity with infrastructure as code tools (e.g. Terraform, Ansible)
Experience building monitoring solutions with Prometheus, Grafana, or ELK
Interest in or exposure to machine learning-driven data systems
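For the Kafka item above, one plausible shape for such a pipeline is a small consumer bridging a topic into Elasticsearch. The topic, servers, and index below are hypothetical, and the `document=` keyword assumes the 8.x Python client.

```python
# Hedged sketch of a Kafka-to-Elasticsearch bridge using the
# kafka-python package. Topic, servers and index are hypothetical;
# error handling is deliberately elided.
import json

from elasticsearch import Elasticsearch
from kafka import KafkaConsumer

es = Elasticsearch("https://localhost:9200")
consumer = KafkaConsumer(
    "events-demo",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    # Index each consumed event; production code would batch via
    # helpers.bulk and handle retries/dead-lettering.
    es.index(index="events-demo", document=message.value)
```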
