Summary: The Data Engineer role at Derisk360 involves designing and developing end-to-end data pipelines on Google Cloud Platform (GCP) using graph-based architectures. The position requires extensive data engineering experience, particularly with Neo4j, and focuses on building scalable solutions for enterprise clients across EMEA. Candidates should have strong technical skills in SQL, Python, and core GCP services, along with relevant certifications. This is a remote/hybrid position based in the UK, in the London area.
Salary (Rate): undetermined
City: London Area
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
We’re Hiring: Data Engineer
Location: UK (Flexible time zone, remote/hybrid available)
Experience: 8+ years
Contract: 6 Months
Language Requirement: Fluent in English
Certification Required: Google Professional Data Engineer or equivalent
Are you an experienced Data Engineer passionate about modern data platforms and graph-based architectures? Join Derisk360 to help enterprise clients across EMEA design intelligent, scalable solutions on GCP that unlock value from complex, connected datasets.
What You’ll Do:
- Architect and develop end-to-end data pipelines on Google Cloud Platform (GCP), integrating structured, semi-structured, and unstructured data sources.
- Design and implement advanced Graph Database solutions using Neo4j, Cypher queries, and GCP-native integrations.
- Create ETL/ELT workflows leveraging GCP services including Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
- Model real-world use cases in Neo4j such as fraud detection, knowledge graphs, and network analysis.
- Optimize graph database performance, ensure query scalability, and maintain system efficiency.
- Manage ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Collaborate with cross-functional teams and clients across diverse EMEA time zones and domains.
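To give a flavour of the pipeline work described above, here is a minimal, self-contained sketch of an ELT-style transform: semi-structured JSON events are flattened into rows ready for a BigQuery load. The field names and sample records are invented for illustration; in practice this logic would run inside a Dataflow/Apache Beam job rather than a plain script.

```python
import json

def transform(record: dict) -> dict:
    """Flatten a semi-structured event into a BigQuery-ready row.
    Field names here are illustrative, not taken from the posting."""
    return {
        "event_id": record["id"],
        "user_id": record.get("user", {}).get("id"),
        "amount": float(record.get("amount", 0)),  # default 0 when absent
    }

# Newline-delimited JSON, as you might pull from Pub/Sub or Cloud Storage.
raw = [
    '{"id": "e1", "user": {"id": "u1"}, "amount": "12.50"}',
    '{"id": "e2", "user": {"id": "u2"}}',
]
rows = [transform(json.loads(line)) for line in raw]
print(rows[0]["amount"])  # 12.5
```

The same transform function could be dropped into a `beam.Map` step, which is one reason keeping transforms pure and testable pays off in GCP pipeline work.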
What You Bring:
- 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
- Proficiency in SQL, Python, and Cypher query language.
- Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Expertise in graph theory, graph schema modeling, and data relationship mapping.
- Bachelor’s degree in Computer Science, Engineering, or a related field.
- Fluent in English.
- Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
- Nice to have: 8–10+ years of overall experience.
- Experience with open-source Graph DB tools and ingestion/optimization techniques.
- Familiarity with Graph Data Science libraries in Neo4j.
- Understanding of data architecture principles, data mesh, and distributed processing.
- Prior experience in customer-facing roles or professional services.
- Awareness of GDPR, data security, and regional data residency standards.
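For candidates newer to the graph side of the stack, the kind of relationship query this role centres on can be sketched in plain Python. The Cypher in the comment and the Account/SENT schema are hypothetical; the stdlib code below performs the equivalent 3-cycle search (a classic fraud-ring pattern) on an in-memory edge set.

```python
# In Neo4j, a fraud-ring query of this shape might read (hypothetical schema):
#   MATCH (a:Account)-[:SENT]->(b:Account)-[:SENT]->(c:Account)-[:SENT]->(a)
#   RETURN a, b, c
# Below: the same directed 3-cycle check over a tiny in-memory edge set.

edges = {("a", "b"), ("b", "c"), ("c", "a"), ("a", "d")}

def three_cycles(edges):
    """Return directed 3-cycles, one canonical triple per cycle."""
    nodes = {n for edge in edges for n in edge}
    return sorted(
        (a, b, c)
        for a in nodes for b in nodes for c in nodes
        if (a, b) in edges and (b, c) in edges and (c, a) in edges
        and a < b and a < c  # canonical rotation: smallest node first
    )

print(three_cycles(edges))  # [('a', 'b', 'c')]
```

A brute-force triple loop like this is cubic in the node count; the point of a graph database is that the same pattern is expressed declaratively in Cypher and executed against indexed relationships at scale.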
What You’ll Get:
- Lead the design of mission-critical data platforms for clients across EMEA.
- Work on cutting-edge graph-based use cases including fraud detection and knowledge graphs.
- Hands-on experience with modern cloud-native and open-source technologies.
- Join a culture of innovation, engineering excellence, and continuous learning.