Rate: Negotiable
Location: United Kingdom (Remote)
Summary: The Data Engineer role focuses on leveraging GCP services to design intelligent data platforms for EMEA clients, with a strong emphasis on Graph Databases, particularly Neo4j. The position requires hands-on experience in data ingestion and optimization techniques to support complex data use cases. The role is contract-based and remote, with an initial duration of 6 months and the possibility of extension. Candidates should possess a solid background in data engineering and relevant certifications.
Salary (Rate): undetermined
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Job Title: Data Engineer - GCP
Location: UK (Remote)
Duration: 6 Months (Extendable)
Employment Type: Contract B2B
Roles & Responsibilities:
We are seeking a highly skilled Data Engineer with hands-on experience in Graph Databases (Neo4j) and modern data ingestion and optimization techniques. You will help EMEA clients design intelligent data platforms leveraging GCP services to support complex, connected data use cases and drive performance at scale.
- Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
- Design, implement, and optimize Graph Database solutions using Neo4j, Cypher queries, and GCP integrations.
- Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
- Design graph models for real-world applications such as fraud detection, network analysis, and knowledge graphs.
- Optimize performance of graph queries and design for scalability.
- Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Collaborate with cross-functional teams and clients across diverse EMEA time zones and project settings.
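To illustrate the graph-modeling responsibility above (fraud detection over connected accounts), here is a minimal, self-contained Python sketch. It is a toy example, not project code: the account IDs and the `linked_accounts` helper are hypothetical, and edges stand in for links such as shared devices or payments.

```python
from collections import deque

def linked_accounts(edges, flagged, max_hops=2):
    """Breadth-first search: accounts within max_hops of a flagged account."""
    # Build an undirected adjacency list from the edge pairs.
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    seen = {flagged: 0}
    queue = deque([flagged])
    while queue:
        node = queue.popleft()
        if seen[node] == max_hops:
            continue  # do not expand beyond the hop limit
        for neighbour in graph.get(node, ()):
            if neighbour not in seen:
                seen[neighbour] = seen[node] + 1
                queue.append(neighbour)
    seen.pop(flagged)
    return seen  # account -> hop distance from the flagged account

edges = [("acct1", "acct2"), ("acct2", "acct3"), ("acct3", "acct4")]
print(linked_accounts(edges, "acct1"))  # {'acct2': 1, 'acct3': 2}
```

In Neo4j, the same neighbourhood query would typically be expressed as a variable-length Cypher pattern, e.g. `MATCH (a {id: $id})-[*1..2]-(b) RETURN DISTINCT b`.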
Minimum Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
- Proficiency in SQL, Python, and Cypher query language.
- Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Knowledge of graph theory, graph schema modeling, and data relationship mapping.
- Fluent in English.
- Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
Preferred Qualifications:
- 8 to 10+ years of overall experience.
- Experience building open-source graph database solutions (e.g., Neo4j), including data ingestion and optimization techniques.
- Familiarity with Graph Data Science libraries in Neo4j.
- Understanding of data architecture principles, data mesh, and distributed processing.
- Prior experience in customer-facing roles or professional services.
- Background in data security, compliance (e.g., GDPR), and regional data residency awareness.