Summary: HCL is looking for a Data Engineer with expertise in Graph Databases, specifically Neo4j, to design and implement data platforms for EMEA clients using GCP services. The role involves building data pipelines, optimizing graph database solutions, and developing ETL workflows. Candidates should have significant experience in data engineering and a strong understanding of data architecture principles. A Google Professional Data Engineer certification or equivalent is required.
Salary (Rate): Negotiable
City: England
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Job Description: Data Engineer
Location: UKI (UK & Ireland)
Language Requirement: Fluent in English
Certification Required: Google Professional Data Engineer or equivalent
About the Role
HCL is seeking a highly skilled Data Engineer with hands-on experience in Graph Databases (Neo4j) and modern data ingestion and optimization techniques. You will help EMEA clients design intelligent data platforms leveraging GCP services to support complex, connected data use cases and drive performance at scale.
Key Responsibilities
- Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
- Design, implement, and optimize Graph Database solutions using Neo4j, Cypher queries, and GCP integrations.
- Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage (an illustrative pipeline sketch follows this list).
- Design graph models for real-world applications such as fraud detection, network analysis, and knowledge graphs.
- Optimize performance of graph queries and design for scalability.
- Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Work across functional teams and clients in diverse EMEA time zones and project settings.
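For illustration only, and not a prescribed implementation: a minimal Apache Beam pipeline of the kind these responsibilities describe might stream events from Pub/Sub into BigQuery when run on Dataflow. The project, subscription, and table names below are hypothetical placeholders.

```python
# Minimal sketch of a streaming ETL pipeline (Pub/Sub -> BigQuery) runnable on Dataflow.
# All resource names are hypothetical; pass --runner=DataflowRunner and GCP options at launch.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/example-project/subscriptions/events-sub")
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "example-project:analytics.events",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER)
        )


if __name__ == "__main__":
    run()
```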
Minimum Qualifications
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
- Proficiency in SQL, Python, and Cypher query language (see the illustrative Cypher sketch after this list).
- Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Knowledge of graph theory, graph schema modeling, and data relationship mapping.
- Fluent in English.
- Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
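As an illustrative sketch of the SQL/Python/Cypher proficiency listed above (not part of the posting itself): the snippet below uses the official Neo4j Python driver to run a fraud-detection-style pattern query. The Account/Device schema, connection URI, and credentials are assumptions made purely for the example.

```python
# Hypothetical example: find accounts that share a device with an already-flagged account.
# Schema, URI, and credentials are placeholders, not requirements of this role.
from neo4j import GraphDatabase

SHARED_DEVICE_QUERY = """
MATCH (a:Account)-[:USED]->(d:Device)<-[:USED]-(b:Account {flagged: true})
WHERE a <> b
RETURN a.id AS account_id, d.id AS device_id
LIMIT 25
"""


def find_suspicious_accounts(uri="bolt://localhost:7687", user="neo4j", password="secret"):
    driver = GraphDatabase.driver(uri, auth=(user, password))
    try:
        with driver.session() as session:
            return [record.data() for record in session.run(SHARED_DEVICE_QUERY)]
    finally:
        driver.close()


if __name__ == "__main__":
    for row in find_suspicious_accounts():
        print(row)
```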
Preferred Qualifications
- 8 to 10+ years of overall experience.
- Experience building solutions with open-source graph databases (e.g., Neo4j), including data ingestion and optimization techniques.
- Familiarity with Graph Data Science libraries in Neo4j (see the sketch after this list).
- Understanding of data architecture principles, data mesh, and distributed processing.
- Prior experience in customer-facing roles or professional services.
- Background in data security, compliance (e.g., GDPR), and regional data residency awareness.
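To illustrate the Graph Data Science point above: a minimal sketch, assuming the Neo4j GDS plugin is installed and reusing the hypothetical Account/Device schema from the previous example, might project an in-memory graph and stream PageRank scores.

```python
# Hypothetical GDS example: project Account/Device nodes with USED relationships,
# stream PageRank scores, then drop the in-memory projection. Names are placeholders.
from neo4j import GraphDatabase

PROJECT = "CALL gds.graph.project('deviceGraph', ['Account', 'Device'], 'USED')"
PAGERANK = """
CALL gds.pageRank.stream('deviceGraph')
YIELD nodeId, score
RETURN gds.util.asNode(nodeId).id AS id, score
ORDER BY score DESC LIMIT 10
"""
DROP = "CALL gds.graph.drop('deviceGraph')"


def top_pagerank(uri="bolt://localhost:7687", user="neo4j", password="secret"):
    driver = GraphDatabase.driver(uri, auth=(user, password))
    try:
        with driver.session() as session:
            session.run(PROJECT).consume()  # build the in-memory graph projection
            rows = [r.data() for r in session.run(PAGERANK)]
            session.run(DROP).consume()     # release the projection
            return rows
    finally:
        driver.close()


if __name__ == "__main__":
    for row in top_pagerank():
        print(row)
```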