Summary: The GCP Data Engineer role involves architecting and building data pipelines on Google Cloud Platform (GCP) while integrating various data sources. The position requires designing and optimizing graph database solutions, developing ETL/ELT workflows, and supporting large-scale data ingestion. Candidates should possess strong technical skills in data engineering and graph databases, along with experience in collaborative environments across diverse time zones. This is a contract position based in London, UK, with remote working options available.
Key Responsibilities:
- Architect and build data pipelines on GCP integrating structured, semi-structured, and unstructured data sources.
- Design, implement, and optimize Graph Database solutions using Neo4j, Cypher queries, and GCP integrations.
- Develop ETL/ELT workflows using Dataflow, Pub/Sub, BigQuery, and Cloud Storage.
- Design graph models for real-world applications such as fraud detection, network analysis, and knowledge graphs.
- Optimize performance of graph queries and design for scalability.
- Support ingestion of large-scale datasets using Apache Beam, Spark, or Kafka into GCP environments.
- Implement metadata management, security, and data governance using Data Catalog and IAM.
- Work across functional teams and clients in diverse EMEA time zones and project settings.
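To give a feel for the graph-modeling responsibilities above, here is a minimal sketch in plain Python of the kind of relationship mapping used in fraud detection. All node and relationship names are illustrative; in the role itself this would be modeled in Neo4j and queried with Cypher.

```python
from collections import defaultdict

# Toy fraud-detection graph: accounts linked to the devices they log in from.
# Names are hypothetical, not taken from the posting.
edges = [
    ("acct_1", "device_A"),
    ("acct_2", "device_A"),  # acct_1 and acct_2 share device_A
    ("acct_3", "device_B"),
]

# Relationship mapping: each device -> the set of accounts that used it.
device_to_accounts = defaultdict(set)
for account, device in edges:
    device_to_accounts[device].add(account)

# Flag potential fraud rings: devices shared by more than one account.
suspicious = {
    device: sorted(accounts)
    for device, accounts in device_to_accounts.items()
    if len(accounts) > 1
}
print(suspicious)  # {'device_A': ['acct_1', 'acct_2']}
```

The equivalent Cypher pattern would match `(a:Account)-[:USED]->(d:Device)<-[:USED]-(b:Account)` and return devices with multiple distinct accounts.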
Key Skills:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering, including 2+ years with Neo4j or another Graph DB platform.
- Proficiency in SQL, Python, and Cypher query language.
- Strong hands-on experience with BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
- Knowledge of graph theory, graph schema modeling, and data relationship mapping.
- Fluent in English.
- Certification: Google Professional Data Engineer or equivalent (e.g., AWS Big Data, Azure Data Engineer).
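As context for the SQL proficiency and ELT-style workflows listed above, here is a toy example using Python's built-in sqlite3 module as a stand-in for BigQuery (the schema and column names are illustrative assumptions). The ELT pattern loads raw rows first, then transforms them with SQL inside the warehouse.

```python
import sqlite3

# In-memory SQLite stands in for BigQuery; the events schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")

# "L" step: load raw rows as-is.
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("u1", 10.0), ("u1", 5.0), ("u2", 7.5)],
)

# "T" step: transform in-warehouse with SQL (a per-user aggregation).
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', 7.5)]
```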
Salary (Rate): undetermined
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Hi Professionals, Greetings from Ampstek! Hope you are doing well and staying safe. Our client is looking for a GCP Data Engineer to join a high-growth organization. If you are interested, please share your resume with sudhakaran.m@ampstek.com.
Role: GCP Data Engineer
Location: London, UK (Remote)
Duration: Contract
Thanks & Regards,
Sudhakaran | IT Recruiter | Europe & UK
Email - sudhakaran.m@ampstek.com
Tel - +44(20)45150009
Ampstek Services Limited