Summary: The role of Sr Cloud Data Architect involves leading large-scale data migration programs and designing automated data pipelines using Google Cloud's advanced data services. The ideal candidate will engage deeply with enterprise customers to drive transformation and ensure data quality and governance. This position requires extensive experience in data architecture and engineering, particularly within cloud environments. The role is remote and offers a contract duration of over 10 months.
Salary (Rate): negotiable
City: undetermined
Country: United Kingdom
Working Arrangements: remote
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Greetings from iXceed Solutions. We have an opening for a Sr Cloud Data Architect (UK, remote). If you are interested in this role, please share your updated resume.
Position: Sr Cloud Data Architect
Location: UK REMOTE
Duration: 10+ months contract
Job Description
About the Role
Our client is looking for a visionary, technically proficient Senior Cloud Data Architect. The ideal candidate will have extensive experience leading large-scale data migration programs and designing automated, production-grade data pipelines across industries. This strategic role involves deep architectural engagement with enterprise customers, driving transformation through Google Cloud's advanced data services such as Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
Key Responsibilities
- Provide technical leadership and strategic direction for enterprise-scale data migration and modernization initiatives.
- Architect end-to-end data platforms using Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
- Define and implement real-time and batch processing pipelines for complex use cases involving streaming analytics, ML feature engineering, and automation (a minimal streaming-pipeline sketch follows this list).
- Act as a trusted advisor to senior technical and business stakeholders across industries (e.g., telecom, retail, financial services).
- Drive data quality, governance, lineage, and security standards across enterprise data pipelines.
- Mentor engineering teams and lead best practice adoption across data architecture, orchestration, and DevOps tooling.
- Participate in technical workshops, executive briefings, and architecture reviews to evangelize GCP data capabilities.
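To give a concrete flavor of the real-time pipeline work described above, here is a minimal sketch of a streaming Dataflow pipeline in Python (Apache Beam) that reads JSON events from Pub/Sub and appends them to BigQuery. The project, topic, and table names are placeholders, and the target table is assumed to already exist with a schema matching the parsed events.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def run():
    # Streaming mode is required for the unbounded Pub/Sub source.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            # Hypothetical topic; substitute a real projects/<id>/topics/<name>.
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                topic="projects/my-project/topics/events")
            # Pub/Sub delivers bytes; decode and parse one JSON event per message.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            # Append into an existing table, so no schema is supplied here.
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
        )


if __name__ == "__main__":
    run()
```

By default this runs on the local DirectRunner; pointing it at Dataflow is a matter of supplying the runner, project, and region through PipelineOptions.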
Required Qualifications
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or related technical field.
- 12+ years of experience in data architecture and data engineering, with proven leadership of large-scale cloud data programs.
- 8+ years of hands-on experience as a Data Engineer, including at least 3 years working with Google Cloud Platform (GCP) data services.
- Strong proficiency in SQL and experience with schema design and query optimization for large datasets.
- Expertise in BigQuery, including advanced SQL, partitioning, clustering, and performance tuning (see the sketch after this list).
- Hands-on experience with at least one of the following GCP data processing services: Dataflow (Apache Beam), Dataproc (Apache Spark/Hadoop), or Composer (Apache Airflow).
- Proficiency in at least one scripting/programming language (e.g., Python, Java, Scala) for data manipulation and pipeline development; Scala may be mandated for some engagements.
- Understanding of data warehousing and data lake concepts and best practices.
- Experience with version control systems (e.g., Git).
- 5+ years of advanced expertise in Google Cloud data services: Dataproc, Dataflow, Pub/Sub, BigQuery, Cloud Spanner, and Bigtable.
- Hands-on experience with orchestration tools like Apache Airflow or Cloud Composer.
- Deep understanding of data lakehouse design, event-driven architecture, and hybrid cloud data strategies.
- Track record of success advising C-level executives and aligning technical solutions with business goals.
- Google Professional Data Engineer certification (Mandatory).
- Google Professional Cloud Architect certification or equivalent.
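As an illustration of the BigQuery partitioning and clustering expertise called out above, here is a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and column names are hypothetical.

```python
from google.cloud import bigquery

# Hypothetical project; credentials come from the environment
# (e.g., GOOGLE_APPLICATION_CREDENTIALS or gcloud auth).
client = bigquery.Client(project="my-project")

# Partition by the event date and cluster by customer_id so that
# date-bounded, per-customer queries scan only the relevant blocks.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
  event_id    STRING,
  customer_id STRING,
  event_ts    TIMESTAMP,
  payload     JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY customer_id
"""
client.query(ddl).result()  # result() blocks until the DDL job completes
```

Queries that filter on DATE(event_ts) can then prune partitions, and clustering on customer_id narrows the blocks scanned within each partition, which is typically the first lever for performance tuning on large tables.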
Preferred Qualifications
- Experience with modernization of on-premise and mainframe data environments into cloud-native architectures.
- Knowledge of regulatory data requirements across financial, healthcare, and telecom sectors.
- Familiarity with IaC (Terraform), GitOps, and CI/CD for data pipeline deployment (see the DAG sketch after this list).
- Strong communication, stakeholder management, and mentoring skills.
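To illustrate the orchestration and CI/CD points above, here is a minimal Cloud Composer (Apache Airflow) DAG sketch of the kind a GitOps pipeline would keep in version control and sync to the environment's DAGs bucket. The DAG id, schedule, and query are placeholders, and Airflow 2.4+ argument names are assumed.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

# A daily roll-up job; deploying it is just shipping this file to the
# Composer DAGs bucket from CI/CD.
with DAG(
    dag_id="daily_sales_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # `schedule_interval` on Airflow versions before 2.4
    catchup=False,
) as dag:
    BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                # Placeholder query; a real job would aggregate into a
                # partitioned reporting table.
                "query": "SELECT CURRENT_DATE() AS run_date",
                "useLegacySql": False,
            }
        },
    )
```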
Thanks and Regards,
Ajay Thakur
Account Manager
Direct: +44 2080950378
Email: ajayt@ixceed-solutions.com
https://www.linkedin.com/in/a-j-thakur/