Google Cloud Platform Data Architect - Remote - Contract

Posted 1 day ago

Negotiable
Outside
Remote
USA

Summary: The Google Cloud Platform Data Architect role focuses on designing, implementing, and managing data infrastructure with an emphasis on Google Cloud's Vertex AI capabilities. This position aims to enhance search experiences and AI applications to support business growth. The architect will collaborate with teams to develop data solutions that integrate advanced AI and machine learning functionalities. The role is remote, allowing flexibility in work arrangements.

Key Responsibilities:

  • Architecture Design: Design and architect scalable, secure, and cost-effective data solutions on Google Cloud Platform that support both traditional analytics and advanced AI/ML workloads.
  • Vertex AI Search Implementation: Lead the design and implementation of enterprise-grade search experiences using Vertex AI Search across websites, intranets, and RAG systems for generative AI applications (see the search sketch after this list).
  • Data Pipeline Development: Design and build robust, end-to-end data pipelines that feed high-quality data to the Vertex AI systems using services like BigQuery, Cloud Storage, Dataflow, and Pub/Sub.
  • Generative AI Integration: Collaborate with data science and engineering teams to translate business requirements into AI-based solutions, including building and deploying generative AI models within the Vertex AI framework.
  • Vector Search Expertise: Implement vector search and embeddings for semantic search and recommendation systems, organizing data by meaning to provide highly relevant results in milliseconds (see the embeddings sketch after this list).
  • Data Governance & Security: Establish data policies, standards, and security protocols to ensure data accuracy, accessibility, security, and compliance with industry regulations within the Google Cloud Platform environment.
  • Performance Optimization: Monitor and optimize the performance of data infrastructure and AI search systems, troubleshooting complex technical issues as they arise.
  • Technical Leadership & Collaboration: Provide technical leadership and documentation to cross-functional teams and stakeholders, ensuring alignment between data strategies and organizational goals.
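
For illustration, the Vertex AI Search work above typically boils down to queries against an existing data store, along the lines of this minimal Python sketch using the google-cloud-discoveryengine client. The project ID, location, data store ID, and serving-config path are placeholder assumptions and will differ per deployment.

    # Minimal sketch: query an existing Vertex AI Search data store.
    # Assumes the google-cloud-discoveryengine client library is installed and a
    # data store has already been created; all IDs below are placeholders.
    from google.cloud import discoveryengine_v1 as discoveryengine

    PROJECT_ID = "my-project"        # placeholder
    LOCATION = "global"              # location used when the data store was created
    DATA_STORE_ID = "my-data-store"  # placeholder

    def search(query: str, page_size: int = 5) -> None:
        client = discoveryengine.SearchServiceClient()
        # Resource name of the data store's serving config (exact path may vary).
        serving_config = (
            f"projects/{PROJECT_ID}/locations/{LOCATION}/collections/default_collection"
            f"/dataStores/{DATA_STORE_ID}/servingConfigs/default_config"
        )
        request = discoveryengine.SearchRequest(
            serving_config=serving_config,
            query=query,
            page_size=page_size,
        )
        for result in client.search(request):  # pageable iterator of results
            print(result.document.id)

    if __name__ == "__main__":
        search("quarterly revenue report")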
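
Likewise, the vector search bullet usually means generating embeddings with Vertex AI and querying a deployed Vector Search (Matching Engine) index, roughly as below. The embedding model name, index endpoint, and deployed index ID are assumptions, not fixed requirements.

    # Minimal sketch: semantic lookup via embeddings plus Vertex AI Vector Search.
    # Assumes google-cloud-aiplatform is installed and an index endpoint is
    # already deployed; project, endpoint, and index IDs are placeholders.
    import vertexai
    from vertexai.language_models import TextEmbeddingModel
    from google.cloud import aiplatform

    vertexai.init(project="my-project", location="us-central1")  # placeholders

    # 1. Embed the query text (model name is an assumption; use one available
    #    in your project and region).
    model = TextEmbeddingModel.from_pretrained("text-embedding-004")
    query_vector = model.get_embeddings(["trail running shoes"])[0].values

    # 2. Find nearest neighbours by meaning in a deployed Vector Search index.
    endpoint = aiplatform.MatchingEngineIndexEndpoint(
        index_endpoint_name="projects/123/locations/us-central1/indexEndpoints/456"
    )
    neighbors = endpoint.find_neighbors(
        deployed_index_id="products_deployed_index",  # placeholder
        queries=[query_vector],
        num_neighbors=10,
    )
    for match in neighbors[0]:
        print(match.id, match.distance)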

Key Skills:

  • 10+ years of industry experience designing and managing data solutions.
  • Proven experience with Google Cloud Platform services (BigQuery, Dataflow, Cloud Storage, etc.).
  • Hands-on experience with Vertex AI, specifically implementing Vertex AI Search and Vector Search.
  • Experience designing and building data architectures that support AI and ML applications.
  • Proficiency in programming languages such as Python, SQL, and Java.
  • Deep understanding of data modeling, data warehousing, data lakes, and ETL processes.
  • Familiarity with ML operations (MLOps) practices and tools for model lifecycle management.
  • Knowledge of networking, cybersecurity fundamentals, and compliance standards.

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Role: Google Cloud Platform Data Architect

Remote

Detailed JD:

We are seeking a highly skilled Google Cloud Platform Data Architect to design, implement, and manage our organization's data infrastructure, with a specific focus on leveraging Google Cloud's Vertex AI Search and Agent Builder capabilities. This role will bridge the gap between data management practices and cutting-edge AI applications, enabling enhanced search experiences, RAG systems, and AI-powered recommendations to drive business growth.

Experience:

    • 10+ years of industry experience designing and managing data solutions.
    • Proven experience with Google Cloud Platform (GCP) services (BigQuery, Dataflow, Cloud Storage, etc.).
    • Hands-on experience with Vertex AI, specifically implementing Vertex AI Search and Vector Search.
    • Experience designing and building data architectures that support AI and ML applications.
    • Proficiency in programming languages such as Python, SQL, and Java.

Key Responsibilities:

  • Architecture Design: Design and architect scalable, secure, and cost-effective data solutions on Google Cloud Platform (GCP) that support both traditional analytics and advanced AI/ML workloads.
  • Vertex AI Search Implementation: Lead the design and implementation of enterprise-grade search experiences using Vertex AI Search (part of Vertex AI Agent Builder) across websites, intranets, and RAG systems for generative AI applications.
  • Data Pipeline Development: Design and build robust, end-to-end data pipelines (ingestion, transformation, storage) that feed high-quality data to the Vertex AI systems using services like BigQuery, Cloud Storage, Dataflow, and Pub/Sub (see the pipeline sketch after this list).
  • Generative AI Integration: Collaborate with data science and engineering teams to translate business requirements into AI-based solutions, including building and deploying generative AI models (e.g., for chatbots, content creation) within the Vertex AI framework (see the generative AI sketch after this list).
  • Vector Search Expertise: Implement vector search and embeddings for semantic search and recommendation systems, organizing data by meaning to provide highly relevant results in milliseconds.
  • Data Governance & Security: Establish data policies, standards, and security protocols to ensure data accuracy, accessibility, security, and compliance with industry regulations within the Google Cloud Platform environment.
  • Performance Optimization: Monitor and optimize the performance of data infrastructure and AI search systems, troubleshooting complex technical issues as they arise.
  • Technical Leadership & Collaboration: Provide technical leadership and documentation (e.g., architecture diagrams, data models) to cross-functional teams and stakeholders, ensuring alignment between data strategies and organizational goals.
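
As an illustrative (not prescriptive) shape for the pipeline work above, here is a minimal Apache Beam sketch that streams messages from Pub/Sub into BigQuery and can run on Dataflow. The subscription, table, and schema are placeholder assumptions, and a production pipeline would add validation and dead-letter handling.

    # Minimal sketch: streaming ingestion Pub/Sub -> transform -> BigQuery,
    # runnable on Dataflow. All resource names and the schema are placeholders.
    import json
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions

    def run() -> None:
        options = PipelineOptions()  # pass --runner=DataflowRunner, --project, etc.
        options.view_as(StandardOptions).streaming = True

        with beam.Pipeline(options=options) as pipeline:
            (
                pipeline
                | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                    subscription="projects/my-project/subscriptions/events-sub")
                | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
                | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                    table="my-project:analytics.events",
                    schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                )
            )

    if __name__ == "__main__":
        run()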
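
For the generative AI integration bullet, a minimal RAG-style sketch that passes retrieved snippets (for example, top Vertex AI Search results) to a Gemini model on Vertex AI; the project settings and model name are assumptions.

    # Minimal sketch: answer a question grounded in retrieved context using a
    # Gemini model on Vertex AI. Project, location, and model name are placeholders.
    import vertexai
    from vertexai.generative_models import GenerativeModel

    vertexai.init(project="my-project", location="us-central1")  # placeholders

    def answer_with_context(question: str, snippets: list[str]) -> str:
        model = GenerativeModel("gemini-1.5-pro")  # choose a model available to you
        context = "\n\n".join(snippets)  # e.g. top results from Vertex AI Search
        prompt = (
            "Answer the question using only the context below.\n\n"
            f"Context:\n{context}\n\nQuestion: {question}"
        )
        response = model.generate_content(prompt)
        return response.text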

Technical Knowledge:

    • Deep understanding of data modeling, data warehousing, data lakes, and ETL processes.
    • Familiarity with ML operations (MLOps) practices and tools for model lifecycle management (training, evaluation, deployment, monitoring).
    • Knowledge of networking, cybersecurity fundamentals, and compliance standards.