Summary: The AI Architect role focuses on multi-cloud AI architecture design across various platforms, including IDP, GCP, and Azure. The position requires expertise in performance and cost optimization for AI workloads, particularly in GCP, and involves defining MLOps standards for scalable deployment. The architect will also address cross-platform design constraints related to connectivity, security, and data access. Overall, the role emphasizes creating viable production options for diverse AI use cases.
Key Responsibilities:
- Design multi-cloud AI architecture across IDP, GCP, Azure, and other platforms.
- Articulate production options and trade-offs for AI and GenAI use cases.
- Implement cross-platform constraint-aware design considering connectivity, security, and data access.
- Optimize performance and costs for GCP-hosted AI and GenAI workloads.
- Define and embed end-to-end MLOps standards for deployment at scale.
Key Skills:
- Experience in multi-cloud AI architecture design.
- Expertise in GCP, Azure, and hybrid integration patterns.
- Knowledge of performance and cost optimization for AI workloads.
- Understanding of MLOps standards and CI/CD processes.
- Ability to address cross-platform design constraints.
Salary (Rate): £480.00/day
City: Leeds
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Location: Leeds/Halifax
Contract JD:
- Multi-cloud AI architecture design across IDP, GCP, Azure, and other platforms, including hybrid, future-ready integration patterns.
- Clearly articulating viable production options and trade-offs for different AI and GenAI use cases on the platform.
- Experience in designing and right-sizing AI solutions on the platform.
- Cross-platform, constraint-aware design: accounting for connectivity, security, and data-access restrictions on the platform.
- Performance and cost optimization expertise, particularly for GCP-hosted AI and GenAI workloads, balancing scalability, latency, and run-cost efficiency.
- MLOps and AI productionisation support: defining and embedding end-to-end MLOps standards (CI/CD, versioning, observability, automated retraining, and monitoring standards) to enable safe, reliable, and repeatable deployment at scale.