DevOps Engineer

Posted 1 week ago by IBU Consulting

Negotiable
Undetermined
Hybrid
Milton Keynes, England, United Kingdom

Summary: The role of DevOps Engineer focuses on leveraging Python, GitHub, and DevOps practices to manage and deploy cloud resources using Terraform within a banking domain. The position requires expertise in writing, testing, and maintaining Python code, as well as automating infrastructure provisioning across various environments. The engineer will also be responsible for integrating various AWS services and ensuring secure access to resources. This is a 12-month fixed-term position based in Milton Keynes, UK, with a hybrid working arrangement.

Key Responsibilities:

  • Write, test, and maintain Python code for cloud resource management.
  • Deploy and manage cloud resources using Terraform following Infrastructure as Code (IaC) best practices.
  • Automate provisioning of infrastructure across development, staging, and production environments.
  • Build and configure AWS services including Lambda, API Gateway, and S3 for document processing.
  • Integrate with existing systems and services, ensuring secure access and data handling.
  • Manage VPC Interface Endpoints for secure connections to AWS services.
  • Store metadata of file processing in RDS and handle notifications via SNS (see the sketch after this list).
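
The RDS and SNS responsibilities could look roughly like the following minimal sketch, assuming a MySQL-flavoured RDS instance reached via pymysql, an illustrative file_processing table, and an SNS topic that UiPath subscribes to; none of these names come from the posting.

```python
# Hypothetical sketch: persist per-file processing metadata in RDS (MySQL
# flavour assumed) and notify UiPath via SNS. Table/column names, the topic
# ARN, and connection details are illustrative, not from the posting.
import json
import os

import boto3
import pymysql

sns = boto3.client("sns")

def record_and_notify(file_key: str, status: str) -> None:
    conn = pymysql.connect(
        host=os.environ["RDS_HOST"],
        user=os.environ["RDS_USER"],
        password=os.environ["RDS_PASSWORD"],
        database=os.environ["RDS_DB"],
    )
    try:
        with conn.cursor() as cur:
            # Assumed schema: file_processing(file_key, status, updated_at)
            cur.execute(
                "INSERT INTO file_processing (file_key, status, updated_at) "
                "VALUES (%s, %s, NOW())",
                (file_key, status),
            )
        conn.commit()
    finally:
        conn.close()

    # Tell UiPath (assumed SNS subscriber) that the file changed state.
    sns.publish(
        TopicArn=os.environ["UIPATH_TOPIC_ARN"],
        Message=json.dumps({"file": file_key, "status": status}),
    )
```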

Key Skills:

  • Proficient in Python programming.
  • Experience with GitHub and DevOps methodologies.
  • Strong knowledge of AWS services, including Lambda, S3, RDS, and VPC.
  • Expertise in Terraform for infrastructure management.
  • Understanding of Infrastructure as Code (IaC) best practices.
  • Experience with document processing and automation tools.
  • Ability to manage and secure cloud resources effectively.

Salary (Rate): undetermined

City: Milton Keynes

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Client Domain: Banking
Employment Type: 12-month fixed term
Job Location: Milton Keynes, UK
Working Arrangements: 3 days work from office (hybrid)
Start Date: ASAP

Key search: Python along with GitHub & DevOps, AWS Terraform.

Requirements and scope of work:

  • Good experience and expertise in Python along with GitHub and DevOps; must have experience writing, testing, and maintaining Python code.
  • Deploy and manage cloud resources using Terraform, following the Infrastructure as Code (IaC) best practices already established in the Project Max AWS account.
  • Automate provisioning of infrastructure across environments (dev, staging, prod) using the existing Terraform pipeline and new pipelines. The existing VPC and subnet will be used, and the GitHub repo has already been created.
  • Build the approved Lambda authorizer infrastructure with NLB, API Gateway, Lambda, and S3 for the authorizer configuration. A lambda.zip file for the authorizer configuration is already available and provided by the group. The implementation will follow a path similar to the PoC ([Identity-Preprod] APIGateway - Lambda POC - Container Service Infrastructure (CSI) - Confluence).
  • HashiCorp Vault will be used instead of Secrets Manager; the HashiCorp integration with Project Max already exists. AWS Ping will be used for authorization; its integration with Project Max is already present.
  • A Lambda service is to be created to provide pre-signed URLs for S3 (both the request and response buckets). Using those pre-signed URLs, documents will be uploaded and downloaded by UiPath.
  • A Lambda service (doc pre-processing) should be triggered once documents are placed in the request S3 bucket. It should split the document files into pages, convert them to base64, and trigger a Step Function.
  • The first Lambda (OCR) in the Step Function will call the document-processing API in GenAI Hub with an access token fetched from HashiCorp Vault. The base64 version of the document is passed in the payload, and the service is called in a loop until all pages are processed. The GenAI Hub API returns its responses in JSON format, and the Lambda correlates all of them.
  • A Lambda service (Doc Validation) will then take the response from the previous call and send a JSON request to the FM (Foundation Model) API in GenAI Hub, which uses GPT for validation. The prompt and the JSON are part of the payload. Based on the JSON response, it will store the result in the response S3 bucket in the agreed format.
  • If doc validation is successful, the next Lambda (Budget Planner) will create a pre-defined Excel file based on custom calculations and the earlier responses (OCR and Doc Validation). It will also store the file in the response S3 bucket and send a notification via SNS to inform UiPath to download the Excel and response files.
  • Any failure in any step of the Step Function must be reported to UiPath via SNS.
  • The metadata of each file-processing run should be stored in RDS.
  • Set up and manage VPC Interface Endpoints (AWS PrivateLink) to securely connect to AWS services from within the VPC; S3, RDS, and SNS will be accessed via VPC endpoints, and existing VPC endpoints can also be reused. Ensure secure access to S3 using IAM, bucket policies, and VPC endpoints.

Illustrative sketches of several of these steps follow below.
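
A minimal sketch of the pre-signed URL Lambda described above, assuming bucket names arrive through environment variables and the caller supplies the object key and transfer direction; the event shape and variable names are assumptions, not part of the posting.

```python
# Sketch of the pre-signed URL Lambda. Bucket names, environment variables,
# and the event shape are assumed for illustration.
import os

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    key = event["key"]
    if event["direction"] == "upload":
        # UiPath PUTs the source document into the request bucket.
        url = s3.generate_presigned_url(
            "put_object",
            Params={"Bucket": os.environ["REQUEST_BUCKET"], "Key": key},
            ExpiresIn=900,  # 15 minutes
        )
    else:
        # UiPath GETs results from the response bucket.
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": os.environ["RESPONSE_BUCKET"], "Key": key},
            ExpiresIn=900,
        )
    return {"url": url}
```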
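
The doc pre-processing step might be sketched as below, assuming PDF inputs, the pypdf library for page splitting, and a state-machine ARN in an environment variable; the posting names none of these. In practice, Step Functions' 256 KB execution-input limit may force the encoded pages through S3 rather than the execution input.

```python
# Sketch of the doc pre-processing Lambda: triggered by an S3 event on the
# request bucket, splits a PDF into single pages, base64-encodes each page,
# and starts the Step Function. pypdf and the ARN variable are assumptions.
import base64
import io
import json
import os

import boto3
from pypdf import PdfReader, PdfWriter

s3 = boto3.client("s3")
sfn = boto3.client("stepfunctions")

def handler(event, context):
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    reader = PdfReader(io.BytesIO(body))

    pages = []
    for page in reader.pages:
        writer = PdfWriter()
        writer.add_page(page)
        buf = io.BytesIO()
        writer.write(buf)  # one single-page PDF per source page
        pages.append(base64.b64encode(buf.getvalue()).decode("ascii"))

    # Hand the base64 pages to the OCR -> validation -> budget pipeline.
    sfn.start_execution(
        stateMachineArn=os.environ["STATE_MACHINE_ARN"],
        input=json.dumps({"source_key": key, "pages": pages}),
    )
```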
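
A sketch of the OCR Lambda under stated assumptions: the hvac client for HashiCorp Vault, an assumed KV v2 secret path, and a hypothetical GenAI Hub endpoint URL. The real Vault paths and API contract belong to the Project Max integration and are not public.

```python
# Sketch of the OCR Lambda. The Vault path, token source, and the GenAI Hub
# OCR endpoint are hypothetical placeholders.
import os

import hvac
import requests

def handler(event, context):
    vault = hvac.Client(url=os.environ["VAULT_ADDR"], token=os.environ["VAULT_TOKEN"])
    secret = vault.secrets.kv.v2.read_secret_version(path="genai-hub")  # assumed path
    token = secret["data"]["data"]["access_token"]

    results = []
    # Call the document-processing API once per page until all are processed.
    for page_b64 in event["pages"]:
        resp = requests.post(
            os.environ["GENAI_HUB_OCR_URL"],  # hypothetical endpoint
            headers={"Authorization": f"Bearer {token}"},
            json={"document": page_b64},
            timeout=60,
        )
        resp.raise_for_status()
        results.append(resp.json())

    # Correlate per-page JSON responses into one payload for the next state.
    return {"source_key": event["source_key"], "token": token, "ocr": results}
```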
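
The Doc Validation step could look like this sketch, with a hypothetical FM API endpoint, a placeholder prompt, and an assumed key layout in the response bucket; passing the access token along in the state is also an assumption about the state machine's data flow.

```python
# Sketch of the Doc Validation Lambda: posts the correlated OCR output plus a
# prompt to the FM (Foundation Model) API in GenAI Hub, then writes the
# validation result to the response bucket. Endpoint, prompt, and key layout
# are placeholders.
import json
import os

import boto3
import requests

s3 = boto3.client("s3")

def handler(event, context):
    resp = requests.post(
        os.environ["GENAI_HUB_FM_URL"],  # hypothetical endpoint
        headers={"Authorization": f"Bearer {event['token']}"},
        json={
            "prompt": "Validate the extracted fields...",  # stand-in prompt
            "document": event["ocr"],
        },
        timeout=60,
    )
    resp.raise_for_status()
    verdict = resp.json()

    # Store the validation response in the response bucket (agreed format).
    s3.put_object(
        Bucket=os.environ["RESPONSE_BUCKET"],
        Key=f"{event['source_key']}/validation.json",
        Body=json.dumps(verdict).encode("utf-8"),
    )
    return {"source_key": event["source_key"], "validation": verdict}
```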
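
The Budget Planner step, sketched with openpyxl; the sheet layout and the "custom calculation" are placeholders, since only the overall flow (build the Excel, store it in the response bucket, notify UiPath via SNS) comes from the posting.

```python
# Sketch of the Budget Planner Lambda. Sheet layout, calculations, and the
# SNS topic ARN are illustrative assumptions.
import io
import json
import os

import boto3
from openpyxl import Workbook

s3 = boto3.client("s3")
sns = boto3.client("sns")

def handler(event, context):
    wb = Workbook()
    ws = wb.active
    ws.append(["field", "value"])  # placeholder layout
    for field, value in event["validation"].items():
        ws.append([field, value])  # real version applies custom calculations

    buf = io.BytesIO()
    wb.save(buf)
    excel_key = f"{event['source_key']}/budget.xlsx"
    s3.put_object(
        Bucket=os.environ["RESPONSE_BUCKET"],
        Key=excel_key,
        Body=buf.getvalue(),
    )

    # Tell UiPath the Excel and response files are ready to download.
    sns.publish(
        TopicArn=os.environ["UIPATH_TOPIC_ARN"],
        Message=json.dumps({"source_key": event["source_key"], "excel": excel_key}),
    )
```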
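
Per the posting, the VPC Interface Endpoints are Terraform-managed; purely to illustrate what gets provisioned, a boto3 equivalent for a single endpoint might look like the sketch below, with placeholder IDs, region, and service name.

```python
# Illustrative only: in the project this resource is created via Terraform.
# All identifiers below are placeholders.
import boto3

ec2 = boto3.client("ec2")

endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",                # AWS PrivateLink
    VpcId="vpc-0123456789abcdef0",              # existing VPC (placeholder)
    ServiceName="com.amazonaws.eu-west-2.sns",  # likewise for S3 and RDS
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,
)
print(endpoint["VpcEndpoint"]["VpcEndpointId"])
```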