
Senior DevOps Engineer / Python / GitHub / DevOps / AWS Terraform
Posted 1 week ago by Thrive IT Systems Ltd
Negotiable
Fixed-Term
Hybrid
Milton Keynes, UK
Summary: The Senior DevOps Engineer role focuses on leveraging Python, GitHub, and Terraform on AWS to manage and automate cloud resources. The position requires expertise in writing and maintaining Python code, deploying infrastructure as code, and integrating various AWS services. The role is based in Milton Keynes, involves a mix of on-site and remote work, and is a 12-month fixed-term contract.
Key Responsibilities:
- Strong experience with and expertise in Python, along with GitHub and DevOps.
- Must have experience writing, testing, and maintaining Python code.
- Deploy and manage cloud resources using Terraform, following the Infrastructure as Code (IaC) best practices established in the Project Max AWS account.
- Automate provisioning of infrastructure across environments (dev, staging, prod) using the existing Terraform pipeline and new ones where needed.
- The existing VPC and subnets will be used; the GitHub repo has already been created.
- Build the approved Lambda authorizer infrastructure with an NLB, API Gateway, Lambda, and S3 for the authorizer configuration. The authorizer configuration will use a lambda.zip file that is already available and provided by the group. The implementation will follow the same path as the PoC ([Identity-Preprod] APIGateway - Lambda POC - Container Service Infrastructure (CSI) - Confluence). HashiCorp Vault will be used instead of Secrets Manager; the HashiCorp integration with Project Max already exists.
- AWS Ping will be used for authorization; the integration with Project Max is already in place.
- A Lambda service will be created to provide pre-signed URLs for the S3 request and response buckets. UiPath will use those pre-signed URLs to upload and download the documents (see the pre-signed URL sketch after this list).
- A Lambda service (Doc Pre-processing) should be triggered once documents are placed in the request S3 bucket; it should split the document files into pages, convert them to Base64, and trigger a Step Functions workflow (a pre-processing sketch follows this list).
- The first Lambda (OCR) in the step function will call the document processing API in GenAI Hub with an access token fetched from HashiCorp Vault. The Base64 version of the document will be passed in the payload, and the service will be called in a loop until all pages are processed. The GenAI Hub API will return its responses in JSON format, and the Lambda will correlate all of them (see the OCR sketch after this list).
- A Lambda service (Doc Validation) will then use the response from the previous call and send a JSON request to the FM (Foundation Model) API in GenAI Hub, which uses GPT for validation. The prompt and the JSON will be part of the payload. Based on the JSON response, it will store the result in the response S3 bucket in the agreed format (a validation sketch follows this list).
- If doc validation is successful, the next Lambda (Budget Planner) will create a pre-defined Excel file based on custom calculations and the earlier responses (OCR and Doc Validation). It will store the file in the response S3 bucket and send a notification via SNS to tell UiPath to download the Excel and response files.
- Any failure in any step of the Step Functions workflow must be reported to UiPath via SNS (see the failure-handler sketch after this list).
- The metadata of each file-processing run should be stored in RDS.
- Set up and manage VPC Interface Endpoints (AWS PrivateLink) to securely connect to AWS services from within the VPC. S3, RDS, and SNS will be accessed via VPC endpoints. Ensure secure access to S3 using IAM, bucket policies, and VPC endpoints. Existing VPC endpoints can also be reused.
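
The sketches below illustrate, in Python, one way the Lambda services above could be shaped. They are minimal sketches only: bucket names, environment variables, endpoint URLs, and payload shapes are assumptions, since the real values live in the Project Max environment. First, the pre-signed URL service that lets UiPath upload and download documents without direct S3 credentials:

```python
import json
import os

import boto3

s3 = boto3.client("s3")

# Bucket names are illustrative; the real request/response buckets would
# come from the Project Max Terraform outputs.
REQUEST_BUCKET = os.environ.get("REQUEST_BUCKET", "projectmax-request")
RESPONSE_BUCKET = os.environ.get("RESPONSE_BUCKET", "projectmax-response")


def handler(event, context):
    """Return pre-signed URLs for uploading to the request bucket and
    downloading from the response bucket."""
    key = event["key"]  # object key supplied by the caller
    upload_url = s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": REQUEST_BUCKET, "Key": key},
        ExpiresIn=900,  # 15 minutes
    )
    download_url = s3.generate_presigned_url(
        "get_object",
        Params={"Bucket": RESPONSE_BUCKET, "Key": key},
        ExpiresIn=900,
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"uploadUrl": upload_url, "downloadUrl": download_url}),
    }
```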
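
A possible shape for the Doc Pre-processing Lambda follows. It assumes PDF input and uses pypdf for the page split (the posting does not name a format or library). Note that embedding every Base64 page in the execution input would hit Step Functions' 256 KB input limit for large documents, in which case pages would be staged in S3 and passed by reference instead:

```python
import base64
import io
import json
import os

import boto3
from pypdf import PdfReader, PdfWriter  # assumed splitting library

s3 = boto3.client("s3")
sfn = boto3.client("stepfunctions")

STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]  # hypothetical env var


def handler(event, context):
    """Triggered by an S3 put on the request bucket: split the document
    into pages, Base64-encode each page, and start the step function."""
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    reader = PdfReader(io.BytesIO(body))

    pages = []
    for page in reader.pages:
        writer = PdfWriter()
        writer.add_page(page)
        buf = io.BytesIO()
        writer.write(buf)
        pages.append(base64.b64encode(buf.getvalue()).decode("ascii"))

    sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"bucket": bucket, "key": key, "pages": pages}),
    )
```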
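
A sketch of the OCR Lambda, the first state in the step function. The Vault address, secret path, and GenAI Hub URL are placeholders; the helper reads a KV v2 secret over Vault's HTTP API, and the handler then calls the document processing API once per Base64 page and correlates the JSON replies:

```python
import json
import os
import urllib.request

VAULT_ADDR = os.environ["VAULT_ADDR"]
VAULT_TOKEN = os.environ["VAULT_TOKEN"]      # auth method is an assumption
GENAI_OCR_URL = os.environ["GENAI_OCR_URL"]  # placeholder endpoint


def _vault_secret(path: str) -> dict:
    """Read a KV v2 secret from HashiCorp Vault over its HTTP API."""
    req = urllib.request.Request(
        f"{VAULT_ADDR}/v1/{path}", headers={"X-Vault-Token": VAULT_TOKEN}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["data"]["data"]


def handler(event, context):
    """Call the GenAI Hub document processing API once per page and
    correlate the JSON responses for the next state."""
    secret = _vault_secret("secret/data/projectmax/genai")  # path is assumed
    access_token = secret["access_token"]

    results = []
    for idx, page_b64 in enumerate(event["pages"]):
        req = urllib.request.Request(
            GENAI_OCR_URL,
            data=json.dumps({"page": idx, "document": page_b64}).encode(),
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json",
            },
        )
        with urllib.request.urlopen(req) as resp:
            results.append(json.loads(resp.read()))

    # Correlated OCR output is handed to the Doc Validation state.
    return {"bucket": event["bucket"], "key": event["key"], "ocr": results}
```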
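
The Doc Validation Lambda could follow the same pattern: send the correlated OCR output plus the validation prompt to the Foundation Model API and persist the verdict in the response bucket. The endpoint, prompt, output key, and token handling (here an environment variable; in the real flow it would come from Vault as above) are all assumptions:

```python
import json
import os
import urllib.request

import boto3

s3 = boto3.client("s3")

GENAI_FM_URL = os.environ["GENAI_FM_URL"]        # placeholder endpoint
RESPONSE_BUCKET = os.environ["RESPONSE_BUCKET"]
ACCESS_TOKEN = os.environ["GENAI_ACCESS_TOKEN"]  # fetched from Vault in practice

# The real validation prompt is project-specific.
PROMPT = "Validate the extracted fields and reply in the agreed JSON format."


def handler(event, context):
    """Send prompt + OCR JSON to the FM API (GPT) and store the verdict
    in the response bucket in the agreed format."""
    req = urllib.request.Request(
        GENAI_FM_URL,
        data=json.dumps({"prompt": PROMPT, "data": event["ocr"]}).encode(),
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        verdict = json.loads(resp.read())

    s3.put_object(
        Bucket=RESPONSE_BUCKET,
        Key=f"{event['key']}.validation.json",  # naming convention assumed
        Body=json.dumps(verdict).encode(),
    )
    return {**event, "validation": verdict}
```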
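
For the failure-reporting requirement, one common pattern is a single NotifyFailure state that every step routes to with a Catch block in the state machine definition (e.g. "Catch": [{"ErrorEquals": ["States.ALL"], "Next": "NotifyFailure"}]), backed by a small Lambda that publishes to the SNS topic UiPath subscribes to. The topic variable and message shape are illustrative:

```python
import json
import os

import boto3

sns = boto3.client("sns")

TOPIC_ARN = os.environ["UIPATH_TOPIC_ARN"]  # hypothetical topic name


def handler(event, context):
    """Publish the error details that Step Functions' Catch placed in the
    state (Error/Cause) so UiPath learns about the failed run."""
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="Document processing failed",
        Message=json.dumps({
            "error": event.get("Error"),
            "cause": event.get("Cause"),
            "document": event.get("key"),
        }),
    )
    return event
```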
Key Skills:
- Expertise in Python programming
- Experience with GitHub and DevOps practices
- Proficiency in AWS services and in Terraform
- Knowledge of Infrastructure as Code (IaC) best practices
- Experience with AWS Lambda, API Gateway, and S3
- Familiarity with VPC, RDS, and IAM policies
- Ability to automate infrastructure provisioning
- Experience with document processing and integration with GenAI Hub
Salary (Rate): undetermined
City: Milton Keynes
Country: UK
Working Arrangements: hybrid
IR35 Status: fixed-term
Seniority Level: Senior
Industry: IT
Position: Senior DevOps Engineer
Location: Milton Keynes, UK (2-3 days onsite)
Duration: 12-month fixed-term contract
Key Search: Python along with GitHub & DevOps, AWS Terraform.
