Actively looking for an AWS Data Engineer

Posted Today by Arbor Tek Systems

Negotiable
Undetermined
Remote

Summary: The role of AWS Data Engineer requires an experienced professional with over 12 years in the field to work on a long-term remote contract. The position focuses on building and maintaining scalable data pipelines and implementing CI/CD processes. The engineer will collaborate with architects and data teams to ensure the delivery of robust data products. Key responsibilities include developing ingestion and transformation pipelines using various AWS services.

Key Responsibilities:

  • Build and maintain scalable, secure data pipelines
  • Develop ingestion and transformation pipelines using AWS Glue, Lambda, Redshift
  • Implement CI/CD pipelines for automated build, test, and deployment
  • Design automated testing frameworks for data quality and performance
  • Set up monitoring and observability using CloudWatch and custom metrics
  • Collaborate with architects and data teams to deliver robust data products
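To give a flavor of the "ingestion and transformation" and "data quality" bullets above, here is a minimal plain-Python sketch. The function names and record shape are illustrative assumptions, not from the posting; in the role described, this kind of step would typically run inside an AWS Glue job.

```python
def transform(records):
    """Normalize raw ingested records: trim/lowercase names, cast amounts to float.
    Record shape ({"id", "name", "amount"}) is a hypothetical example."""
    out = []
    for r in records:
        out.append({
            "id": r["id"],
            "name": r["name"].strip().lower(),
            "amount": float(r["amount"]),
        })
    return out

def quality_check(records):
    """Basic data-quality gate of the kind an automated testing framework
    might enforce: ids must be unique, amounts non-negative."""
    ids = [r["id"] for r in records]
    assert len(ids) == len(set(ids)), "duplicate ids"
    assert all(r["amount"] >= 0 for r in records), "negative amount"
    return True
```

In practice the transform would operate on Glue DynamicFrames or Spark DataFrames rather than Python dicts, and quality failures would emit CloudWatch metrics instead of raising assertions.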

Key Skills:

  • Python, SQL, Shell Scripting
  • AWS (Glue, Redshift, S3, Lambda, CloudWatch)
  • Terraform, Jenkins, AWS CodePipeline
  • ETL/ELT and Data Integration expertise
  • Strong problem-solving and communication skills

Salary (Rate): undetermined

City: undetermined

Country: undetermined

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Role: AWS Data Engineer
Location: Remote

We are looking for an experienced AWS Data Engineer with 12+ years of experience for a long-term remote contract opportunity.

Role Highlights:

  • Build and maintain scalable, secure data pipelines
  • Develop ingestion and transformation pipelines using AWS Glue, Lambda, Redshift
  • Implement CI/CD pipelines for automated build, test, and deployment
  • Design automated testing frameworks for data quality and performance
  • Set up monitoring and observability using CloudWatch and custom metrics
  • Collaborate with architects and data teams to deliver robust data products

Required Skills:

  • Python, SQL, Shell Scripting
  • AWS (Glue, Redshift, S3, Lambda, CloudWatch)
  • Terraform, Jenkins, AWS CodePipeline
  • ETL/ELT and Data Integration expertise
  • Strong problem-solving and communication skills
If you're interested or know someone who'd be a great fit, send your resume.