Data Engineer with data warehouses, data lakes & ETL

Posted 1 day ago by 1765258443

Negotiable
Outside
Remote
USA

Summary: The role of AWS Data Engineer requires a seasoned professional with 6-8 years of experience in data warehouses, data lakes, and ETL pipelines. The candidate will be responsible for building optimized data pipelines, orchestrating workflows, and implementing CI/CD processes, primarily using tools like Snowflake, Apache Airflow, and AWS services. Strong problem-solving skills and the ability to communicate technical information effectively are essential for collaborating with cross-functional teams. The position is remote and classified as outside IR35.

Key Responsibilities:

  • Build optimized data pipelines using Snowflake and dbt.
  • Orchestrate data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows (a brief sketch follows this list).
  • Utilize AWS cloud services such as EKS, ECS, S3, RDS, and IAM.
  • Design and implement CI/CD workflows using GitHub Actions, Codeship, Jenkins, etc.
  • Work with tools like Terraform, Docker, and Kafka.
  • Develop Spark applications using Scala and Python.
  • Execute advanced SQL queries and work with various relational databases like Redshift and Postgres.
  • Architect scalable data platforms and applications for large enterprise clients.
  • Focus on building high-performance systems and data quality frameworks.
  • Identify and resolve data engineering issues and system failures.
  • Communicate technical information to non-technical stakeholders and collaborate with cross-functional teams.
  • Envision and construct scalable solutions for enterprise clients.
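To make the Snowflake, dbt, and Airflow responsibilities above concrete, here is a minimal, hypothetical sketch of a dbt run orchestrated by an Airflow DAG. The DAG id, schedule, and dbt project path are illustrative assumptions, not details from this posting.

```python
# Minimal sketch (not from the posting) of the Airflow + dbt + Snowflake pattern.
# DAG id, schedule, and the dbt project path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_snowflake_dbt_refresh",   # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # dbt compiles and runs SQL models against the Snowflake warehouse
    # configured in profiles.yml; Airflow handles scheduling and retries.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics",
    )

    dbt_run >> dbt_test  # run the models first, then the data-quality tests
```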

Key Skills:

  • 6-8 years of experience in data engineering, data warehouses, and data lakes.
  • Proven experience with Snowflake and dbt.
  • Expertise in Apache Airflow for orchestrating data pipelines.
  • Proficiency in AWS services (EKS, ECS, S3, RDS, IAM).
  • Experience with CI/CD workflows (GitHub Actions, Codeship, Jenkins).
  • Familiarity with Terraform, Docker, and Kafka.
  • Strong experience with Spark using Scala and Python (see the PySpark sketch after this list).
  • Advanced SQL knowledge and experience with relational databases.
  • Experience in data modeling and system design.
  • Strong problem-solving and troubleshooting skills.
  • Excellent communication skills for cross-functional collaboration.
  • Ability to construct scalable solutions for enterprise clients.
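As a brief illustration of the Spark and data-lake skills listed above, the following PySpark sketch reads raw data from a lake location, aggregates it, and writes a curated output. The bucket, paths, and column names are hypothetical, not taken from this posting.

```python
# Minimal PySpark sketch; S3 paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_daily_rollup").getOrCreate()

# Read raw events from a data-lake location (placeholder S3 path).
orders = spark.read.parquet("s3a://example-data-lake/raw/orders/")

# Aggregate to a daily revenue summary.
daily = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("order_count"))
)

# Write the curated table back to the lake, partitioned by date.
daily.write.mode("overwrite").partitionBy("order_date") \
    .parquet("s3a://example-data-lake/curated/daily_orders/")

spark.stop()
```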

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:
AWS Data Engineer
Remote
Candidates must take a Glider assessment test
  • At least 6-8 years of experience working with data warehouses, data lakes, and ETL pipelines
  • Proven experience building optimized data pipelines using Snowflake and dbt
  • Expert in orchestrating data pipelines using Apache Airflow, including authoring, scheduling, and monitoring workflows
  • Exposure to AWS and proficiency in cloud services such as EKS (Kubernetes), ECS, S3, RDS, IAM, etc.
  • Experience designing and implementing CI/CD workflows using GitHub Actions, Codeship, Jenkins, etc.
  • Experience with tools like Terraform, Docker, and Kafka
  • Strong experience with Spark using Scala and Python
  • Advanced SQL knowledge, with experience writing and authoring complex queries, and strong familiarity with Snowflake and relational databases such as Redshift and Postgres (a brief query sketch follows below)
  • Experience with data modeling and system design, architecting scalable data platforms and applications for large enterprise clients
  • A dedicated focus on building high-performance systems
  • Exposure to building data quality frameworks
  • Strong problem-solving and troubleshooting skills, with the ability to identify and resolve data engineering issues and system failures
  • Excellent communication skills, with the ability to communicate technical information to non-technical stakeholders and collaborate effectively with cross-functional teams
  • The ability to envision and construct scalable solutions that meet diverse needs for enterprise clients with dedicated data teams
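As a small illustration of the advanced SQL and Snowflake requirements above, the sketch below runs a window-function query through the Snowflake Python connector. The connection parameters, warehouse, database, and table/column names are placeholders, not details from this posting.

```python
# Minimal sketch of querying Snowflake from Python; all identifiers below
# (warehouse, database, schema, table, columns) are hypothetical.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",   # hypothetical warehouse
    database="ANALYTICS",       # hypothetical database
    schema="MARTS",
)

# Window-function example: latest order per customer (Snowflake QUALIFY clause).
query = """
    SELECT customer_id, order_id, order_ts
    FROM orders
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY customer_id ORDER BY order_ts DESC
    ) = 1
"""

cur = conn.cursor()
try:
    cur.execute(query)
    for customer_id, order_id, order_ts in cur.fetchmany(10):
        print(customer_id, order_id, order_ts)
finally:
    cur.close()
    conn.close()
```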