Senior Data Engineer

Posted 2 days ago

Summary: The Senior Data Engineer role involves developing and maintaining data ingestion pipelines, focusing on scalable infrastructure and data integrity. The position requires hands-on expertise in Snowflake, SQL, and Python, along with experience in AWS services and ETL/ELT processes. The role is remote with quarterly travel for PI planning and is classified as outside IR35. This is a 6-month contract-to-hire position.

Salary (Rate): $65 per hour (W2)

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: Senior

Industry: IT

Job Title: Senior Data Engineer

Location: Remote (CST or EST) with quarterly travel for PI planning
Duration: 6-month contract-to-hire
Pay Rate: $65 per hour on W2, no benefits
Need: / USC
We are looking for a hands-on Senior Data Engineer with expertise in developing data ingestion pipelines. This role is central to designing, building, and maintaining our data infrastructure, with a focus on creating scalable pipelines, ensuring data integrity, and optimizing performance.

Key skills must include strong Snowflake expertise, advanced SQL proficiency, data extraction from APIs using Python and AWS Lambda, and experience with ETL/ELT processes. Workflow automation using Airflow on AWS is essential, and experience with Fivetran and dbt is a plus.
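
As a rough illustration only, the API-to-S3 extraction pattern described above might look like the minimal Lambda handler below. The endpoint URL, environment variable names, and bucket are invented placeholders, not the client's actual configuration:

    # Minimal sketch: pull records from a (hypothetical) REST API and land
    # the raw JSON in S3 so Snowflake can ingest it from an external stage.
    import json
    import os
    import urllib.request
    from datetime import datetime, timezone

    import boto3  # bundled with the AWS Lambda Python runtime

    s3 = boto3.client("s3")

    def handler(event, context):
        # Placeholder endpoint; a real pipeline would also handle paging,
        # auth, and retries.
        url = os.environ.get("SOURCE_API_URL", "https://api.example.com/v1/records")
        with urllib.request.urlopen(url, timeout=30) as resp:
            records = json.load(resp)

        # Partition the landing path by load timestamp.
        key = "raw/records/" + datetime.now(timezone.utc).strftime("%Y/%m/%d/%H%M%S") + ".json"
        s3.put_object(
            Bucket=os.environ.get("LANDING_BUCKET", "example-landing-bucket"),
            Key=key,
            Body=json.dumps(records).encode("utf-8"),
        )
        return {"written": key, "record_count": len(records)}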

Key Responsibilities:

  • Design, build, and maintain scalable data pipelines using Python and SQL.

  • Maintain and optimize Snowflake data warehouse performance.

  • Extract data from APIs using Python and AWS Lambda.

  • Automate workflows with Airflow on AWS (see the DAG sketch after this list).

  • Collaborate with data engineers and architects to develop or optimize pipelines.

  • Maintain code through CI/CD processes in Azure DevOps.
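
In outline, orchestrating such a pipeline as an Airflow DAG might look like the sketch below; the DAG id, schedule, and task callables are illustrative placeholders rather than the team's actual pipeline:

    # Hedged sketch of a two-step extract-then-load DAG.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("call the source API, land raw files in S3")

    def load():
        print("COPY the staged files into Snowflake, then transform")

    with DAG(
        dag_id="example_ingestion_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # "schedule_interval" on Airflow versions before 2.4
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task  # extract must finish before load runs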

Key Skills:

  • Snowflake, SQL, Python (see the Snowflake load sketch after this list)

  • AWS services: Lambda, Airflow, Glue, S3, SNS

  • ETL/ELT processes and data ingestion pipelines

  • Experience with Fivetran and dbt is a plus

  • Strong problem-solving and communication skills
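
To make the Snowflake side concrete, here is a hedged sketch of loading staged files into a table with the snowflake-connector-python package. The account, credentials, stage, and table names are placeholders, and an external stage over the S3 landing prefix is assumed:

    # Illustrative only: COPY staged JSON files into a Snowflake table.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="...",  # in practice, fetch from a secrets manager
        warehouse="LOAD_WH",
        database="RAW",
        schema="INGEST",
    )
    try:
        with conn.cursor() as cur:
            # Assumes @records_stage already points at the S3 landing prefix.
            cur.execute(
                """
                COPY INTO raw_records
                FROM @records_stage
                FILE_FORMAT = (TYPE = 'JSON')
                ON_ERROR = 'ABORT_STATEMENT'
                """
            )
    finally:
        conn.close()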

Qualifications:

  • 8+ years of experience in data engineering roles

  • Hands-on experience building and implementing scalable pipelines

  • Highly self-motivated and detail-oriented