Snowflake Data Engineer

Posted 4 days ago

Rate: Negotiable
IR35 Status: Outside
Working Arrangement: Remote
Country: USA

Summary: We are looking for a Senior Data Engineer specializing in Snowflake, DBT, and Airflow to develop and optimize cloud data platforms. The role involves designing ETL/ELT pipelines, managing data across various layers, and ensuring compliance with data governance standards. The ideal candidate will have extensive experience in data engineering and a strong understanding of data modeling and orchestration frameworks. This position is fully remote and requires a minimum of three years of US work experience.

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Title: Snowflake Data Engineer

Type: Remote

Visa: OPT

Primary Skills: Snowflake, DBT, Python & Airflow

Interview: 2-3 rounds

We are seeking a Senior Data Engineer with strong expertise in Snowflake, DBT, and Airflow to design, build, and optimize modern cloud data platforms. The ideal candidate has hands-on experience with ETL/ELT pipelines, data modeling, and orchestration frameworks, and can translate business requirements into scalable, secure, and performant data solutions.

Key Responsibilities:

  • Design and implement end-to-end data pipelines using Snowflake, DBT, Airflow, and Python.
  • Ingest data from diverse sources (transaction systems, APIs, streaming platforms like Kafka) into cloud data platforms (AWS, Azure, Google Cloud Platform).
  • Manage data across bronze, silver, and gold layers, ensuring scalability, lineage, and compliance.
  • Develop and optimize data models (star schema, Data Vault, SCD handling) for analytics and reporting use cases.
  • Build reusable frameworks for data ingestion, transformation, and quality checks.
  • Define and manage Airflow DAGs with dependencies, retries, SLAs, and monitoring (see the sketch after this list).
  • Implement CI/CD pipelines for DBT and Airflow jobs using GitLab/Jenkins.
  • Ensure data governance, masking, encryption, and regulatory compliance (HIPAA, GDPR, SOX, PCI DSS).
  • Collaborate with cross-functional teams (Data Architects, BI Developers, Data Scientists) to deliver business-ready datasets.
  • Troubleshoot pipeline failures, optimize query performance, and ensure 99.9% pipeline reliability.
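
For orientation only, a minimal sketch of the kind of DAG the Airflow item above describes, with explicit task dependencies, retries, an SLA, and a failure callback for alerting. It assumes a recent Airflow 2.x; the DAG id, schedule, and task bodies are hypothetical placeholders, not taken from this posting.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_orders(**context):
        # Placeholder ingest step: pull raw data from a source system into the bronze layer.
        print("extracting orders for", context["ds"])


    def run_dbt_models(**context):
        # Placeholder transform step: trigger the DBT models that build silver/gold tables.
        print("running dbt build")


    def notify_on_failure(context):
        # Placeholder alerting hook (Slack, PagerDuty, etc. in a real pipeline).
        print(f"task {context['task_instance'].task_id} failed")


    default_args = {
        "owner": "data-engineering",
        "retries": 2,                              # automatic retries on transient failures
        "retry_delay": timedelta(minutes=5),
        "sla": timedelta(hours=1),                 # flag tasks that run past their SLA
        "on_failure_callback": notify_on_failure,  # monitoring/alerting hook
    }

    with DAG(
        dag_id="orders_elt",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
        transform = PythonOperator(task_id="run_dbt_models", python_callable=run_dbt_models)

        extract >> transform  # ingestion must finish before transformations run

In practice the transform task would more likely invoke DBT through a BashOperator running dbt build or a dbt Cloud job rather than a plain Python callable.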

Required Skills & Experience:

  • 6+ years of professional experience in Data Engineering.
  • Hands-on expertise with Snowflake (warehousing, task automation, RBAC, query optimization).
  • Proficiency in DBT (models, macros, testing, documentation, dbt Cloud/CLI).
  • Strong experience with Apache Airflow (DAG design, custom operators, retries, SLAs, alerting).
  • Advanced SQL and Python programming skills.
  • Experience with AWS services (S3, Glue, Lambda, EMR, RDS, Kinesis, MSK).
  • Familiarity with streaming technologies (Kafka, Spark Streaming).
  • Strong understanding of data modeling, ETL/ELT frameworks, and pipeline orchestration (an SCD Type 2 sketch follows this list).
  • Knowledge of DevOps practices (CI/CD, Terraform, Docker, GitLab/Jenkins).
  • Excellent problem-solving and communication skills; ability to explain technical solutions in business terms.
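
For orientation only, a minimal sketch of the SQL-plus-Python work implied by the skills above: a two-step SCD Type 2 load against Snowflake, assuming the snowflake-connector-python package. The dim_customer/stg_customer tables, columns, and connection settings are hypothetical placeholders; in a DBT-based stack this pattern is normally handled by DBT snapshots.

    import os

    import snowflake.connector

    # Step 1: close out the currently-active row for any key whose tracked
    # attribute changed in the staging data.
    EXPIRE_CHANGED_ROWS = """
    UPDATE dim_customer tgt
    SET is_current = FALSE,
        valid_to   = CURRENT_TIMESTAMP()
    FROM stg_customer src
    WHERE tgt.customer_id = src.customer_id
      AND tgt.is_current = TRUE
      AND tgt.address <> src.address
    """

    # Step 2: insert a fresh current row for new keys and for keys expired in step 1.
    INSERT_NEW_VERSIONS = """
    INSERT INTO dim_customer (customer_id, address, valid_from, valid_to, is_current)
    SELECT src.customer_id, src.address, CURRENT_TIMESTAMP(), NULL, TRUE
    FROM stg_customer src
    LEFT JOIN dim_customer tgt
      ON tgt.customer_id = src.customer_id AND tgt.is_current = TRUE
    WHERE tgt.customer_id IS NULL
    """


    def run_scd2_load() -> None:
        # Credentials come from the environment here; a production pipeline would use
        # a secrets backend and key-pair authentication instead of a password.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="TRANSFORM_WH",
            database="ANALYTICS",
            schema="GOLD",
        )
        try:
            cur = conn.cursor()
            cur.execute(EXPIRE_CHANGED_ROWS)
            cur.execute(INSERT_NEW_VERSIONS)
        finally:
            conn.close()

The two statements are shown separately for clarity; a production load would wrap them in a transaction or delegate the versioning to DBT snapshots.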

Preferred Qualifications:

  • Prior experience in Healthcare, Finance, or Retail domains.
  • Familiarity with Redshift, Databricks, or BigQuery.
  • Exposure to data governance tools (Collibra, Alation).
  • Experience with real-time analytics use cases (fraud detection, patient monitoring, stockout reduction).
  • Certification in Snowflake, AWS, or DBT.

Note: A minimum of 3 years of US experience is required.

If interested, please share your resume at

#Snowflake #DBT #Python #AWS #W2 #Immediate_Join #Urgent