ETL Developer

Posted 1 day ago

Summary: The ETL Developer role involves designing, developing, and maintaining ETL pipelines to facilitate data integration from various sources into enterprise data warehouses. The position requires collaboration with stakeholders to gather requirements and ensure data quality, while also optimizing data workflows using various ETL tools. Candidates should have extensive experience in ETL development and a strong understanding of data warehousing concepts. The role is primarily remote but may also be based in New York, New York.

Salary (Rate): negotiable

City: New York

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Position: ETL Developer

Contract: W2 Only

Responsibilities

  • Design, develop, and maintain ETL pipelines to extract, transform, and load data from diverse sources into enterprise data warehouses.

  • Collaborate with business stakeholders and data analysts to gather requirements and define data integration objectives.

  • Develop and optimize data workflows using tools like SSIS, Azure Data Factory, Databricks, or Informatica.

  • Implement data quality checks, validations, and transformation logic to ensure accuracy and consistency.

  • Build and maintain data models (star schema, snowflake, normalized) to support reporting and analytics needs.

  • Work with large-scale datasets across relational databases, cloud data warehouses, and big data platforms.

  • Monitor, troubleshoot, and resolve ETL failures, performance issues, and bottlenecks.

  • Apply version control (Git), CI/CD pipelines, and DevOps practices for ETL development and deployment.

  • Ensure adherence to data governance, security, and compliance standards in all ETL processes.

  • Collaborate with BI developers, data engineers, and architects to deliver scalable data integration solutions.

Required Skills

  • 8–10+ years of experience in ETL development, data integration, or data engineering.

  • Strong expertise in ETL tools (SSIS, Azure Data Factory, Informatica, Databricks, or equivalent).

  • Proficiency in SQL and relational database systems (SQL Server, Oracle, PostgreSQL, etc.).

  • Solid understanding of data warehousing concepts (star schema, slowly changing dimensions, fact/dimension modeling).

  • Experience with performance tuning of ETL workflows and SQL queries.

  • Familiarity with cloud data platforms (Azure Synapse, AWS Redshift, Snowflake, Google BigQuery).

  • Knowledge of scheduling, monitoring, and automation tools for ETL jobs.

  • Experience with Git-based workflows, DevOps, and CI/CD automation.

  • Strong understanding of data quality, governance, and security best practices.

Nice-to-Have

  • Exposure to big data frameworks (Spark, Hadoop, Databricks).

  • Hands-on experience with Python, Scala, or shell scripting for ETL automation.

  • Knowledge of real-time/streaming ETL (Kafka, Event Hub, Kinesis).

  • Experience with containerization and orchestration (Docker, Kubernetes).

  • Familiarity with Terraform, Jenkins, GitHub Actions, or other automation tools.

Soft Skills

  • Strong analytical and problem-solving abilities with attention to detail.

  • Excellent communication skills to collaborate across technical and business teams.

  • Ability to work in Agile or hybrid Agile/Waterfall environments.

  • Self-motivated with a passion for continuous learning and improvement.

  • Commitment to data integrity, scalability, and reliability in all solutions.