Summary: The ETL Developer role involves designing, developing, and maintaining ETL processes that move data from various sources into data warehouses or data lakes. The position requires collaboration with business analysts, data architects, and other stakeholders to gather requirements and to ensure data quality and integrity. The developer will also optimize workflows, troubleshoot issues, and implement data governance standards. The role is primarily remote but may also be based in Texas City, Texas.
Salary (Rate): Negotiable
City: Texas City
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Position: ETL Developer
Contract: W2 Only
Responsibilities
Design, develop, and maintain ETL processes for extracting, transforming, and loading data from multiple sources into data warehouses or data lakes.
Collaborate with business analysts, data architects, and stakeholders to gather requirements and translate them into ETL solutions.
Optimize ETL workflows for performance, scalability, and reliability.
Ensure data quality, consistency, and integrity across multiple systems and environments.
Develop reusable ETL components and maintain technical documentation.
Monitor, troubleshoot, and resolve ETL job failures, performance issues, and data discrepancies.
Support data integration, migration, and modernization initiatives.
Implement data security, compliance, and governance standards within ETL pipelines.
Participate in Agile ceremonies including sprint planning, daily standups, and retrospectives.
Stay current with emerging ETL tools, cloud-based platforms, and best practices.
Required Skills
10+ years of experience as an ETL Developer in enterprise environments.
Proficiency in ETL tools such as Informatica, Talend, DataStage, SSIS, or Pentaho.
Strong SQL skills for data analysis, transformations, and query optimization.
Solid understanding of relational databases (Oracle, SQL Server, PostgreSQL) and data modeling concepts.
Experience with data warehousing platforms (Snowflake, Redshift, BigQuery, Synapse).
Knowledge of scripting languages (Python, Shell, or Perl) for automation.
Proficiency in debugging, performance tuning, and root cause analysis of ETL processes.
Familiarity with version control (Git) and CI/CD for ETL deployments.
Experience with cloud platforms (AWS, Azure, or Google Cloud Platform) and their data services.
Understanding of data governance, lineage, and compliance practices.
Nice-to-Have
Experience with real-time/streaming ETL using Kafka, Kinesis, or Spark Streaming.
Exposure to big data ecosystems (Hadoop, Hive, Spark).
Familiarity with data catalog and metadata management tools.
Experience with API integration and web services.
Knowledge of DevOps tools for deployment automation (Jenkins, Ansible).
Exposure to Master Data Management (MDM) solutions.
Soft Skills
Strong analytical and problem-solving skills with attention to detail.
Excellent communication and collaboration skills with business and technical teams.
Ability to work independently and manage multiple priorities.
Detail-oriented mindset with a focus on delivering high-quality, reliable ETL solutions.
Proactive learner with a passion for continuous improvement in data engineering practices.