ETL DataStage with Teradata

Posted 5 days ago by Zuplon

Negotiable
Remote

Summary: We're seeking an experienced ETL DataStage Developer with a strong background in Teradata, Azure, Databricks, and Lakehouse technologies. The role involves designing, developing, and implementing data integration solutions while ensuring data quality and performance. The ideal candidate will collaborate with cross-functional teams to meet data requirements and optimize existing workflows. This position is fully remote and requires extensive experience in data integration technologies.

Key Responsibilities:

  • Design, develop, and implement ETL processes using DataStage and Teradata utilities (BTEQ, TPT, etc.)
  • Develop and maintain complex SQL queries, stored procedures, and macros in Teradata
  • Work with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks for data processing and storage
  • Implement data pipelines using Azure Data Factory, Databricks, and Lakehouse technologies
  • Collaborate with cross-functional teams to gather data requirements and implement solutions
  • Optimize and enhance existing ETL workflows for improved performance and reliability
  • Ensure data quality, integrity, and security

Key Skills:

  • 10+ years of experience as a DataStage and Teradata Developer
  • Strong understanding of Teradata architecture and utilities (BTEQ, TPT, etc.)
  • Experience with Azure Data Lake Storage (ADLS), Azure Synapse Analytics, and Databricks
  • Proficiency in SQL, scripting (Unix shell, Python, etc.), and data modeling
  • Familiarity with Lakehouse architecture and technologies
  • Experience with Agile methodologies and version control systems (Git, etc.)

Salary (Rate): undetermined

City: undetermined

Country: undetermined

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Job Title: ETL DataStage with Teradata

Location: Remote (CST) (W2 Only)

Preferred Skills:

- Experience with IBM DataStage (InfoSphere Information Server, etc.)

- Knowledge of data warehousing concepts and ETL methodologies

- Strong analytical and problem-solving skills

- Familiarity with Azure DevOps and CI/CD pipelines