Rate: Negotiable
IR35 Status: Outside
Working Arrangements: Remote
Country: USA
Summary: The Data Engineer role focuses on leveraging Snowflake and Matillion ETL to design, build, and maintain scalable data pipelines. The position requires expertise in data modeling, transformation, and cloud technologies to ensure high-quality data for decision-making. The candidate will collaborate with cross-functional teams to deliver effective data solutions while optimizing performance and ensuring data integrity. This is a 6-month contract-to-hire position that is fully remote.
Key Responsibilities:
- Design and Build Data Pipelines: Create, implement, and optimize ETL/ELT pipelines using Matillion for ingesting and processing data from multiple sources into Snowflake.
- Snowflake Architecture Design: Develop and maintain scalable Snowflake environments, including database design, warehouse management, and performance tuning to support complex queries.
- Data Transformation: Develop and maintain robust transformations in Matillion to ensure that raw data is cleansed, enriched, and modeled for business consumption.
- Data Integration: Collaborate with stakeholders to integrate a variety of data sources (e.g., APIs, flat files, databases) into Snowflake and ensure data is stored accurately and securely.
- Performance Optimization: Monitor and manage performance of ETL pipelines, optimize the use of compute resources in Snowflake, and improve query performance to enhance speed and efficiency.
- Collaboration: Work closely with cross-functional teams, including Data Scientists, Analysts, and Developers, to deliver data solutions that meet business needs.
- Quality Assurance: Ensure data quality through rigorous testing, validation, and adherence to best practices for data governance and security requirements.
- Documentation: Maintain detailed documentation of ETL workflows, data models, and processes to ensure transparency and facilitate support.
Key Skills:
- 7+ years of experience working as a Data Engineer
- Expertise in Snowflake: Strong experience with Snowflake data warehouse platform, including architecture, performance tuning, and security.
- Matillion ETL Expertise: Hands-on experience with Matillion for developing and managing scalable data pipelines.
- SQL Proficiency: Advanced proficiency in SQL, including query optimization and debugging.
- Data Modeling: Strong knowledge of dimensional and relational database modeling principles for building efficient Snowflake databases.
- Cloud Technologies: Experience with cloud platforms such as AWS, Azure, or Google Cloud (Azure preferred but not required), including relevant services (e.g., S3, Lambda, Data Factory, BigQuery).
- Experience with APIs and Data Integration: Familiarity with integrating REST APIs and other data sources into the Snowflake ecosystem.
- Pipeline Automation: Knowledge of pipeline orchestration tools or workflows (e.g., Airflow, dbt).
- Problem-Solving Skills: Ability to troubleshoot data-related issues, identify root causes, and implement solutions.
- Soft Skills: Strong communication and collaboration skills to interact with technical and non-technical stakeholders.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Location: 100% Remote
Duration: 6-month contract-to-hire
Job Description
- We are seeking a highly skilled Snowflake Data Engineer with expertise in Matillion ETL to join our client's dynamic data engineering team.
- The ideal candidate will play a crucial role in designing, building, and maintaining scalable and high-performance data pipelines within the Snowflake ecosystem using Matillion.
- This role requires expertise in data modeling, data transformation, and cloud technologies while focusing on ensuring quality, performance, and accuracy for data-driven decision-making.