Negotiable
Undetermined
Remote
Summary: The Snowflake Matillion Data Engineer role focuses on designing and developing data pipelines using Matillion ETL for Snowflake. The position requires building and optimizing Snowflake objects while implementing ELT best practices and ensuring data quality and security. Working with BI tools and collaborating with DevOps on CI/CD automation are also key aspects of the role.
Key Responsibilities:
- Design and develop data pipelines using Matillion ETL for Snowflake
- Build and optimize Snowflake objects (tables, views, schemas, stages)
- Implement ELT best practices leveraging Snowflake's architecture
- Write efficient SQL, including CTEs, window functions, and optimizations
- Manage data ingestion from multiple sources (S3, Azure Blob, APIs, RDBMS)
- Optimize performance using clustering, partitioning, and query tuning
- Implement data quality, validation, and monitoring frameworks
- Work with BI tools (Tableau, Power BI, Looker) to support analytics use cases
- Collaborate with DevOps for CI/CD and environment automation
- Ensure data security, governance, and compliance
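The SQL patterns named in the responsibilities above (CTEs and window functions) can be illustrated with a minimal sketch. This example runs against an in-memory SQLite database rather than Snowflake, and the table and column names are illustrative, not taken from the posting.

```python
import sqlite3

# Illustrative data: an in-memory table of orders by region.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, amount INTEGER);
    INSERT INTO orders VALUES
        ('north', 100), ('north', 250), ('south', 75), ('south', 300);
""")

# A CTE filters to large orders; RANK() is a window function that
# ranks amounts within each region (PARTITION BY region).
query = """
    WITH large_orders AS (
        SELECT region, amount FROM orders WHERE amount >= 100
    )
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM large_orders
    ORDER BY region, rnk;
"""
for row in conn.execute(query):
    print(row)
```

The same CTE-plus-window-function shape carries over directly to Snowflake SQL, which supports both constructs natively.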
Key Skills:
- Experience with Matillion ETL
- Proficiency in Snowflake architecture and objects
- Strong SQL skills, including CTEs and window functions
- Knowledge of data ingestion techniques from various sources
- Experience in performance optimization strategies
- Familiarity with data quality and monitoring frameworks
- Experience with BI tools like Tableau, Power BI, or Looker
- Understanding of CI/CD processes
- Knowledge of data security and governance practices
Salary (Rate): £56,000 yearly
City: undetermined
Country: undetermined
Working Arrangements: remote
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT