Rate: Negotiable
Location: Cambridge, England, United Kingdom
Summary: The Data Warehouse Specialist role requires a candidate with over 5 years of experience in Snowflake, DBT, Python, and AWS to develop and optimize ETL/ELT pipelines. The position involves designing data warehouse architecture, implementing data transformation workflows, and ensuring data quality and reliability. Collaboration with data engineers, analysts, and stakeholders is essential to meet data needs effectively. Certifications in AWS, Snowflake, or DBT are advantageous.
Key Responsibilities:
- Design, build, and optimize ETL/ELT pipelines using DBT and Snowflake.
- Implement data transformation workflows using DBT (Core/Cloud).
- Create automation scripts and optimize data processing tasks using Python.
- Troubleshoot and optimize DBT models and Snowflake performance.
- Ensure data quality, reliability, and consistency across different environments.
- Collaborate with data engineers, data analysts, and business stakeholders to understand data needs.
- Utilize CI/CD and version control (Git) tools.
- Work independently in an agile development environment.
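To illustrate the data-quality responsibility above, here is a minimal, purely illustrative Python sketch of the kind of validation an automation script for this role might run after a pipeline load. The function names, column names, and sample rows are hypothetical and not part of the posting.

```python
# Hypothetical post-load data-quality checks: verify that a target table
# received the same number of rows as its source, and that a required
# column contains no NULL (None) values. Rows are modeled as dicts here
# purely for illustration.

def row_count_matches(source_rows, target_rows):
    """Return True when source and target row counts agree."""
    return len(source_rows) == len(target_rows)

def null_free(rows, column):
    """Return True when no row is missing a value in the given column."""
    return all(row.get(column) is not None for row in rows)

source = [{"id": 1, "email": "a@example.com"},
          {"id": 2, "email": "b@example.com"}]
target = [{"id": 1, "email": "a@example.com"},
          {"id": 2, "email": "b@example.com"}]

print(row_count_matches(source, target))  # True
print(null_free(target, "email"))         # True
```

In practice, checks like these would typically be expressed as DBT tests or run against Snowflake via a connector rather than on in-memory rows; the sketch only shows the shape of the logic.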
Key Skills:
- Strong experience with Snowflake, DBT, Python, and AWS.
- Proficiency in SQL performance tuning and query optimization.
- Experience with orchestration tools such as Airflow.
- Strong analytical and problem-solving skills.
- Knowledge of CI/CD and version control (Git) tools.
- Certification in AWS, Snowflake, or DBT is a plus.
Salary (Rate): Negotiable
City: Cambridge
Country: United Kingdom
Working Arrangements: undetermined
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT