Data Engineer - (DBT, Snowflake, PL/SQL, Data Modeling, Git, CI/CD, Airflow, dbt Cloud, Banking)

Posted 2 days ago by GIOS Technology

Negotiable
Undetermined
Hybrid
Manchester Area, United Kingdom

Summary: The role of Data Engineer focuses on leveraging expertise in DBT, Snowflake, and PL/SQL to design and maintain data pipelines that support business intelligence and analytics. The position requires collaboration with various stakeholders to ensure data quality and performance optimization. The ideal candidate will contribute to the development of scalable data models and participate in code reviews and architecture discussions. This is a hybrid position based in Manchester, UK.

Key Responsibilities:

  • Design and implement scalable data models and transformation pipelines using DBT on Snowflake.
  • Develop and optimize complex data processes with PL/SQL.
  • Collaborate with analysts, data scientists, and stakeholders to gather and deliver data requirements.
  • Optimize Snowflake performance through tuning, clustering, and resource management.
  • Ensure data quality, integrity, and governance through testing, documentation, and monitoring.
  • Contribute to code reviews, architecture discussions, and continuous improvements.

Key Skills:

  • DBT
  • Snowflake
  • PL/SQL
  • Data Modeling
  • Git
  • CI/CD
  • Airflow
  • dbt Cloud
  • Prefect

Salary (Rate): undetermined

City: Manchester

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

I am hiring for a Data Engineer – DBT, Snowflake, PL/SQL

Location: Manchester, UK (Hybrid)

Job Description

We are seeking an experienced Data Engineer with strong expertise in DBT, Snowflake, and PL/SQL to join our growing data team. The ideal candidate will design, develop, and maintain robust data pipelines to support business intelligence, analytics, and data science initiatives.

The key responsibilities and key skills are as listed in the summary above.