Senior Data Engineer

Posted 5 days ago by Lorien

Rate: Negotiable
IR35: Inside
Working Arrangements: Undetermined
Location: Glasgow, Scotland, United Kingdom

Summary: The Senior Databricks Engineer role involves leading the migration of data pipelines from AWS to Databricks, focusing on designing and optimizing scalable data solutions. This hands-on position requires collaboration with cross-functional teams to ensure efficient data processing and governance. The engineer will also mentor junior staff and implement best practices for data quality and security. The role is based in Glasgow and requires PAYE through an umbrella company.

Salary (Rate): Undetermined

City: Glasgow

Country: United Kingdom

Working Arrangements: Undetermined

IR35 Status: Inside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Role Title: Senior Databricks Engineer

Location: Glasgow

Duration: Until 31/12/2026

Days on site: 2-3

MUST BE PAYE THROUGH UMBRELLA

Role Description: We are currently migrating our data pipelines from AWS to Databricks, and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform.

Key Responsibilities:

  • Lead the migration of existing AWS-based data pipelines to Databricks.
  • Design and implement scalable data engineering solutions using Apache Spark on Databricks.
  • Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines.
  • Optimize performance and cost-efficiency of Databricks workloads.
  • Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
  • Ensure data quality and reliability through robust unit testing and validation frameworks.
  • Implement best practices for data governance, security, and access control within Databricks.
  • Provide technical mentorship and guidance to junior engineers.

Must-Have Skills:

  • Strong hands-on experience with Databricks and Apache Spark (preferably PySpark).
  • Proven track record of building and optimizing data pipelines in cloud environments.
  • Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC.
  • Proficiency in Python for data engineering tasks.
  • Familiarity with GitLab for version control and CI/CD.
  • Strong understanding of unit testing and data validation techniques.

Preferred Qualifications:

  • Experience with Databricks Delta Lake, Unity Catalog, and MLflow.
  • Knowledge of CloudFormation or other infrastructure-as-code tools.
  • AWS or Databricks certifications.
  • Experience in large-scale data migration projects.
  • Background in the finance industry.