Sr. Databricks Engineer (AWS)

Posted 1 week ago by eTeam

£402 per day
Inside IR35
Working arrangement: Undetermined
Glasgow, Scotland, United Kingdom

Summary: The Sr. Databricks Engineer (AWS) role in Glasgow involves leading the migration of data pipelines from AWS to Databricks. This hands-on position requires designing, building, and optimizing scalable data solutions on the Databricks platform while collaborating with cross-functional teams. The engineer will also ensure data quality, implement best practices for governance and security, and provide technical mentorship to junior engineers.


Salary (Rate): £402/day

City: Glasgow

Country: United Kingdom

Working Arrangements: Undetermined

IR35 Status: Inside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Role Title: Sr. Databricks Engineer (AWS)
Location: Glasgow
Duration: 31/12/2026
Days on site: 2-3
Rate: £402/day via umbrella

Role Description: We are currently migrating our data pipelines from AWS to Databricks and are seeking a Senior Databricks Engineer to lead and contribute to this transformation. This is a hands-on engineering role focused on designing, building, and optimizing scalable data solutions using the Databricks platform.

Key Responsibilities:

  • Lead the migration of existing AWS-based data pipelines to Databricks.
  • Design and implement scalable data engineering solutions using Apache Spark on Databricks.
  • Collaborate with cross-functional teams to understand data requirements and translate them into efficient pipelines.
  • Optimize performance and cost-efficiency of Databricks workloads.
  • Develop and maintain CI/CD workflows for Databricks using GitLab or similar tools.
  • Ensure data quality and reliability through robust unit testing and validation frameworks.
  • Implement best practices for data governance, security, and access control within Databricks.
  • Provide technical mentorship and guidance to junior engineers.

Must-Have Skills:

  • Strong hands-on experience with Databricks and Apache Spark (preferably PySpark).
  • Proven track record of building and optimizing data pipelines in cloud environments.
  • Experience with AWS services such as S3, Glue, Lambda, Step Functions, Athena, IAM, and VPC.
  • Proficiency in Python for data engineering tasks.
  • Familiarity with GitLab for version control and CI/CD.
  • Strong understanding of unit testing and data validation techniques.

Preferred Qualifications:

  • Experience with Databricks Delta Lake, Unity Catalog, and MLflow.
  • Knowledge of CloudFormation or other infrastructure-as-code tools.
  • AWS or Databricks certifications.
  • Experience in large-scale data migration projects.
  • Background in the finance industry.