Data Engineer - Azure Data Factory & Databricks

Posted 1 day ago by CBS Butler

£440 per day
Inside IR35
Remote
Reading, UK

Summary: The Data Engineer role centres on Azure Data Factory and Databricks, supporting data transformation and migration for a national retail chain. It places a strong emphasis on data integrity and security and on building tools for data processing. The role is fully remote within the UK and runs for an initial three months, with extension highly likely. The successful candidate will join a global IT consultancy and contribute to its digital transformation work.

Key Responsibilities:

  • Focus on data integrity, security, and the development of tools for data transformation and migration.
  • Document data processing logic according to agreed standards, including requirements and ETL logic.
  • Understand business data requirements and convert them into practical solutions for the organisation.

Key Skills:

  • Strong technical ability with databases, data software (ETL packages), and programming languages (SQL and Python).
  • Proficient in understanding business processes to create and execute tool designs.
  • Experience with Azure Data Lake, Azure Data Factory, Azure Databricks, Azure SQL Server, and Azure DevOps/Git.
  • Excellent communication skills for discussing technical principles with various stakeholders.

Salary: £440 daily

City: Reading

Country: UK

Working Arrangements: Remote

IR35 Status: Inside IR35

Seniority Level: Senior

Industry: IT

Detailed Description From Employer:

Data Engineer - Azure Data Factory & Databricks

Rate: £400 - £440 a day (Inside IR35)
Location: Fully remote within the UK
Duration: Initially 3 months with extension highly likely.

You will join a global IT consultancy delivering digital transformation to a national retail chain.

Role description:

  • As a Senior Data Engineer, your primary focus will be the data itself: the platform it is handled on, its integrity, and its security. You will be instrumental in the analysis, design, build, and repeatable testing of tools, frameworks, and deliveries for data transformation, data migration, and other data services artefacts (a minimal sketch of a repeatable, testable transformation follows this list).
  • Document the data processing logic according to agreed documentation standards, including requirements, solutions, reconciliation, execution, reporting, and low-level ETL logic, as well as other surrounding data services artefacts or infrastructure.
  • Fully understand business data requirements and convert them into practical solutions that can be used across the organisation.
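
As flagged in the first bullet above, here is a minimal sketch of what a repeatable, testable transformation might look like. It assumes PySpark, as used on Databricks; the clean_orders function and all table and column names are hypothetical illustrations, not the client's actual schema.

    # Minimal, testable transformation step (hypothetical example).
    from pyspark.sql import DataFrame, SparkSession
    from pyspark.sql import functions as F

    def clean_orders(orders: DataFrame) -> DataFrame:
        """Standardise an orders feed: trim keys, drop rows with no ID,
        and express the amount in pence for reconciliation."""
        return (
            orders
            .withColumn("order_id", F.trim(F.col("order_id")))
            .filter(F.col("order_id") != "")
            .withColumn("amount_pence", (F.col("amount_gbp") * 100).cast("long"))
        )

    # Repeatable test: build a tiny in-memory frame and assert on the
    # output, so the logic can be re-verified on every change.
    if __name__ == "__main__":
        spark = SparkSession.builder.master("local[1]").getOrCreate()
        sample = spark.createDataFrame(
            [(" A1 ", 10.50), ("", 3.00)], ["order_id", "amount_gbp"]
        )
        result = clean_orders(sample).collect()
        assert len(result) == 1 and result[0]["amount_pence"] == 1050
        print("clean_orders OK")

Keeping the logic in small functions like this is what makes the testing repeatable: the same assertions can be re-run locally or in a CI pipeline on every change.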

Key Skills

  • Strong technical ability and an understanding of how databases work, together with the use of data software (ETL packages) and programming languages (usually SQL and Python) to develop tools that extract, transform, and load data, and to document how those tools were built.
  • The ability to understand business processes in detail, capture and create requirements, translate them into a tool design, and execute its build.
  • Experience in some or all of the following: Azure Data Lake, Azure Data Factory, Azure Databricks (inc. Python and SQL), Azure SQL Server, Azure DevOps/Git. A sketch of how these pieces typically fit together follows this list.
  • Excellent communication skills, as you will need to discuss technical principles and business processes in simple language with people at all levels.
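
As an illustration of how the listed services commonly combine, the sketch below shows a Databricks notebook step as it might be invoked from an Azure Data Factory pipeline: ADF passes parameters via widgets, and the notebook lands data from Azure Data Lake into a Delta table. It assumes the Databricks runtime (where spark and dbutils are predefined); the widget names, paths, and table names are illustrative placeholders, not taken from this advert.

    # Hypothetical Databricks notebook step, as invoked from an Azure Data
    # Factory pipeline. `spark` and `dbutils` are provided by the Databricks
    # runtime; every name and path below is an illustrative placeholder.
    from pyspark.sql import functions as F

    # Parameters passed in by the ADF Databricks-notebook activity.
    source_path = dbutils.widgets.get("source_path")    # e.g. an abfss:// path in Azure Data Lake
    target_table = dbutils.widgets.get("target_table")  # e.g. curated.orders

    # Read the raw extract from the lake.
    df = spark.read.format("parquet").load(source_path)

    # Simple, documented transformation: stamp each row for lineage.
    df = df.withColumn("loaded_at", F.current_timestamp())

    # Land the result as a Delta table for downstream consumers.
    df.write.format("delta").mode("overwrite").saveAsTable(target_table)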