Technology Architect (Cloud Integration - Azure Data Factory)

Posted Today by eTeam Workforce Limited

Negotiable | Inside IR35 | Hybrid | London, UK

Summary: The Technology Architect role focuses on cloud integration using Azure Data Factory, requiring strong data engineering skills, particularly in PySpark and Databricks. The position is based in London with a hybrid working arrangement, involving 12 days in the office per month. The contract is for a duration of 6 months and falls under inside IR35 regulations.

Key Responsibilities:

  • Develop and manage data pipelines and orchestration using Azure Data Factory.
  • Implement data transformation processes and Medallion architecture with Databricks.
  • Conduct end-to-end data engineering, including conceptual diagrams and models.
  • Perform data modelling, including dimensional modelling and schema design.
  • Utilize CI/CD practices with GitHub Actions/Workflows.
  • Document source-to-target mapping and performance tuning during data transformation.
  • Engage in requirement gathering and communicate effectively with clients and engineering teams.
  • Collaborate with cross-functional teams to solve problems and manage technical projects.

Key Skills:

  • Strong hands-on experience with PySpark.
  • Proficiency in Azure Data Factory (ADF) and Databricks.
  • Experience in data engineering and Medallion architecture using Azure services.
  • Knowledge of data modelling techniques and tools like Erwin.
  • Familiarity with CI/CD processes and tools.
  • Proficient in Python, JSON, and YAML for configuration and pipelines.
  • Ability to write KQL queries and perform Log Analytics (nice-to-have).
  • Strong client-facing communication and problem-solving skills.

Salary (Rate): £387/day

City: London

Country: UK

Working Arrangements: hybrid

IR35 Status: inside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Role: Technology Architect (Cloud Integration - Azure Data Factory)
Location: London (hybrid, 12 days in office per month)
Rate: £387/day (inside IR35)
Duration: 6 months

Must-Have Skills:

  • PySpark - Strong hands-on data engineering experience
  • Azure Data Factory (ADF) - Pipelines, orchestration
  • Databricks - Data transformation & Medallion architecture

Technical Skills & Experience (10-15 Years):

  • Data Engineering (end-to-end, including conceptual diagrams, models)
  • Medallion Architecture using Azure services
  • Data modelling: dimensional modelling, normalization, data harmonization, schema design
  • Erwin for data modelling
  • CI/CD (GitHub Actions/Workflows)
  • Python, JSON, YAML for config & pipelines
  • Performance tuning during transformation
  • Source-to-target mapping documentation
  • KQL queries & Log Analytics (nice-to-have)

Soft Skills:

  • Strong client-facing communication
  • Requirement gathering & explanation to engineering teams
  • Problem-solving & technical project management
  • Collaboration with cross-functional teams