Platform Engineer

Posted 1 day ago by 1770823246

£400 per day
Outside IR35
Remote
England

Summary: The role of Platform Engineer involves collaborating with data engineering and software engineering teams to deliver strategic data solutions while promoting Agile and DevOps practices. The position requires designing data solutions based on product owners' requirements and developing data pipelines and cloud infrastructure. The engineer will also be responsible for educating the business on new data techniques and ensuring the maintainability of data engineering pipelines. This is a remote contract position with an outside IR35 status.


Salary (Rate): £400 daily

City: undetermined

Country: England

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Platform Engineer | Outside IR35 | Remote | Starting ASAP

Day Rate: £400

Harvey Nash's client requires a contract Platform Engineer. You'll work as part of a team within Data Engineering, collaborating closely with software engineering teams and architects to deliver strategic data solutions while driving Agile and DevOps adoption in the delivery of data engineering.

Responsibilities

  • Work closely with product owners to understand their data requirements and design solutions to meet them.
  • Enable product owners and management to derive value through data modelling, sourcing and transformation.
  • Enable product owners to understand the uptake of various products by building reports and visualisations from data.
  • Develop comprehensive knowledge of the data structures and metrics, advocating change where needed.
  • Educate and embed new data techniques into the business through role modelling, training and experiment design oversight.
  • Deliver maintainable data engineering pipelines by following best practices through the designated route-to-live environments.
  • Design, deploy, and manage cloud infrastructure (AWS/Azure/GCP) for data workloads using Terraform or CloudFormation.
  • Build and maintain CI/CD pipelines (GitHub Actions/GitLab CI) to automate the testing and deployment of SQL.

Essential Skills & Experience:

  • Strong expertise with SQL.
  • IaC tools: experience defining infrastructure with Terraform, Ansible, or Pulumi.
  • Ability to design, build, and maintain data pipelines for data ingestion and transformation.
  • Demonstrable experience with Microsoft Azure and Microsoft Fabric (e.g. Data Factory, Azure SQL).
  • Experience with BI tools and data processing frameworks such as Apache Spark, preferably using Python (PySpark) or Scala.
  • Experience working within cloud environments; Azure is preferred, but AWS and GCP experience is also valuable.
  • Knowledge of version control with Git (GitLab, GitHub, Bitbucket, etc.).
  • Deep knowledge of Git flow, CI/CD methodologies, and automation tools (Jenkins, GitHub Actions, etc.).

Desirable Skills/Experience

  • Experience with Automation Testing and Test-Driven Development.
  • Configuration management and containerisation tools (e.g. Docker, Kubernetes).
  • Experience or interest in Data Science.
  • Understanding of Data Quality and Data Governance best practices.
  • Experience with monitoring and logging tools.