Freelance - Data Engineer

Posted 2 weeks ago by 1748619914

£70 per hour
Undetermined
Remote
Location: Amsterdam, North Holland, Netherlands

Summary: The role is for a Freelance Data Engineer with approximately 5 years of experience, focusing on GCP, Data Mesh, and BigQuery. The position requires advanced skills in data products and data access management, plus strong communication skills to advocate for data mesh practices. The candidate should also have experience with CI/CD, DataOps, and SQL for pipeline optimisation. The role is based in Amsterdam and offers a competitive hourly rate.

Key Responsibilities:

  • Implement data products and data mesh, including data contracts and access management.
  • Utilize BigQuery for advanced analytics and data access management.
  • Communicate and advocate for data mesh practices among colleagues.
  • Work with GCP tools for data pipelines, including Cloud Run, Workflows, and Dataflow.
  • Manage CI/CD and DataOps processes using GitHub Actions and Terraform.
  • Develop data modeling and transformation pipelines using dbt.
  • Optimize SQL pipelines for performance.
  • Engage in stakeholder management with software engineers and product owners.
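In this context, a data contract is typically a machine-checkable schema agreed between the data product's producer and its consumers. As a minimal illustrative sketch (the contract fields and record shapes below are hypothetical, not taken from this posting), such a check might look like:

```python
# Minimal sketch of a data-contract check for a data product.
# The contract (field names and types) is a hypothetical example.

CONTRACT = {
    "order_id": str,
    "amount_eur": float,
    "created_at": str,  # ISO-8601 timestamp expected
}

def validate_record(record: dict) -> list:
    """Return a list of contract violations for one record (empty = valid)."""
    errors = []
    for field, expected_type in CONTRACT.items():
        if field not in record:
            errors.append("missing field: " + field)
        elif not isinstance(record[field], expected_type):
            errors.append("wrong type for " + field)
    return errors

if __name__ == "__main__":
    good = {"order_id": "A1", "amount_eur": 9.99,
            "created_at": "2024-01-01T00:00:00Z"}
    bad = {"order_id": "A2", "amount_eur": "9.99"}
    print(validate_record(good))  # []
    print(validate_record(bad))
```

In practice such checks would run inside the pipeline (e.g. as dbt tests or a validation step before publishing the product), but the idea is the same: the contract is enforced in code, not just documented.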

Key Skills:

  • 5 years of experience in GCP, Data Mesh, and BigQuery.
  • Advanced knowledge of data products and data access management.
  • Strong communication and interpersonal skills.
  • Experience with CI/CD and DataOps tools.
  • Proficiency in dbt for data modeling.
  • Advanced SQL skills for pipeline optimization.
  • Stakeholder management experience.

Salary (Rate): £70 hourly

City: Amsterdam

Country: Netherlands

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Hi there,

Currently, we are looking for a Freelance Data Engineer with around 5 years of experience in GCP, Data Mesh and BigQuery.

Desired knowledge, experience, and skills:

  • Data products and data mesh knowledge: experience with practical implementation from levels and schemas to data contracts and data access management
  • BigQuery: advanced experience, ideally in-depth Analytics Hub experience for data access management
  • Communication and interpersonal skills to help advocate for the data mesh, but also to guide colleagues technically
  • GCP in general: especially tools used for data pipelines like Cloud Run (jobs), Workflows; Dataflow is a big plus
  • CI/CD and DataOps: GitHub Actions, Observability/Monitoring, IaC with Terraform
  • dbt: for data modelling and transformation pipelines
  • SQL: advanced skills for pipeline optimisation
  • Stakeholder management (software engineers, POs)
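On the SQL optimisation point: a common first step in BigQuery is restricting queries to a partition range so the engine prunes partitions rather than scanning the whole table. A hedged sketch (table and column names are invented for illustration, not part of this role's actual stack):

```python
# Sketch: build a partition-pruned BigQuery query string.
# Table and column names here are illustrative assumptions.
from datetime import date, timedelta

def partition_pruned_query(table: str, days_back: int, as_of: date) -> str:
    """Build a query filtered on the date-partition column.

    Filtering on the partition column lets BigQuery prune partitions,
    which is the usual first lever for cutting scan cost.
    """
    start = as_of - timedelta(days=days_back)
    return (
        "SELECT order_id, amount_eur\n"
        "FROM `" + table + "`\n"
        "WHERE event_date BETWEEN '" + start.isoformat()
        + "' AND '" + as_of.isoformat() + "'"
    )

if __name__ == "__main__":
    print(partition_pruned_query("proj.dataset.orders", 7, date(2024, 1, 8)))
```

The same pattern applies whether the query lives in a dbt model or a Dataflow/Workflows step: keep the partition filter explicit so it is never optimised away.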

Practical information:

Start: ASAP

Hourly rate: €70–85

Location: Amsterdam

Feel free to apply via the options below. For questions: J.vanderheide(@)huxley.nl