Data Engineer (Databricks)

Posted 1 week ago by Infoplus Technologies UK Ltd

Negotiable
Leeds, UK

Summary: The role of Data Engineer (Databricks) focuses on the administration and optimization of the Databricks platform, specifically within GCP/AWS environments. The position requires extensive experience in data engineering and operational support, including platform configuration, resource monitoring, and collaboration with various teams. The candidate will be responsible for ensuring optimal performance and security of the Databricks environment while managing access controls and data schemas. A strong background in Databricks and cloud platform administration is essential for success in this role.

Key Responsibilities:

  • Responsible for the administration, configuration, and optimization of the Databricks platform to enable data analytics, machine learning, and data engineering activities within the organization.
  • Collaborate with the data engineering team to ingest, transform, and orchestrate data.
  • Manage privileges across the entire Databricks account, as well as at the workspace, Unity Catalog, and SQL warehouse levels.
  • Create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions.
  • Install, configure, and maintain Databricks clusters and workspaces.
  • Keep the platform current with security, compliance, and patching best practices.
  • Monitor and manage cluster performance, resource utilization, and platform costs; troubleshoot issues to ensure optimal performance.
  • Implement and manage access controls and security policies to protect sensitive data.
  • Manage schemas and data with Unity Catalog: create and configure catalogs, external storage locations, and access permissions (a brief illustrative sketch follows this list).
  • Administer interfaces with Google Cloud Platform.
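
A minimal sketch of the kind of Unity Catalog work described above, assuming a Databricks notebook where `spark` is already provided by the runtime; the catalog, schema, storage path, and group names are invented placeholders, not part of the employer's specification:

    # Hypothetical Unity Catalog administration sketch.
    # Assumes a Databricks notebook where `spark` is provided by the runtime;
    # raw_catalog, finance, the GCS path, and `analysts` are placeholder names.

    # Create a catalog and a schema whose managed data lives under a specific
    # bucket path (the path must already be registered as an external location).
    spark.sql("CREATE CATALOG IF NOT EXISTS raw_catalog")
    spark.sql("""
        CREATE SCHEMA IF NOT EXISTS raw_catalog.finance
        MANAGED LOCATION 'gs://example-bucket/uc/finance'
    """)

    # Grant read-only access on the schema to an analyst group.
    spark.sql("GRANT USE CATALOG ON CATALOG raw_catalog TO `analysts`")
    spark.sql("GRANT USE SCHEMA ON SCHEMA raw_catalog.finance TO `analysts`")
    spark.sql("GRANT SELECT ON SCHEMA raw_catalog.finance TO `analysts`")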

Key Skills:

  • 3+ years of production support of the Databricks platform
  • 2+ years of experience with AWS/Azure/GCP PaaS administration
  • 2+ years of experience in automation frameworks such as Terraform

Salary (Rate): undetermined

City: Leeds

Country: UK

Working Arrangements: undetermined

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Summary:

Primary skill - Databricks Admin with GCP/AWS

Mandatory Skills: Databricks - Data Engineering

Experience: 8-10 Years

  • Responsibilities include designing, implementing, and maintaining the Databricks platform, as well as providing operational support. Operational support covers platform set-up and configuration, workspace administration, resource monitoring, technical support for the data engineering, data science/ML, and application/integration teams, restores and recoveries (a brief sketch follows this list), troubleshooting service issues, determining root causes, and resolving them.
  • The position also involves security and change management.
  • The position will work closely with the Team Lead, other Databricks Administrators, System Administrators, and Data Engineers/Scientists/Architects/Modelers/Analysts.
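
One common restore path on Databricks is Delta Lake time travel; the sketch below is illustrative only, again assuming a notebook where `spark` is defined, with a placeholder table name and version number:

    # Hypothetical restore/recovery sketch using Delta Lake time travel.
    # Assumes a Databricks notebook where `spark` is provided by the runtime;
    # the table name and version number are placeholders.
    table = "main.finance.transactions"

    # Inspect the table's commit history to find a known-good version.
    spark.sql(f"DESCRIBE HISTORY {table}") \
        .select("version", "timestamp", "operation") \
        .show(truncate=False)

    # Roll the table back to that version.
    spark.sql(f"RESTORE TABLE {table} TO VERSION AS OF 42")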

Responsibilities:

  • Responsible for the administration, configuration, and optimization of the Databricks platform to enable data analytics, machine learning, and data engineering activities within the organization.
  • Collaborate with the data engineering team to ingest, transform, and orchestrate data.
  • Manage privileges across the entire Databricks account, as well as at the workspace, Unity Catalog, and SQL warehouse levels.
  • Create workspaces, configure cloud resources, view usage data, and manage account identities, settings, and subscriptions.
  • Install, configure, and maintain Databricks clusters and workspaces.
  • Keep the platform current with security, compliance, and patching best practices.
  • Monitor and manage cluster performance, resource utilization, and platform costs; troubleshoot issues to ensure optimal performance (a brief monitoring sketch follows this list).
  • Implement and manage access controls and security policies to protect sensitive data.
  • Manage schemas and data with Unity Catalog: create and configure catalogs, external storage locations, and access permissions.
  • Administer interfaces with Google Cloud Platform.
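
The resource-utilization monitoring noted above can be scripted against the workspace API. Below is an illustrative check using the databricks-sdk Python package, assuming credentials are supplied via environment variables or a .databrickscfg profile; the output format is arbitrary:

    # Illustrative cluster check with the databricks-sdk package.
    # Assumes host/token authentication via environment variables or
    # a .databrickscfg profile.
    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()

    # List every cluster in the workspace and print its current state.
    for c in w.clusters.list():
        name = c.cluster_name or "<unnamed>"
        state = c.state.value if c.state else "UNKNOWN"
        print(f"{name:<40} {state:<12} "
              f"workers={c.num_workers} node_type={c.node_type_id}")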

Required Skills:

  • 3+ years of production support of the Databricks platform

Preferred:

  • 2+ years of experience with AWS/Azure/GCP PaaS administration
  • 2+ years of experience in automation frameworks such as Terraform