Databricks Engineer with Snowflake - contract to hire or FTE - Remote - W2 only

Posted 5 days ago

Summary: The role of Databricks Engineer involves building, configuring, and optimizing data solutions using the Databricks Lakehouse architecture. This position is 100% remote and requires strong technical skills in data engineering, particularly with Databricks, Snowflake, and cloud services. The ideal candidate should possess excellent communication skills and be comfortable collaborating with both technical and non-technical clients. The role may lead to a full-time hire based on performance and fit.

Salary (Rate): $60-$80 per hour (W2), based on experience

City: Columbus, OH

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Hello! I'm searching for an experienced Databricks Engineer for a contract-to-hire or FTE role based in Columbus, OH. This role is 100% remote. Experience with Snowflake, Python, Unity Catalog, SQL, data flows, ETL, and data modeling would be helpful in this role. This person must be a self-starter and a leader, and must be comfortable working with clients, both technical and non-technical. Good communication skills are a must. The hourly pay for this position is $60-$80 (W2), based on experience, and a full-time hire is also a possibility. Job details are below. If you are qualified and interested in learning more, please apply today!

This job is 100% remote. The closer to Eastern Time, the better.

You must be able to work in the United States without sponsorship to be eligible for this job.

You must be able to pass a background check and drug screen to be eligible for this job.

Databricks Engineer
A Databricks Engineer builds, configures, and optimizes scalable, secure, and high-performance data solutions using the Databricks Lakehouse architecture. This role is hands-on, focusing on developing data pipelines, implementing governance and security controls, and ensuring optimal performance across the platform.

Key Responsibilities

  • Development & Implementation: Build and maintain Databricks-based solutions, including lakehouse architecture, ETL/ELT pipelines, and streaming/batch data processing workflows (a simplified pipeline sketch follows this list).
  • Performance Optimization: Configure clusters, tune Spark jobs, apply efficient partitioning strategies, and manage autoscaling to ensure cost and performance efficiency.
  • Data Governance & Security: Apply RBAC, Unity Catalog, data masking/encryption, and audit logging to meet compliance and security requirements.
  • Infrastructure Automation: Use Infrastructure-as-Code tools (Terraform, Bicep, ARM, CloudFormation) to automate Databricks environment setup and deployments.
  • Cloud Integration: Connect Databricks with Azure, AWS, or Google Cloud Platform services (e.g., Data Factory, Synapse, ADLS, S3, BigQuery).
  • Advanced Analytics Enablement: Support data science and analytics teams with MLflow, Databricks SQL, feature store, and AutoML integrations.
  • Collaboration: Work closely with data architects, analysts, and business stakeholders to implement solutions based on defined requirements and designs.
  • Documentation & Support: Maintain technical documentation, troubleshoot platform issues, and contribute to best practices across the engineering team.
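
To give candidates a concrete, deliberately simplified picture of the development and performance-optimization responsibilities above, here is a minimal PySpark/Delta Lake sketch of a batch ELT step that writes a date-partitioned table. The storage path, catalog, schema, and table names are hypothetical placeholders, not this employer's actual objects.

```python
# Illustrative sketch only: a batch ELT step landing raw events into a partitioned
# Delta table. All paths and table names below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

# Read raw JSON landed in cloud object storage (the bucket path is a placeholder).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Light transformation: typed timestamp, a date column for partitioning, and dedup.
silver = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .dropDuplicates(["order_id"])
)

# Write as a Delta table, partitioned by the low-cardinality date column so that
# downstream queries can prune files and keep scan costs predictable.
(
    silver.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .saveAsTable("main.sales.orders_silver")
)
```

Partitioning by a low-cardinality date column, combined with periodic table maintenance (for example Delta's OPTIMIZE), is one common way to keep file sizes and query costs in check on the Lakehouse.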

Required Skills & Experience

  • 5+ years of experience working with enterprise cloud data platforms, including 2+ years of hands-on Databricks engineering work.
  • Strong knowledge of Apache Spark (PySpark), Delta Lake, and Databricks features like Unity Catalog and Workflows (see the access-control sketch after this list).
  • Proficiency in Python, SQL, and/or Scala for data modeling and transformation.
  • Experience with Azure, AWS, or Google Cloud Platform data services, including storage, networking, and security.
  • Familiarity with CI/CD pipelines and orchestration tools (Airflow, ADF, Databricks Workflows).
  • Strong problem-solving skills and ability to work in Agile environments.
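
For the Unity Catalog item above, the following is a minimal, hypothetical sketch (not from this employer's environment) of applying least-privilege RBAC with standard Unity Catalog SQL from a notebook. The catalog, schema, table, and group names are placeholders.

```python
# Minimal sketch of Unity Catalog grants; object and group names are placeholders.
from pyspark.sql import SparkSession

# In a Databricks notebook the `spark` session already exists; getOrCreate() simply
# keeps this snippet self-contained when run as a script.
spark = SparkSession.builder.getOrCreate()

# Least-privilege access for an analyst group: usage on the catalog and schema,
# plus read-only access to a single table.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `analysts`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.sales TO `analysts`")
spark.sql("GRANT SELECT ON TABLE main.sales.orders_silver TO `analysts`")
```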

Preferred Qualifications

  • Certifications such as Databricks Certified Data Engineer Professional, Azure Data Engineer, or AWS/Google Cloud Platform equivalents.
  • Exposure to data mesh principles, data products, or multi-region data deployments.
  • Experience in regulated industries with strict compliance and governance needs.