Senior SAS Databricks Migration Engineer

Posted 2 days ago

Rate: Negotiable
IR35 Status: Outside
Working Arrangements: Remote
Country: USA

Summary: Cycle3 IT Staffing is looking for a Senior SAS Databricks Migration Engineer to manage the migration of SAS assets to Databricks on Delta Lake. The role involves end-to-end migration responsibilities, including code conversion, pipeline building, and performance tuning, while collaborating with various teams. The ideal candidate should have extensive experience in data engineering and Databricks, along with strong communication skills. This position is remote and classified as outside IR35.

Key Responsibilities:

  • Own end-to-end migrations from SAS to Databricks on Delta Lake.
  • Inventory SAS jobs, map dependencies, and define wave plans.
  • Rewrite DATA steps to PySpark/DataFrames and PROC SQL to Databricks SQL.
  • Build DLT/Jobs pipelines and workflows with retries and alerts.
  • Stand up Delta Lake tables with partitioning strategies.
  • Implement Unity Catalog for governance and auditability.
  • Tune performance and track costs with usage dashboards.
  • Design parity tests and coordinate BI/feeds switchovers.
  • Create runbooks and mentor client teams.
  • Assist in technical pre-sales scoping for SAS migrations.

Key Skills:

  • 8+ years in data engineering; 5+ years hands-on SAS experience.
  • 4+ years Databricks experience with proven SAS-to-Spark/SQL conversions.
  • Strong Delta Lake fundamentals and Unity Catalog knowledge.
  • Experience building DLT or Jobs-based pipelines and workflows.
  • Cloud experience, preferably in Azure.
  • CI/CD with Git and familiarity with testing frameworks.
  • Excellent communication skills and experience running workshops.

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Cycle3 IT Staffing is seeking a Senior SAS Databricks Migration Engineer

The Role
Own end-to-end migrations from SAS (Base/PROC/DI/Grid/VA) to Databricks on Delta Lake. You'll inventory legacy assets, convert code and jobs, harden performance, and lead cutovers, working closely with data, platform, and business teams.
What You'll Do
Discovery & Planning: Inventory SAS jobs, DATA steps, PROC SQL, macros, schedules, libraries; map dependencies and SLAs; define wave plans.
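
A minimal inventory sketch in Python, assuming the SAS code lives in a local checkout; the ./sas_jobs path and the regexes are illustrative, not a complete SAS parser:

    import re
    from pathlib import Path

    # Rough per-file tallies of SAS constructs; a starting point for
    # wave planning and dependency mapping, not a full parser.
    PATTERNS = {
        "data_steps": re.compile(r"^\s*data\s+\w", re.I | re.M),
        "proc_sql":   re.compile(r"^\s*proc\s+sql", re.I | re.M),
        "macros":     re.compile(r"%macro\s+(\w+)", re.I),
        "includes":   re.compile(r"%include\s+['\"]([^'\"]+)['\"]", re.I),
    }

    def inventory(repo_root: str) -> dict:
        report = {}
        for path in Path(repo_root).rglob("*.sas"):
            src = path.read_text(errors="ignore")
            report[str(path)] = {name: len(rx.findall(src))
                                 for name, rx in PATTERNS.items()}
        return report

    if __name__ == "__main__":
        for f, counts in inventory("./sas_jobs").items():  # hypothetical repo path
            print(f, counts)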

Code Conversion: Rewrite DATA steps to PySpark/DataFrames, PROC SQL to Databricks SQL; replace macros with parameterized notebooks/config.
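
As a hedged illustration of the conversion pattern (all table and column names here are hypothetical), a SAS DATA step with a WHERE clause and a derived column maps directly onto the DataFrame API, and a PROC SQL aggregate becomes a groupBy/agg or equivalent Databricks SQL:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    # SAS original:
    #   data work.claims_clean;
    #     set raw.claims;
    #     where status = 'OPEN';
    #     paid_ratio = paid_amt / billed_amt;
    #   run;
    claims_clean = (
        spark.read.table("raw.claims")                 # hypothetical source table
        .filter(F.col("status") == "OPEN")
        .withColumn("paid_ratio", F.col("paid_amt") / F.col("billed_amt"))
    )

    # PROC SQL GROUP BY equivalent, persisted as a Delta table.
    (claims_clean
        .groupBy("provider_id")
        .agg(F.sum("paid_amt").alias("total_paid"))
        .write.mode("overwrite")
        .saveAsTable("silver.claims_summary"))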

Pipelines & Orchestration: Build DLT/Jobs pipelines (Bronze/Silver/Gold), Autoloader, and Workflows with retries, alerts, and SLA metrics.
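
A minimal DLT sketch, which runs only inside a Delta Live Tables pipeline (the landing path and expectation rule are placeholders), showing Auto Loader ingestion into Bronze and an expectation-gated Silver table:

    import dlt
    from pyspark.sql import functions as F

    @dlt.table(comment="Bronze: raw JSON files ingested incrementally via Auto Loader")
    def claims_bronze():
        return (
            spark.readStream.format("cloudFiles")   # `spark` is provided by the pipeline runtime
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/claims")            # hypothetical landing path
        )

    @dlt.table(comment="Silver: validated, timestamped records")
    @dlt.expect_or_drop("valid_amount", "billed_amt > 0")  # rows failing this check are dropped
    def claims_silver():
        return dlt.read_stream("claims_bronze").withColumn(
            "ingested_at", F.current_timestamp()
        )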

Data Modeling & Storage: Stand up Delta Lake tables with OPTIMIZE/Z-ORDER, partitioning strategy, expectations/constraints.
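
For instance (schema and names hypothetical), a partitioned Delta table with a CHECK constraint, compacted and co-located on a frequent filter column:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    spark.sql("""
        CREATE TABLE IF NOT EXISTS silver.claims (
            claim_id STRING, provider_id STRING,
            service_date DATE, paid_amt DOUBLE
        ) USING DELTA
        PARTITIONED BY (service_date)
    """)
    # Enforce a data expectation at the storage layer.
    spark.sql("ALTER TABLE silver.claims ADD CONSTRAINT paid_nonneg CHECK (paid_amt >= 0)")
    # Compact small files and co-locate rows on a common filter/join key.
    spark.sql("OPTIMIZE silver.claims ZORDER BY (provider_id)")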

Governance: Implement Unity Catalog (RBAC, tags, masking, lineage), secrets/key management, and auditability.
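
A sketch of the Unity Catalog pieces, assuming a UC-enabled workspace (group, function, and table names are placeholders): SQL grants for RBAC plus a column-mask function for PII:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Role-based access via SQL grants.
    spark.sql("GRANT SELECT ON TABLE silver.claims TO `analysts`")

    # Mask SSNs for everyone outside a privileged group.
    spark.sql("""
        CREATE OR REPLACE FUNCTION main.governance.mask_ssn(ssn STRING)
        RETURNS STRING
        RETURN CASE WHEN is_account_group_member('pii_readers')
                    THEN ssn ELSE '***-**-****' END
    """)
    spark.sql("ALTER TABLE silver.patients ALTER COLUMN ssn SET MASK main.governance.mask_ssn")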

Performance & Cost: Tune joins, caching, Photon, cluster sizing; track cost with tags and usage dashboards.
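
Two of the cheapest wins, as a sketch (table names hypothetical): enabling adaptive query execution and broadcasting the small side of a join so the large fact table is never shuffled:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()
    # Let AQE coalesce shuffle partitions and mitigate skewed joins at runtime.
    spark.conf.set("spark.sql.adaptive.enabled", "true")
    spark.conf.set("spark.sql.adaptive.skewJoin.enabled", "true")

    facts = spark.read.table("silver.claims")      # large fact table
    dims  = spark.read.table("silver.providers")   # small dimension table
    # Broadcasting the dimension avoids a shuffle on the fact side.
    joined = facts.join(F.broadcast(dims), "provider_id")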

Validation & Cutover: Design parity tests (row counts, checksums, KPIs), dual-run, reconcile, and coordinate BI/feeds switchovers.
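
A minimal parity check, assuming both the legacy extract and the migrated copy are readable as tables (names are illustrative); the hash-based checksum catches silent value drift that row counts alone would miss:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.getOrCreate()

    def parity_report(legacy_table, migrated_table, key, amount_col):
        legacy = spark.read.table(legacy_table)
        migrated = spark.read.table(migrated_table)

        def checksum(df):
            # Order-independent aggregate hash over key + amount columns.
            return df.select(F.sum(F.xxhash64(F.col(key), F.col(amount_col)))).first()[0]

        return {
            "rows": (legacy.count(), migrated.count()),
            "checksums": (checksum(legacy), checksum(migrated)),
        }

    print(parity_report("legacy.claims", "silver.claims", "claim_id", "paid_amt"))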

Enablement: Create runbooks, code patterns, and handoff docs; mentor client teams.

Pre-Sales: Assist with technical pre-sales scoping for SAS-to-Databricks migrations.

Must-Have Qualifications
8+ years in data engineering; 5+ years hands-on SAS (DATA step, PROC SQL, macros; DI/VA a plus).

4+ years Databricks (PySpark & SQL) with proven SAS-to-Spark/SQL conversions in production.

Strong Delta Lake fundamentals (ACID, OPTIMIZE/VACUUM, schema evolution) and Unity Catalog.

Experience building DLT or Jobs-based pipelines, Workflows, widgets/parameters, and secrets.

Cloud experience (Azure preferred; AWS/Google Cloud Platform acceptable): storage, networking basics, IAM.

CI/CD with Git; code reviews; testing frameworks (pytest, Great Expectations/Delta Expectations); see the pytest sketch after this list.

Excellent communication with business/SMEs; comfortable running workshops and cutovers.
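
A minimal pytest sketch for unit-testing a DataFrame transform, assuming pyspark is installed locally (the transform and fixture are illustrative):

    import pytest
    from pyspark.sql import SparkSession, functions as F

    @pytest.fixture(scope="session")
    def spark():
        # Small local session; enough to test transforms without a cluster.
        return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()

    def add_paid_ratio(df):
        return df.withColumn("paid_ratio", F.col("paid_amt") / F.col("billed_amt"))

    def test_paid_ratio(spark):
        df = spark.createDataFrame([(100.0, 200.0)], ["paid_amt", "billed_amt"])
        assert add_paid_ratio(df).first()["paid_ratio"] == pytest.approx(0.5)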

Nice to Have
SAS Grid/DI Studio/VA migrations; scheduler migrations (Stonebranch/Airflow/ADF Workflows).

Data quality frameworks, observability/logging; Databricks Asset Bundles or dbx.

Domain experience in Healthcare, Financial Services, or Manufacturing.

BI integrations (Tableau/Power BI), CDC patterns, and performance benchmarking.