W2 - Sr Data Engineer (MS Fabric, ETL/ELT Pipeline, SQL, Python, Cloud, Metadata, Healthcare) - Remote

Posted 1 day ago

Negotiable
Outside
Remote
USA

Summary: The role of Sr Data Engineer focuses on designing and implementing a Radiology data warehouse and analytics environment using Microsoft Fabric. The position requires expertise in data architecture, pipeline development, and modeling to support analytics and reporting initiatives. The engineer will work collaboratively with internal teams and mentor junior engineers while ensuring data quality and governance. This is a fully remote position with potential for contract-to-hire opportunities.

Key Responsibilities:

  • Implement data pipelines using best practices for ETL/ELT, data management, and data governance.
  • Analyze and process complex data sources in a fast-paced environment.
  • Perform data modeling against large data sets for peak efficiency.
  • Identify, design, and implement process improvement solutions that automate manual processes.
  • Understand and incorporate data quality principles to ensure optimal reliability and user experience.
  • Partner across teams to support cross-platform operations.
  • Create and document functional and technical specifications.
  • Drive exploration of new features and technologies, providing recommendations to enhance offerings.
  • Mentor junior engineers within the team.

Key Skills:

  • Bachelor's degree in Computer Science, Information Technology, or related field; OR equivalent 5+ years of experience.
  • 5+ years of hands-on experience programming in SQL.
  • 3+ years of experience building and maintaining automated data pipelines using batch and/or streaming processes.
  • Microsoft Fabric proficiency - hands-on experience with Data Warehouse, Data Pipelines (ETL/ELT), OneLake, Real-Time Analytics, and Power BI integration.
  • Strong understanding of dimensional/star schema and semantic model design.
  • Experience orchestrating robust, parameterized, and testable data pipelines.
  • Expertise in query optimization, performance tuning, indexing, and complex data transformations.
  • Proficiency in Python (or similar scripting language) for data wrangling and automation.
  • Proven ability to design scalable cloud data architectures.
  • Experience with metadata, lineage, and governance.
  • Understanding of infrastructure-as-code and CI/CD principles.
  • Experience working with clinical, imaging, or research data in a healthcare setting.
  • Awareness of data privacy, security, and compliance requirements in healthcare.
  • Strong communication skills with both technical and non-technical audiences.
  • Commitment to mentoring and knowledge sharing.
  • Data engineering certifications or cloud certifications are helpful but not required.

Salary (Rate): £48.00 hourly

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Duties:

  • Implement data pipelines using best practices for ETL/ELT, data management, and data governance.
  • Analyze and process complex data sources in a fast-paced environment.
  • Perform data modeling against large data sets for peak efficiency.
  • Identify, design, and implement process improvement solutions that automate manual processes and leverage standard frameworks and methodologies.
  • Understand and incorporate data quality principles that ensure optimal reliability, impact, and user experience.
  • Partner across teams to support cross-platform operations.
  • Create and document functional and technical specifications.
  • Drive exploration of new features, versions, and related technologies, and provide recommendations to enhance our offerings.
  • Mentor junior engineers within the team.

Education: Bachelor's degree in Computer Science, Information Technology, or related field; OR equivalent 5+ years of experience. 5+ years of hands-on experience programming in SQL. 3+ years of experience building and maintaining automated data pipelines and data assets using batch and/or streaming processes.

Schedule Notes:

Scope: We are looking for an individual to join the Radiology Data and Analytics team in designing and implementing a Radiology data warehouse and analytics environment using Microsoft Fabric. Working collaboratively with internal engineers and analysts, the individual will contribute expertise in data architecture, pipeline development, and modeling to integrate institutional and Radiology data sources that support analytics, reporting, and AI initiatives.

Skills/Experience:

  • Microsoft Fabric proficiency - hands-on experience with Data Warehouse, Data Pipelines (ETL/ELT), OneLake, Real-Time Analytics, and Power BI integration.
  • Data modeling - strong understanding of dimensional/star schema and semantic model design to support reporting and analytics use cases.
  • ETL/ELT pipeline development - experience orchestrating robust, parameterized, and testable data pipelines for ingestion, transformation, and curation using Fabric Data Pipelines or equivalent tools.
  • Advanced SQL/T-SQL - expertise in query optimization, performance tuning, indexing, and complex data transformations.
  • Programming/Scripting - proficiency in Python (or similar scripting language) for data wrangling, automation, and integration tasks.
  • Proven ability to design scalable cloud data architectures aligned with institutional standards and security frameworks.
  • Experience with metadata, lineage, and governance.
  • Understanding of infrastructure-as-code and CI/CD principles for data deployment and version control (e.g., Git, Azure DevOps, Terraform).
  • Experience working with clinical, imaging, or research data in a healthcare setting.
  • Awareness of data privacy, security, and compliance requirements in healthcare (HIPAA, PHI handling, de-identification).
  • Demonstrated success working as part of cross-functional teams, including IT, data science, and clinical stakeholders.
  • Strong communication skills with both technical and non-technical audiences.
  • Commitment to mentoring and knowledge sharing, helping build Radiology's internal capabilities in Microsoft Fabric and data engineering.
  • Data engineering certifications or cloud certifications are helpful but not required.

100% remote - equipment will be provided

Radiology has asked that we consider individuals who could readily be hired into a client position if the engagement works out well. A position is expected to open within the next year for which this individual could be a strong candidate. (Potential for contract-to-hire)

Hours Per Day: 8.00

Hours Per Week: 40.00

Pay rate: $60/hr on W2