Data Engineer - Remote / Telecommute

Posted 1 day ago

Negotiable
Outside IR35
Remote
USA

Summary: The Data Engineer role focuses on advanced technical skills in SQL, Python, and Airflow to manage data pipelines and ensure platform stability. The position involves migrating legacy systems to modern platforms, debugging upstream dependencies, and collaborating across teams to maintain data governance and infrastructure hygiene. The role is fully remote and classified as outside IR35.

Key Responsibilities:

  • Perform advanced SQL work, including table management and data querying.
  • Develop Python scripts for automation and ETL workflows.
  • Create and manage Airflow DAGs and their dependencies (see the pipeline sketch after this list).
  • Implement version control and CI/CD practices using Git.
  • Configure monitoring and observability tools.
  • Conduct root cause analysis for alert triggers.
  • Migrate legacy platforms to modern solutions.
  • Debug upstream dependencies and resolve data source failures.
  • Document processes and transfer pipeline ownership.
  • Collaborate with cross-functional teams on data governance.
  • Maintain data infrastructure hygiene and perform table deprecation.
  • Consolidate alerts to streamline monitoring processes.
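
To give a flavour of the Airflow and Python work listed above, here is a minimal sketch of a DAG with explicit extract-transform-load dependencies. It assumes Airflow 2.4+ (TaskFlow API and the schedule argument); the DAG name, schedule, and sample data are illustrative placeholders, not details from this posting.

    # Minimal, illustrative Airflow DAG: extract -> transform -> load with
    # explicit task dependencies. Names, schedule, and data are placeholders.
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(
        dag_id="example_orders_etl",   # hypothetical DAG name
        schedule="@daily",             # assumes Airflow 2.4+; older versions use schedule_interval
        start_date=datetime(2024, 1, 1),
        catchup=False,
        tags=["example", "etl"],
    )
    def example_orders_etl():
        @task
        def extract() -> list[dict]:
            # Stand-in for a pull from an upstream source (API, raw table, etc.).
            return [{"order_id": 1, "amount": "19.99"}, {"order_id": 2, "amount": "bad"}]

        @task
        def transform(rows: list[dict]) -> list[dict]:
            # Cast amounts to float and drop malformed rows.
            cleaned = []
            for row in rows:
                try:
                    cleaned.append({"order_id": row["order_id"], "amount": float(row["amount"])})
                except (KeyError, ValueError):
                    continue
            return cleaned

        @task
        def load(rows: list[dict]) -> None:
            # Placeholder for a warehouse write (e.g. via a Snowflake hook).
            print(f"Would load {len(rows)} rows")

        load(transform(extract()))


    example_orders_etl()

Chaining the TaskFlow calls (load(transform(extract()))) is what encodes the dependency graph; the same wiring with classic operators would use the >> operator.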

Key Skills:

  • Advanced SQL (Snowflake, Databricks)
  • Python scripting
  • Airflow management
  • Version Control & CI/CD (Git)
  • Monitoring & Observability tools
  • Monte Carlo (MC) configuration
  • Root Cause Analysis
  • Technical Documentation skills
  • Cross-team collaboration
  • Data Governance awareness
  • Data Infrastructure hygiene

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:
Job Description:
Core Technical Skills:
  • Advanced SQL (Snowflake, Databricks): Table management, deprecation, data querying.
  • Python: Scripting for automation, ETL workflows, alert tooling.
  • Airflow: DAG creation, dependency management, alert tuning.
  • Version Control & CI/CD: Git, deployment pipelines, code reviews.
Monitoring & Observability:
  • Monte Carlo (MC): Alert configuration, suppression, false positive reduction.
  • Observability Tooling: Integration with Airflow, Datadog, or similar tools.
  • Root Cause Analysis: Debugging alert triggers and noisy pipelines (see the alert-deduplication sketch after this list).
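
As one way to reduce alert noise of the kind described above, the sketch below attaches a deduplicating on_failure_callback to Airflow tasks so a flapping pipeline does not notify repeatedly. The notify() helper, the 30-minute window, and the in-memory cache are assumptions for illustration; a production version would use a shared store and a real notification channel.

    # Sketch of alert tuning with an Airflow on_failure_callback: suppress repeat
    # notifications for the same task within a short window. The notify() helper
    # and the in-memory cache are illustrative only (state is per-process).
    from datetime import datetime, timedelta

    _last_alerted: dict[str, datetime] = {}
    SUPPRESSION_WINDOW = timedelta(minutes=30)  # assumed dedup window


    def notify(message: str) -> None:
        # Placeholder: in practice this would post to Slack, PagerDuty, Datadog, etc.
        print(f"ALERT: {message}")


    def dedup_failure_alert(context: dict) -> None:
        """Failure callback; Airflow supplies the `context` dict."""
        ti = context["task_instance"]
        key = f"{ti.dag_id}.{ti.task_id}"
        now = datetime.utcnow()
        last = _last_alerted.get(key)
        if last is not None and now - last < SUPPRESSION_WINDOW:
            return  # duplicate within the window; stay quiet
        _last_alerted[key] = now
        notify(f"{key} failed (run {context.get('run_id')})")


    # Wire it up via default_args on a DAG, e.g.:
    # default_args = {"on_failure_callback": dedup_failure_alert}
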
Platform Migration And Pipeline Stability:
  • Legacy-to-Modern Platform Migration: e.g., RDE Alchemist or Data Infra.
  • Upstream Dependency Debugging: Identify and resolve R+ or external data source failures (see the freshness-check sketch after this list).
  • Pipeline Ownership Handoff: Documentation and transfer of Gold and People Analytics pipelines.
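
A common first step when debugging the upstream failures mentioned above is a freshness check against the source before downstream work runs. The sketch below uses the Snowflake Python connector; the connection parameters, table, column, and six-hour SLA are assumptions, not details from this posting.

    # Illustrative upstream-freshness check: read the latest load timestamp from
    # an assumed upstream table and fail fast if it is stale. Connection details,
    # table, column, and the SLA threshold are placeholders.
    import os
    from datetime import datetime, timedelta, timezone

    import snowflake.connector  # pip install snowflake-connector-python

    MAX_STALENESS = timedelta(hours=6)  # assumed freshness SLA


    def upstream_is_fresh() -> bool:
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="ANALYTICS_WH",  # hypothetical warehouse
        )
        try:
            cur = conn.cursor()
            cur.execute("SELECT MAX(loaded_at) FROM raw.sales.orders")  # hypothetical table/column
            (latest,) = cur.fetchone()
        finally:
            conn.close()

        if latest is None:
            return False
        if latest.tzinfo is None:
            latest = latest.replace(tzinfo=timezone.utc)  # assume UTC if the column is TIMESTAMP_NTZ
        return datetime.now(timezone.utc) - latest < MAX_STALENESS


    if __name__ == "__main__":
        if not upstream_is_fresh():
            raise SystemExit("Upstream source looks stale; halting the downstream run.")

The same check can sit at the top of a DAG as a sensor-style task so downstream tasks are skipped rather than failing noisily.
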
Process And Documentation:
  • Technical Documentation: Wikis, runbooks, alert resolution docs.
  • Cross-team Collaboration: Working with FDE, Data Infra, Storefront, CX, GCO.
  • Data Governance Awareness: Ownership models, process compliance, alert accountability.
Data Infrastructure Hygiene:
  • Table Deprecation & Cleanup: Snowflake, Salesforce, unused pipelines (see the dry-run sketch after this list).
  • Alert Consolidation: Eliminate redundant monitors and streamline alerting logic.
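
For the table deprecation and cleanup work above, a cautious pattern is a dry run that lists candidate tables and prints the DDL for human review rather than executing it. The sketch below uses Snowflake's INFORMATION_SCHEMA with LAST_ALTERED as a rough staleness proxy; the database, schema, and 180-day threshold are assumptions, and a real cleanup would also check access history and ownership first.

    # Table-deprecation dry run: list tables in an assumed schema that have not
    # been altered recently and print (never execute) the cleanup DDL.
    # LAST_ALTERED is a crude proxy for usage; confirm access history and owners
    # before actually dropping anything.
    import os

    import snowflake.connector  # pip install snowflake-connector-python

    STALE_DAYS = 180  # assumed deprecation threshold
    QUERY = """
        SELECT table_schema, table_name, last_altered
        FROM analytics.information_schema.tables       -- hypothetical database
        WHERE table_schema = 'LEGACY'                   -- hypothetical schema
          AND table_type = 'BASE TABLE'
          AND last_altered < DATEADD(day, -%(stale_days)s, CURRENT_TIMESTAMP())
        ORDER BY last_altered
    """


    def main() -> None:
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
        )
        try:
            cur = conn.cursor()
            cur.execute(QUERY, {"stale_days": STALE_DAYS})
            for schema, table, last_altered in cur.fetchall():
                print(f"-- {schema}.{table} last altered {last_altered}")
                print(f"DROP TABLE IF EXISTS ANALYTICS.{schema}.{table};")
        finally:
            conn.close()


    if __name__ == "__main__":
        main()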