Principal Data Architect & ETL Engineering Lead

Posted 1 day ago by 1763450830

Negotiable
Outside
Remote
USA

Summary: The Principal Data Architecture and ETL/ELT Lead is a hands-on technical role focused on designing and modernizing ETL/ELT processes to support enterprise analytics and business intelligence. This position requires a seasoned data engineer and architect who can build scalable, cloud-based data platforms while aligning data architecture with business objectives. The ideal candidate will lead the transition from legacy systems to modern data solutions and mentor engineering teams in best practices. This role is fully remote and requires expertise in cloud platforms such as Snowflake and AWS, along with modern data integration strategies.

Key Responsibilities:

  • Partner with senior leadership to translate business objectives into scalable data architecture strategies and align technology priorities with analytics and operational goals.
  • Architect and implement modern, cloud-native data pipelines (batch and streaming) using Snowflake, AWS, Informatica Cloud, and Python.
  • Lead the modernization of ETL/ELT workflows, guiding the transition from Informatica to dbt, Dagster, and Airflow for flexible, version-controlled data transformation and orchestration.
  • Design and maintain enterprise data models and domain architectures, integrating diverse systems such as Salesforce, NetSuite, and SAP into a unified, governed data platform.
  • Develop and maintain API-driven integrations, including REST/SOAP and event-driven architectures, to enable seamless data flow across applications.
  • Establish and enforce data engineering best practices including testing, CI/CD, monitoring, and performance optimization.
  • Collaborate closely with data engineers, analysts, and product teams to translate requirements into scalable, automated technical solutions.
  • Mentor and coach data engineers, fostering strong coding, documentation, and operational standards.
  • Champion data governance, lineage, and observability, ensuring all data flows are reliable, auditable, and well-documented.

Key Skills:

  • 10+ years of hands-on experience in data architecture, data engineering, and ETL/ELT development at enterprise scale.
  • Proven expertise with Snowflake, AWS (Lambda, Glue, S3, Kinesis), and Informatica Cloud.
  • Strong proficiency in SQL and Python for automation, transformation, and data quality validation.
  • Experience integrating platforms such as Salesforce, SAP Commerce Cloud, NetSuite, and third-party APIs into centralized data warehouses.
  • Familiarity with modern data transformation and orchestration tools such as dbt, Dagster, and Apache Airflow.
  • Deep understanding of data governance, semantic modeling, and metadata management (e.g., Collibra, Alation).
  • Track record of leading architecture and engineering delivery end-to-end from concept through production.
  • Bachelor's degree in Computer Science, Engineering, Mathematics, or related discipline.
  • Snowflake SnowPro Core or Data Engineer certification preferred.

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:
  • Contract-to-hire opportunity
  • No sponsorship is offered at this time
  • REMOTE, working EST hours

Position Summary

We're seeking a highly technical, hands-on Principal Data Architecture and ETL/ELT Lead to drive the design, development, and modernization of ETL/ELT processes in support of enterprise analytics and business intelligence. This role is ideal for a seasoned data engineer and architect who enjoys building scalable, cloud-based data platforms and guiding the evolution of data integration strategy.

You'll play a pivotal role in aligning enterprise data architecture with business objectives: designing robust data pipelines, optimizing data flow, and leading the transition from legacy systems to modern, code-driven data solutions. The ideal candidate is both an architect and a builder, comfortable writing code, mentoring engineers, and implementing best-in-class data practices.

Preferred Attributes

  • Passion for working with data at scale and enabling teams to deliver trustworthy, analytics-ready data.
  • Ability to thrive in a hybrid strategic/technical role that balances hands-on engineering with architectural leadership.
  • Experience building CI/CD pipelines, version-controlled transformations, and testable data workflows using dbt and GitHub Actions (or similar tools).
  • Strong communicator, capable of explaining complex technical designs to both technical and non-technical stakeholders.