Summary: The Principal Data Engineer role involves leading the design and implementation of enterprise-scale data solutions using Snowflake and cloud data platforms such as Azure and Google Cloud. The position requires extensive experience in data engineering, particularly with Snowflake, and includes responsibilities such as mentoring teams, optimizing data processes, and driving cloud migrations. The ideal candidate will possess strong technical skills and the ability to solve complex data challenges. This is a hybrid position based in Jersey City, NJ.
Salary (Rate): Negotiable
City: Jersey City
Country: USA
Working Arrangements: hybrid
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Job Title: Principal Data Engineer
Location: Jersey City, NJ (Hybrid)
Duration: 12+ Months
Job Type: W2 Only
About the Role
We are seeking an accomplished Principal Data Engineer with deep expertise in Snowflake and cloud data platforms (Azure/Google Cloud Platform) to architect and implement enterprise-scale data solutions. You will lead complex data initiatives, optimize our modern data stack, and mentor engineering teams while solving cutting-edge data challenges.
Key Responsibilities
- Architect and implement high-performance Snowflake solutions (RBAC, CDC, query optimization)
- Design scalable ETL frameworks using Spark, Python, and cloud-native services
- Lead end-to-end data projects from ingestion to consumption (8-10 member teams)
- Solve complex data challenges, e.g., "How would you optimize a slowly changing dimension process handling 10TB daily in Snowflake?" (see the incremental-merge sketch after this list)
- Implement data quality frameworks with profiling, source-to-target mapping (STTM), and reusable validation modules (a minimal validation sketch also follows this list)
- Build CI/CD pipelines for data applications (Azure DevOps/GitHub Actions)
- Drive cloud migrations (Azure Data Factory/Google Cloud Platform Dataflow to Snowflake)
- Provide L3 support and knowledge transfer to engineering teams
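To illustrate the kind of optimization the SCD question above is probing: one common way to keep a 10TB/day slowly changing dimension tractable is to process only the delta exposed by a Snowflake stream, closing and re-versioning rows in a single transaction. The sketch below is a minimal, hypothetical example, not this role's actual stack; every table, stream, and column name (dim_customer, stg_customer_stream, attr_hash) is an assumption, and it uses the snowflake-connector-python client.

```python
# Minimal sketch: incremental SCD Type 2 in Snowflake driven by a stream,
# so each run scans only changed rows rather than the full 10TB.
# dim_customer, stg_customer_stream, and all columns are hypothetical.
import snowflake.connector

SCD2_STATEMENTS = [
    "BEGIN",
    # Close out current dimension rows whose attributes changed upstream.
    """
    UPDATE dim_customer d
       SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
      FROM stg_customer_stream s
     WHERE d.customer_id = s.customer_id
       AND d.is_current
       AND d.attr_hash <> s.attr_hash
       AND s.METADATA$ACTION = 'INSERT'
    """,
    # Insert new versions for changed keys and first versions for new keys.
    """
    INSERT INTO dim_customer
        (customer_id, attr_hash, name, segment, is_current, valid_from, valid_to)
    SELECT customer_id, attr_hash, name, segment, TRUE, CURRENT_TIMESTAMP(), NULL
      FROM stg_customer_stream
     WHERE METADATA$ACTION = 'INSERT'
    """,
    "COMMIT",
]

def run_scd2(conn: snowflake.connector.SnowflakeConnection) -> None:
    """Apply the SCD2 statements in one transaction, so the stream offset
    only advances if every step succeeds."""
    with conn.cursor() as cur:
        for stmt in SCD2_STATEMENTS:
            cur.execute(stmt)
```

Clustering the dimension on the merge key and running the job on a dedicated, right-sized warehouse are the usual complementary levers.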
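And for the reusable validation modules mentioned above, a minimal sketch of the composable-check pattern, using pandas for brevity; the check names, columns, and sample data are illustrative assumptions:

```python
# Minimal sketch: small, composable data quality checks that can be
# applied to any DataFrame and collected into a pass/fail report.
import pandas as pd

def check_not_null(df: pd.DataFrame, column: str) -> bool:
    """Pass only if the column contains no nulls."""
    return bool(df[column].notna().all())

def check_unique(df: pd.DataFrame, column: str) -> bool:
    """Pass only if the column contains no duplicate values."""
    return bool(df[column].is_unique)

def run_checks(df: pd.DataFrame, checks) -> dict:
    """Apply (name, fn) pairs and collect results for reporting."""
    return {name: fn(df) for name, fn in checks}

# Hypothetical usage with illustrative sample data.
results = run_checks(
    pd.DataFrame({"claim_id": [1, 2, 3], "amount": [100.0, 250.5, None]}),
    [("claim_id_not_null", lambda d: check_not_null(d, "claim_id")),
     ("claim_id_unique",   lambda d: check_unique(d, "claim_id")),
     ("amount_not_null",   lambda d: check_not_null(d, "amount"))],
)
print(results)  # {'claim_id_not_null': True, 'claim_id_unique': True, 'amount_not_null': False}
```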
Technical Requirements
- 14+ years in data engineering with 5+ years focused on Snowflake implementations
- Expert-level skills in:
  - Snowflake architecture (time travel, zero-copy cloning, resource monitoring)
  - Advanced SQL (query tuning, window functions, stored procedures)
  - PySpark optimizations (partitioning, broadcast joins, Delta Lake; see the broadcast-join sketch after this list)
- Proven experience with:
  - Azure/Google Cloud Platform data services (Synapse/Dataform, BigQuery, Cloud Composer)
  - Data orchestration (Airflow, Dagster; a minimal DAG sketch also follows this list)
  - Infrastructure-as-code (Terraform, ARM templates)
- Strong Python coding (unit testing, logging, packaging)
- Experience resolving critical production issues (e.g., "Describe a time you debugged a Snowflake warehouse timeout during month-end close")
- Ability to translate business needs to technical specs (insurance/finance domain preferred)
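As a concrete illustration of the broadcast-join item above: hinting Spark to replicate a small dimension table to every executor so the large fact table joins without a shuffle. The paths, app name, and join key below are hypothetical:

```python
# Minimal sketch: map-side hash join via a broadcast hint in PySpark.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("broadcast-join-sketch").getOrCreate()

facts = spark.read.parquet("s3://bucket/claims/")      # large fact table
dims = spark.read.parquet("s3://bucket/policy_dim/")   # small dimension

# broadcast() hints the planner to ship `dims` to all executors,
# turning a shuffle join into a map-side hash join.
joined = facts.join(broadcast(dims), on="policy_id", how="left")
joined.write.mode("overwrite").parquet("s3://bucket/claims_enriched/")
```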
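And for the orchestration item, a minimal Airflow DAG sketch, assuming Airflow 2.x; the DAG id and task callables are placeholders:

```python
# Minimal sketch: a daily ingest -> transform dependency chain in Airflow 2.x.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull raw data into staging")  # placeholder task body

def transform():
    print("run transformations into the warehouse")  # placeholder task body

with DAG(
    dag_id="daily_claims_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # transform runs only after ingest succeeds
```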
Preferred Qualifications
- Snowflake certifications (SnowPro Advanced)
- Experience with Apache Iceberg and lakehouse architectures
- Knowledge of insurance claims/loss data models