Summary: We are seeking a skilled Data Engineer specializing in data ingestion, orchestration, and cloud-based data platforms. The role focuses on building reliable data pipelines and managing modern data warehouses to enable scalable analytics. The position is fully remote; sponsorship is not available for this role.
Key Responsibilities:
- Design and implement data ingestion pipelines from diverse sources (tables, files, APIs).
- Develop and maintain ETL/ELT workflows using orchestration tools such as Airflow, Dagster, ADF, or AWS Glue.
- Build event-driven solutions with Azure Functions or AWS Lambda.
- Work with cloud data warehouses like Snowflake or Google BigQuery for data modeling and analytics.
- Write and optimize Python/Java code and SQL queries for data transformation.
- Ensure data quality, governance, and performance across pipelines.
Key Skills:
- Strong experience in cloud data engineering (Azure or AWS).
- Proficiency with data orchestration frameworks.
- Hands-on expertise in Snowflake or BigQuery.
- Advanced Python/Java and SQL skills.
- Solid understanding of ETL processes and data warehousing concepts.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Job Title: Data Engineer (Data Ingestion)
Interview: Virtual
Job Duration: 6+ Months
Job Location: 100% Remote
Note: W2 only; no sponsorship available