Summary: The Data Engineer role involves designing, developing, and maintaining scalable data pipelines on AWS, with a focus on ETL processes and data quality. The position requires extensive experience with AWS services and proficiency in Python and SQL. This is a long-term (12+ month) contract based in Toronto, Canada, with remote work available. The role emphasizes data governance and security standards across all data platforms.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines on AWS using services such as Glue, EMR, Lambda, and Redshift.
- Implement ETL/ELT processes to ingest structured and unstructured data from multiple sources.
- Build and optimize data models, data lakes, and data warehouses to support analytics and reporting.
- Ensure data quality, security, and governance standards are met across all data platforms.
Key Skills:
- Strong experience (10+ years) with AWS services such as S3, Glue, Redshift, EMR, Athena, and Lambda.
- Proficiency in Python, SQL, and data pipeline orchestration tools (e.g., Airflow, Step Functions).
- Solid understanding of data warehousing, data lake architectures, and ETL best practices.
- Experience with schema design, performance tuning, and data modeling.
- Knowledge of security, IAM, and compliance best practices in AWS.
Salary (Rate): CAD 40 to CAD 45 per hour
City: Toronto
Country: Canada
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: Senior
Industry: IT
Role: Data Engineer
Location: Toronto, Canada (Remote)
Type: 12+ months contract
Client: ACI
Rate: CAD40 to CAD45/hr
Minimum of 8 years of experience as a data engineer.
Must-have experience: AWS services (including Glue), ETL, Python, and SQL.
Job Description
Design, develop, and maintain scalable data pipelines on AWS using services such as Glue, EMR, Lambda, and Redshift.
Implement ETL/ELT processes to ingest structured and unstructured data from multiple sources.
Build and optimize data models, data lakes, and data warehouses to support analytics and reporting.
Ensure data quality, security, and governance standards are met across all data platforms.
Skills Required (Senior Data Engineer):
Strong experience (10+ years) with AWS services such as S3, Glue, Redshift, EMR, Athena, and Lambda.
Proficiency in Python, SQL, and data pipeline orchestration tools (e.g., Airflow, Step Functions).
Solid understanding of data warehousing, data lake architectures, and ETL best practices.
Experience with schema design, performance tuning, and data modeling.
Knowledge of security, IAM, and compliance best practices in AWS.
Regards,
Karthick Ramasamy
Client Relationship Manager / Talent Acquisition
Rohnium Inc
MBE, DBE & SBE Certified Company