SC Cleared Data Engineer - AWS/ETL

Posted 4 days ago by fortice

£385 per day
Undetermined
Remote, UK

Summary: The SC Cleared Data Engineer role focuses on developing and optimizing ETL pipelines using AWS technologies. The successful candidate will build and manage cloud-based data pipelines and collaborate with stakeholders to deliver reliable data solutions. The position requires strong hands-on experience with AWS services and data processing workflows, along with familiarity with data engineering best practices.

Key Responsibilities:

  • Develop, maintain, and optimize ETL pipelines using AWS Glue (Informatica experience is beneficial).
  • Build and manage cloud-based data pipelines leveraging AWS services (e.g., EMR, S3, Lambda, Glue).
  • Implement scalable data processing workflows using Databricks, PySpark, Python, and SQL.
  • Design and support data ingestion, transformation, and integration processes across structured and unstructured data sources.
  • Collaborate with data architects, analysts, and business stakeholders to understand requirements and deliver reliable data solutions.
  • Monitor pipeline performance, troubleshoot issues, and ensure data quality and reliability.
  • Contribute to best practices for data engineering, including version control, CI/CD, and automation.
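As a rough illustration only (not part of the job specification), the extract-transform-load pattern the responsibilities above describe can be sketched in plain Python. In practice this logic would run in AWS Glue or PySpark against S3 data; all names and data below are hypothetical.

```python
import csv
import io

# Hypothetical raw extract; in a real pipeline this would be read from S3
# or an AWS Glue Data Catalog table rather than an inline string.
RAW = """id,amount,currency
1,10.50,GBP
2,,GBP
3,7.25,USD
"""

def extract(raw: str) -> list[dict]:
    """Read CSV rows into dicts (the 'extract' stage)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[dict]:
    """Drop rows with missing amounts and normalise types (the 'transform' stage)."""
    clean = []
    for row in rows:
        if not row["amount"]:
            continue  # basic data-quality filter
        clean.append({
            "id": int(row["id"]),
            "amount": float(row["amount"]),
            "currency": row["currency"],
        })
    return clean

def load(rows: list[dict]) -> dict[str, float]:
    """Aggregate totals per currency (standing in for the 'load' to a warehouse table)."""
    totals: dict[str, float] = {}
    for row in rows:
        totals[row["currency"]] = totals.get(row["currency"], 0.0) + row["amount"]
    return totals

if __name__ == "__main__":
    print(load(transform(extract(RAW))))  # {'GBP': 10.5, 'USD': 7.25}
```

The same three-stage shape carries over to PySpark, where `extract` becomes a DataFrame read, `transform` a chain of filters and casts, and `load` a write to a target table.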

Key Skills:

  • Strong hands-on experience with ETL development and orchestration (AWS).
  • Solid AWS cloud experience, including working with core data services.
  • Expertise in building distributed data pipelines using EMR, PySpark, or similar technologies.
  • Strong data processing and transformation experience across large datasets.
  • Proficiency in PySpark, Python, and SQL for data manipulation and automation.
  • Understanding of data modelling, data warehousing concepts, and performance optimization.
  • Familiarity with CI/CD tools (DevOps, GitHub, GitLab).
  • Exposure to data governance, metadata management, and data quality frameworks.
  • Experience working in Agile environments is a plus.

Salary (Rate): £385/day

City: undetermined

Country: UK

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

We are looking for experienced candidates in AWS.
