Cloud Data Engineer - Fivetran - REMOTE WORK -- 66948

Posted 1 day ago by PRIMUS Global Services Inc.

Summary: The Cloud Data Engineer role focuses on implementing a technology framework for cloud computing, integration, and automation, specifically in designing end-to-end data integration solutions. The position requires collaboration with cross-functional teams to ensure data integrity and quality while providing technical guidance to junior engineers. This role is fully remote and emphasizes the use of Python, SQL, and ETL/ELT technologies. Candidates should have substantial experience in data engineering and cloud platforms.

Key Responsibilities:

  • Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment.
  • Design scalable ingestion processes to integrate various data sources into cloud infrastructure.
  • Design reusable assets, components, standards, frameworks, and processes for data integration projects.
  • Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools.
  • Build infrastructure for optimal extraction, transformation, and loading of data from diverse sources.
  • Build processes for data transformation, data structures, metadata, dependency, and workload management.
  • Design parameter-driven orchestration for change data capture and monitoring.
  • Develop and implement scripts for data process maintenance, monitoring, and performance tuning.
  • Test and document data processes through validation and verification procedures.
  • Collaborate with cross-functional teams to resolve data quality and operational issues.
  • Ensure solutions meet technical and functional/non-functional requirements and are delivered on time.
  • Provide technical guidance and mentorship to junior engineers.
  • Maintain knowledge of industry trends and technologies.

Key Skills:

  • Experience in data transformation and pipeline development using SQL and Python.
  • Bachelor's Degree in Computer Science or related field.
  • 5 years of experience in IT disciplines, including database management and cloud engineering.
  • 2 years of experience with cloud platforms and data integration.
  • Experience with data integration platforms/tools and optimizing data pipelines.
  • Advanced SQL knowledge and experience with relational databases.
  • Hands-on experience with modern ETL/ELT tools like Fivetran, HVR, dbt, Airflow.
  • Proficiency in Python and SQL for scripting and data transformation.
  • Experience in test automation for integrations and data flows.
  • Familiarity with DevOps tool chains and processes.
  • Understanding of Snowflake Data Cloud.

Salary (Rate): £44.00 hourly

City: undetermined

Country: undetermined

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Cloud Data Engineer - Fivetran - REMOTE WORK - 66948

Pay Range - $50 - $55/hr

One of our clients is looking for a Cloud Data Engineer - Fivetran to join their team on a remote basis.

Primary Purpose:

Responsible for implementing a technology framework that supports initiatives in cloud computing, integration, and automation, with a focus on designing systems and services that run on cloud platforms. The primary focus is to support the design and development of end-to-end data integration solutions in cloud infrastructure using approved technologies. Contributes to the Cloud Data Engineering team's effort to provide architecture and design support for data movement within cloud infrastructure, and helps ensure the integrity, reliability, and quality of the data services implemented on the platform.

Major Responsibilities:

Drive scope definition, requirements analysis, data and technical design, pipeline build, product configuration, unit testing, and production deployment.

Design scalable ingestion processes to bring on-prem, API-driven, third-party, and end-user-generated data sources into a common cloud infrastructure.

Design reusable assets, components, standards, frameworks, and processes to accelerate and facilitate data integration projects.

Develop data integration and transformation jobs using Python, SQL, and ETL/ELT tools.
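
As an illustration of the kind of transformation job this covers, below is a minimal Python sketch; the file name, column names, and the SQLite stand-in for a cloud warehouse are assumptions made for the example, not part of the client's actual stack.

```python
# Minimal sketch of an extract-transform-load job in Python.
# "orders_raw.csv", the column names, and the SQLite target are
# illustrative assumptions standing in for a real cloud warehouse.
import sqlite3

import pandas as pd


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Typical cleanup: fix types, deduplicate, derive a column."""
    df["order_date"] = pd.to_datetime(df["order_date"])
    df = df.drop_duplicates(subset=["order_id"])
    df["net_amount"] = df["gross_amount"] - df["discount"]
    return df


def run_job() -> None:
    raw = pd.read_csv("orders_raw.csv")           # extract
    clean = transform(raw)                        # transform
    with sqlite3.connect("warehouse.db") as con:  # load
        clean.to_sql("orders_clean", con, if_exists="replace", index=False)


if __name__ == "__main__":
    run_job()
```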

Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.

Build processes supporting data transformation, data structures, metadata, dependency and workload management.

Design parameter-driven orchestration to allow for change data capture and monitoring.
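
For example, parameter-driven orchestration of this kind might look like the following sketch using Apache Airflow (assuming Airflow 2.4+); the table list and the watermark-based change capture logic are illustrative assumptions, not the client's actual design.

```python
# Hedged sketch of parameter-driven orchestration for change data
# capture using Apache Airflow (assumes Airflow 2.4+). The table list
# and the watermark logic are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

TABLES = ["customers", "orders"]  # parameters that drive task creation


def extract_changes(table: str, **context) -> None:
    # A real implementation would select rows whose updated_at exceeds
    # the last stored watermark and hand them to a loader; this stub
    # only logs the intent.
    print(f"extracting changed rows from {table} since last watermark")


with DAG(
    dag_id="cdc_ingestion",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    for table in TABLES:
        PythonOperator(
            task_id=f"extract_{table}_changes",
            python_callable=extract_changes,
            op_kwargs={"table": table},
        )
```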

Develop and implement scripts for data process maintenance, monitoring, and performance tuning.

Test and document data processes through data validation and verification procedures.
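
A minimal sketch of such validation checks, assuming the source extract and loaded target are both readable as pandas DataFrames (file names, key column, and thresholds are illustrative assumptions):

```python
# Illustrative validation checks comparing a source extract with the
# loaded target. File names, key column, and checks are assumptions.
import pandas as pd


def validate(source: pd.DataFrame, target: pd.DataFrame) -> list[str]:
    failures = []
    if len(source) != len(target):
        failures.append(f"row count mismatch: {len(source)} vs {len(target)}")
    if target["order_id"].duplicated().any():
        failures.append("duplicate keys in target.order_id")
    if target["net_amount"].isna().any():
        failures.append("nulls in target.net_amount")
    return failures


for failure in validate(pd.read_csv("orders_raw.csv"),
                        pd.read_csv("orders_loaded.csv")):
    print("FAILED:", failure)
```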

Collaborate with cross-functional teams to resolve data quality and operational issues.

Ensure delivered solutions meet technical, functional, and non-functional requirements.

Ensure solutions are delivered within the committed time frame.

Provide technical guidance and mentorship to junior engineers, ensuring best practices in data engineering.

Maintain overall industry knowledge on latest trends, technology, etc.

Licensure, Registration and/or Certification Required:

Must have experience in data transformation and data pipeline development using GUI-based tools or programming languages such as SQL and Python.

Education Required:

Bachelor's Degree in Computer Science or related field.

Experience Required:

Typically requires 5 years of experience in at least two IT disciplines, including database management, cloud engineering, data engineering, and middleware technologies. This includes 2 years of work experience with cloud platforms, covering data integration, performance optimization, and platform administration.

Knowledge, Skills & Abilities Required:

Experience defining, designing, and developing solutions with data integration platforms/tools.

Proven experience building and optimizing data pipelines, and data sets.

Advanced working SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems.

Hands-on experience with modern cloud-based ETL/ELT tools and technologies such as Fivetran, HVR, dbt, and Airflow.

Proficiency in Python and SQL for scripting and building data transformation processes is preferred.

Experience in test automation with a focus on testing integrations, including APIs and data flows between enterprise systems.

Must have experience with DevOps tool chains and processes.

Understanding of, and exposure to, the Snowflake Data Cloud.
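
As a brief illustration of basic Snowflake connectivity from Python, the sketch below uses the official snowflake-connector-python package; account, credential, and object names are placeholders, and in practice would come from a secrets manager rather than source code.

```python
# Sketch of basic Snowflake connectivity via snowflake-connector-python.
# Account, credentials, and object names are placeholders; real values
# would come from a secrets manager, not source code.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",    # placeholder
    user="your_user",          # placeholder
    password="your_password",  # placeholder
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")
    print(cur.fetchone())
finally:
    conn.close()
```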

For immediate consideration:

Ujjal
PRIMUS Global Services
Direct
Desk: x 268
Email: