Rate: Negotiable
Summary: The Google Cloud Platform Data Engineer role involves designing, developing, and optimizing scalable data platforms and pipelines using Google Cloud services. The position requires extensive experience in data engineering, particularly with tools like BigQuery and Dataflow, and emphasizes collaboration with stakeholders to deliver data solutions. The ideal candidate will also mentor junior engineers and implement best practices in data reliability and performance. This is a contract position based in Chicago, IL, with a hybrid working arrangement.
Key Responsibilities:
- Design, build, and maintain data ingestion, transformation, and processing pipelines using Google Cloud Platform services.
- Develop and operate scalable distributed data systems leveraging Dataproc, Dataflow, BigQuery, Cloud Spanner, Cloud SQL, Pub/Sub, and Cloud Storage.
- Build and support batch and streaming data workflows, including the development of API interfaces for data access and integration.
- Collaborate with data scientists, analysts, and business stakeholders to understand requirements and deliver data solutions that align with business goals.
- Ensure data quality, governance, and compliance with industry and organizational standards.
- Troubleshoot, optimize, and enhance data pipeline performance and reliability.
- Implement logging, monitoring, and alerting mechanisms for data jobs and infrastructure.
- Drive adoption of Infrastructure as Code (IaC) practices using tools like Terraform for consistent and automated deployments.
- Mentor junior engineers, enforce coding best practices, and maintain high standards for data reliability and performance.
Key Skills:
- 8+ years of hands-on experience in data engineering, preferably within Google Cloud Platform environments.
- Proficiency in Python, Java, or Scala, with strong SQL programming expertise.
- Proven experience building and managing distributed data processing systems using Google Cloud Platform tools such as BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Spanner, and Cloud Storage.
- Strong understanding of ETL/ELT workflows, data architecture, and data warehousing principles.
- Experience designing real-time and batch data pipelines and integrating with API-based systems.
- Familiarity with Terraform or other IaC tools for automation and environment management.
- Exposure to CI/CD pipelines, data observability, and performance optimization practices.
- Excellent analytical, communication, and leadership skills.
Salary (Rate): Negotiable
City: Chicago
Country: USA
Working Arrangements: hybrid
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Job Description:
We are seeking an experienced Google Cloud Platform Data Engineer to design, develop, and optimize scalable data platforms and pipelines on Google Cloud. The ideal candidate will have hands-on expertise with Google Cloud Platform-native tools, strong programming skills, and the ability to deliver efficient data solutions that drive business intelligence and analytics initiatives.
Additional Skills:
- Experience with Docker, Kubernetes, or other container orchestration tools.
- Knowledge of machine learning data pipelines or data lakehouse architectures.
- Background in mentoring or technical leadership roles within data engineering teams.