Data Engineer (Google Cloud Platform expert)/Python and SQL Exp

Posted Today

Negotiable
Outside IR35
Remote
USA

Summary: The Data Engineer role focuses on designing and optimizing analytical datasets, developing scalable data pipelines, and implementing Big Data solutions within Google Cloud Platform. The position requires collaboration with product managers and business teams to translate data requirements into technical solutions. Candidates should possess expert-level skills in Python and SQL, along with experience in cloud environments, particularly Google Cloud Platform. The role is remote and classified as outside IR35.

Key Responsibilities:

  • Design and Optimization: Design, implement, and optimize clean, well-structured, and performant analytical datasets to support high-volume reporting, business analysis, and data science model development.
  • Pipeline Development: Architect, build, and maintain scalable and robust data pipelines for diverse applications, including business intelligence and advanced analytics.
  • Big Data & Streaming: Implement and support Big Data solutions for both batch (scheduled) and real-time/streaming analytics (a minimal pipeline sketch follows this list).
  • Collaboration: Work closely with product managers and business teams to understand data requirements and translate them into technical solutions.
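
To make the batch/streaming responsibility concrete, here is a minimal sketch of the kind of pipeline involved, written with Apache Beam (one of the GCP tools named under Key Skills below). Every identifier in it (project, bucket, dataset, table) is a hypothetical placeholder, not a reference to the employer's systems.

```python
# Minimal sketch only: a batch Beam pipeline that can target Cloud Dataflow.
# All names (project, bucket, dataset, table) are hypothetical placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run():
    opts = PipelineOptions(
        runner="DirectRunner",  # swap for "DataflowRunner" to run on GCP
        project="example-project",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=opts) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.csv")
            | "Parse" >> beam.Map(lambda line: line.split(","))
            | "KeyByUser" >> beam.Map(lambda cols: (cols[0], 1))
            | "CountPerUser" >> beam.CombinePerKey(sum)
            | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "events": kv[1]})
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.user_event_counts",
                schema="user_id:STRING,events:INTEGER",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```

The same pipeline shape serves real-time analytics by swapping the text source for beam.io.ReadFromPubSub and enabling streaming in the pipeline options.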

Key Skills:

  • Cloud Platform Expertise (Google Cloud Platform Focus): Extensive hands-on experience working in dynamic cloud environments, with a strong preference for Google Cloud Platform (GCP) services, specifically:
  • BigQuery: Expert-level skills in data ingestion, performance optimization, and data modeling within a petabyte-scale environment (a load-job sketch follows this list).
  • Experience with other relevant GCP services such as Cloud Storage, Cloud Dataflow/Beam, or Pub/Sub.
  • Programming & Querying:
  • Python: Expert-level programming proficiency in Python, including experience with relevant data engineering libraries.
  • SQL: A solid command of advanced SQL for complex querying, data processing, and performance tuning.
  • Data Pipeline Orchestration: Prior experience using workflow management and orchestration tools (e.g., Apache Airflow, Cloud Composer, Dagster, or similar); a minimal DAG sketch also follows this list.
  • DevOps/CI/CD: Experience with version control (Git) and familiarity with CI/CD practices and tools (e.g., GitLab, GitHub Actions) to automate deployment and testing processes.
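
For the BigQuery ingestion and modeling skills above, the following is a minimal sketch using the google-cloud-bigquery client library. The partitioning and clustering settings illustrate the usual first levers for query performance at large scale (they limit how many bytes a query scans); all names are hypothetical placeholders.

```python
# Minimal sketch only, assuming the google-cloud-bigquery client library.
# Project, bucket, and table names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
    # Partition by event time and cluster by user to cut scanned bytes.
    time_partitioning=bigquery.TimePartitioning(field="event_ts"),
    clustering_fields=["user_id"],
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/2024-01-01/*.parquet",
    "example-project.analytics.raw_events",
    job_config=job_config,
)
load_job.result()  # block until the load job completes
print(f"Loaded {load_job.output_rows} rows")
```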
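
For the orchestration requirement, the sketch below outlines a scheduled Airflow DAG of the sort Cloud Composer would run; the DAG ID, dataset, and SQL are illustrative assumptions, not the employer's actual pipelines. It also shows the kind of advanced SQL (partitioned aggregation over raw events) the role calls for.

```python
# Minimal sketch only: a daily BigQuery rollup orchestrated by Airflow.
# DAG ID, dataset, and SQL are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_user_event_rollup",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # "schedule_interval" on Airflow < 2.4
    catchup=False,
) as dag:
    # Rebuild a per-day aggregate table; {{ ds }} / {{ ds_nodash }} are
    # Airflow's built-in execution-date macros.
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_user_events",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE
                      analytics.user_event_counts_{{ ds_nodash }} AS
                    SELECT user_id, COUNT(*) AS events
                    FROM analytics.raw_events
                    WHERE DATE(event_ts) = '{{ ds }}'
                    GROUP BY user_id
                """,
                "useLegacySql": False,
            }
        },
    )
```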

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT
