Google Cloud Platform Data Modeler

Posted Today

Negotiable
Outside
Remote
USA

Summary: The Google Cloud Platform Data Modeler role involves designing, building, and optimizing data models and structures on Google Cloud Platform. The position requires collaboration with data engineers, analysts, and business stakeholders to ensure that data models meet reporting, analytics, and real-time processing needs. Candidates must have hands-on experience with Google Cloud Platform and data modeling techniques. This is a long-term contract position available to W2 candidates only.

Key Responsibilities:

  • Design and develop conceptual, logical, and physical data models for Google Cloud Platform-based solutions.
  • Implement and optimize data structures for BigQuery, Dataflow, Pub/Sub, Dataproc, and Cloud Storage.
  • Support data warehouse and data mart design for analytical and reporting use cases.
  • Work with data engineers to ensure efficient ETL/ELT pipelines aligned with data models.
  • Translate business requirements into scalable data models that support analytics and operations.
  • Define naming conventions, standards, and metadata management practices.
  • Optimize BigQuery tables, partitioning, clustering, and query performance.
  • Ensure data integrity, consistency, and governance across different layers.
  • Collaborate with business users, analysts, and architects to validate data modeling requirements.
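To give a concrete sense of the dimensional-modeling work described above, here is a minimal, hypothetical star-schema sketch using Python's built-in sqlite3 (for portability; a real implementation would target BigQuery). All table and column names (`fact_sales`, `dim_customer`, `dim_date`, etc.) are illustrative placeholders, not part of the role's actual stack.

```python
import sqlite3

# Illustrative star schema: one fact table keyed to a customer dimension
# and a date dimension. All names here are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    region        TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    full_date TEXT,
    month     INTEGER,
    year      INTEGER
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    amount       REAL
);
""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme', 'US-East')")
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 1, 2024)")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 20240115, 100.0), (1, 20240115, 250.0)])

# Typical analytical query: aggregate the fact table, slicing by dimensions.
cur.execute("""
SELECT c.region, d.year, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer c ON f.customer_key = c.customer_key
JOIN dim_date d     ON f.date_key     = d.date_key
GROUP BY c.region, d.year
""")
result = cur.fetchall()
print(result)  # [('US-East', 2024, 350.0)]
```

The same fact/dimension split carries over to BigQuery, where the fact table would typically also be partitioned and clustered for query performance.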

Key Skills:

  • Strong experience as a Data Modeler / Data Engineer with hands-on Google Cloud Platform exposure.
  • Proficiency in data modeling techniques (3NF, star schema, snowflake schema, data vault).
  • Expertise in BigQuery data modeling, partitioning, clustering, and performance tuning.
  • Good understanding of ETL/ELT pipelines and integration with Google Cloud Platform services.
  • Experience in data warehousing concepts, dimensional modeling, and OLAP systems.
  • Proficiency in SQL and familiarity with Python or scripting languages.
  • Knowledge of NoSQL modeling (Bigtable, Firestore, MongoDB) is a plus.
  • Strong understanding of data governance, lineage, and metadata management.
  • Excellent communication and collaboration skills.
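The BigQuery partitioning and clustering skills listed above boil down to DDL choices like the following sketch. The dataset, table, and column names are hypothetical placeholders; the `PARTITION BY` / `CLUSTER BY` clauses are standard BigQuery DDL.

```python
# Hypothetical BigQuery DDL illustrating partitioning and clustering.
# Dataset/table/column names ("analytics.events" etc.) are placeholders.
ddl = """
CREATE TABLE IF NOT EXISTS analytics.events (
    event_ts    TIMESTAMP,
    customer_id STRING,
    event_type  STRING,
    amount      NUMERIC
)
PARTITION BY DATE(event_ts)          -- prune scans to the partitions a query touches
CLUSTER BY customer_id, event_type   -- co-locate rows for selective filters
"""
print(ddl)
```

Partitioning limits how much data a date-filtered query scans (and bills), while clustering sorts data within each partition so filters on the clustered columns read fewer blocks.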

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Title: Google Cloud Platform Data Modeler
Location: Remote in United States
Duration: Long Term Contract
Candidates willing to work on our W2 are eligible (W2 candidates only).
All visa types, including TN Visa and L2, are eligible.

Job Summary:
We are looking for a skilled Google Cloud Platform Data Modeler to design, build, and optimize data models and data structures on Google Cloud Platform. The role involves working closely with data engineers, analysts, and business stakeholders to ensure data models support reporting, analytics, and real-time processing needs.

Preferred Qualifications:

  • Google Cloud Platform Professional Data Engineer Certification.
  • Experience with streaming data (Pub/Sub, Kafka, Dataflow).
  • Exposure to BI/Reporting tools (Looker, Tableau, Power BI).
  • Familiarity with Agile/Scrum methodologies.