Rate: Negotiable
IR35: Outside
Working Arrangement: Remote
Country: USA
Summary: The Google Cloud Platform Data Engineer role is a fully remote position focused on building and maintaining data pipelines using Google Cloud tools. The ideal candidate will have extensive experience in data engineering, particularly with Google Cloud Platform, and will be responsible for developing data transformation scripts and supporting real-time data analytics. Collaboration with users to deliver validated datasets and ensuring privacy compliance in clean room environments are key aspects of the role. This position requires a strong understanding of retail media and campaign KPIs, along with expertise in ETL tools and API integrations.
Key Responsibilities:
- Build and maintain data pipelines using Google Cloud Platform (BigQuery, Dataflow, Composer)
- Develop Python scripts for data transformation and ingestion
- Support real-time data streaming and campaign analytics
- Collaborate with users to deliver clean, validated datasets
- Integrate with clean room environments and ensure privacy compliance
- Build data models (e.g., slowly changing dimensions) for campaign and customer journey analytics
- Connect Google Cloud Platform data to Databricks (hosted on Google Cloud Platform) for ML use cases
- Enable real-time data ingestion and streaming (Kafka or similar)
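The slowly-changing-dimension modeling mentioned above can be sketched minimally in plain Python (illustrative only — the `DimRow` fields such as `segment` are hypothetical; in practice this would be a `MERGE` against a BigQuery dimension table):

```python
from dataclasses import dataclass, replace
from datetime import date
from typing import List, Optional


@dataclass
class DimRow:
    """One version of a customer attribute in an SCD Type 2 dimension."""
    customer_id: str
    segment: str                      # hypothetical tracked attribute
    valid_from: date
    valid_to: Optional[date] = None   # None = still open
    is_current: bool = True


def apply_scd2(dim: List[DimRow], customer_id: str,
               new_segment: str, as_of: date) -> List[DimRow]:
    """Close the current row when the attribute changed and append a new version."""
    updated = []
    changed = False
    for row in dim:
        if (row.customer_id == customer_id and row.is_current
                and row.segment != new_segment):
            # Close out the old version instead of overwriting it.
            updated.append(replace(row, valid_to=as_of, is_current=False))
            changed = True
        else:
            updated.append(row)
    is_new_customer = not any(r.customer_id == customer_id for r in dim)
    if changed or is_new_customer:
        updated.append(DimRow(customer_id, new_segment, valid_from=as_of))
    return updated
```

Keeping the closed-out rows (rather than updating in place) is what lets campaign and customer-journey analytics reconstruct which segment a customer was in at any point in time.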
Key Skills:
- 8 years in data engineering with strong current experience in Google Cloud Platform and related data tools
- Proficient in Python, SQL, and Google Cloud Platform tools (BigQuery, Dataflow, Composer)
- Experience with clean rooms (Google, Databricks)
- Familiarity with Kafka or similar streaming tools
- Strong understanding of retail media and campaign KPIs
- Expertise in ETL tools like Informatica or Talend
- API integrations experience (e.g., Facebook/Meta or similar engineering experience)
- Bonus: Experience with Databricks and Data/ML pipelines
Salary (Rate): undetermined
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Google Cloud Platform Data Engineer
100% Remote
12+ Months
Please share the following details along with your updated resume:
Visa:
Current location:
Notice period:
Year of graduation:
LinkedIn profile:
Best Regards,
Somesh
Regional Manager (Econosoft, Inc)
W:+ EXT: 4411 | F:+ Email:
Website: | Hangout :