$60/hr. max :: Snowflake Data Architect :: San Jose, CA (Remote Role) :: Contract Opportunity

Summary: The Snowflake Data Architect role involves designing, developing, and maintaining data pipelines and workflows in Snowflake, utilizing tools such as DBT and Python. The position requires collaboration with cross-functional teams to deliver comprehensive data solutions while ensuring data quality and performance optimization. This is a contract-to-hire opportunity based in San Jose, CA, with remote working arrangements. Candidates must have significant experience in data engineering and Snowflake technologies.

Key Responsibilities:

  • Design, develop, and maintain data pipelines and ELT workflows in Snowflake using DBT, Python, and other tools (see the sketch after this list).
  • Implement and optimize data models, Snowflake schemas, and SQL transformations for scalable analytics solutions.
  • Develop and manage user-defined functions (UDFs), stored procedures, and SnowSQL scripts for automation and advanced data processing.
  • Integrate data from multiple sources (e.g., Oracle, Teradata, APIs, streaming platforms) into Snowflake with high performance and reliability.
  • Optimize query performance, storage usage, and compute resources in Snowflake.
  • Implement data quality, monitoring, and governance best practices.
  • Collaborate with cross-functional teams including Data Analysts, Architects, and BI developers to deliver robust end-to-end data solutions.
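
For illustration, here is a minimal sketch of the kind of ELT step described above, written against the snowflake-connector-python package. The account details, stage, and table names are hypothetical, not taken from the posting:

    import snowflake.connector

    # Hypothetical connection parameters -- replace with real account details.
    conn = snowflake.connector.connect(
        account="my_account",
        user="etl_user",
        password="***",
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )

    # Load staged files, then transform inside Snowflake: in an ELT workflow
    # the "T" runs in-warehouse as SQL rather than in Python.
    with conn.cursor() as cur:
        cur.execute("COPY INTO raw_orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
        cur.execute("""
            MERGE INTO analytics.orders AS tgt
            USING staging.raw_orders AS src
              ON tgt.order_id = src.order_id
            WHEN MATCHED THEN UPDATE SET tgt.status = src.status
            WHEN NOT MATCHED THEN INSERT (order_id, status)
                VALUES (src.order_id, src.status)
        """)
    conn.close()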

Key Skills:

  • 4-10 years of overall experience in Data Engineering.
  • Strong hands-on experience in Snowflake including data load, schema design, performance tuning, and SnowSQL scripting.
  • Strong programming experience in Python for data ingestion, transformation, and automation.
  • Proficiency in DBT (Data Build Tool) for modeling, transformations, and workflow orchestration.
  • Excellent command of SQL (complex queries, optimization, window functions, etc.); a sample query follows this list.
  • Experience working with large-scale data sets and performance optimization techniques.
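
As a sample of the SQL depth this calls for, here is a window-function query of the kind referenced above, run through the same Python connector. The credentials, table, and column names are hypothetical:

    import snowflake.connector

    # Hypothetical credentials, as in the earlier sketch.
    conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                       password="***", warehouse="TRANSFORM_WH",
                                       database="ANALYTICS", schema="PUBLIC")

    # ROW_NUMBER() partitioned by customer keeps only each customer's latest order.
    LATEST_ORDER_SQL = """
        SELECT order_id, customer_id, order_ts
        FROM (
            SELECT order_id, customer_id, order_ts,
                   ROW_NUMBER() OVER (
                       PARTITION BY customer_id ORDER BY order_ts DESC) AS rn
            FROM analytics.orders
        )
        WHERE rn = 1
    """

    with conn.cursor() as cur:
        for order_id, customer_id, order_ts in cur.execute(LATEST_ORDER_SQL):
            print(order_id, customer_id, order_ts)
    conn.close()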

Salary (Rate): $60 hourly

City: San Jose

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Hi, I hope you are doing well.
This is Akshat Gupta from EdgeAll Consulting. I am reaching out about a Snowflake Data Architect opening. Please go through the job description below and let us know if you are interested. If so, reply with your updated resume and expected compensation, and feel free to contact me by email for more information.

Position- Snowflake Data Architect

Location- San Jose, CA (Remote Role)

Interview- Phone and Skype

Duration- Contract-to-Hire Opportunity

Visa- All visas are acceptable

Key Responsibilities:

  • Design, develop, and maintain data pipelines and ELT workflows in Snowflake using DBT, Python, and other tools.
  • Implement and optimize data models, Snowflake schemas, and SQL transformations for scalable analytics solutions.
  • Develop and manage user-defined functions (UDFs), stored procedures, and SnowSQL scripts for automation and advanced data processing.
  • Integrate data from multiple sources (e.g., Oracle, Teradata, APIs, streaming platforms) into Snowflake with high performance and reliability.
  • Optimize query performance, storage usage, and compute resources in Snowflake.
  • Implement data quality, monitoring, and governance best practices.
  • Collaborate with cross-functional teams including Data Analysts, Architects, and BI developers to deliver robust end-to-end data solutions.

Required Skills:

  • 4-10 years of overall experience in Data Engineering.
  • Strong hands-on experience in Snowflake including data load, schema design, performance tuning, and SnowSQL scripting.
  • Strong programming experience in Python for data ingestion, transformation, and automation.
  • Proficiency in DBT (Data Build Tool) for modeling, transformations, and workflow orchestration.
  • Excellent command of SQL (complex queries, optimization, window functions, etc.).
  • Experience working with large-scale data sets and performance optimization techniques.

Good to Have:

  • Exposure to real-time data ingestion frameworks (Kafka, Kinesis, Spark Streaming, etc.) and streaming analytics.
  • Experience with ETL/ELT tools (Informatica, Talend, Airflow, etc.).
  • Understanding of data warehousing best practices and cloud platforms (AWS, Azure, Google Cloud Platform).
  • Knowledge of JavaScript for Snowflake stored procedures (see the sketch after this list).
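
As a sketch of the JavaScript stored-procedure point above: Snowflake exposes a built-in "snowflake" object inside JAVASCRIPT procedures, and the procedure below is created and called through the Python connector. The procedure name, table, and retention logic are hypothetical:

    import snowflake.connector

    # Hypothetical credentials, as in the earlier sketches.
    conn = snowflake.connector.connect(account="my_account", user="etl_user",
                                       password="***", warehouse="TRANSFORM_WH",
                                       database="ANALYTICS", schema="STAGING")

    # A JavaScript procedure that deletes rows older than a given number of
    # days and reports how many were removed. Note that argument names are
    # uppercased (DAYS) inside the JavaScript body.
    CREATE_PROC_SQL = """
    CREATE OR REPLACE PROCEDURE purge_stale_rows(days FLOAT)
    RETURNS STRING
    LANGUAGE JAVASCRIPT
    AS
    $$
        var stmt = snowflake.createStatement({
            sqlText: "DELETE FROM staging.raw_orders " +
                     "WHERE order_ts < DATEADD(day, -" + DAYS + ", CURRENT_TIMESTAMP())"
        });
        var rs = stmt.execute();
        rs.next();
        return "Deleted " + rs.getColumnValue(1) + " rows";
    $$
    """

    with conn.cursor() as cur:
        cur.execute(CREATE_PROC_SQL)
        cur.execute("CALL purge_stale_rows(30)")
    conn.close()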

Soft Skills:

  • Strong problem-solving and analytical mindset.
  • Excellent communication and collaboration skills.
  • Ability to work in a fast-paced, customer-focused environment.

Thanks and Regards,
Akshat Gupta
940 Saratoga Ave Suite #207, San Jose, CA 95129, United States

An MBE, SBE, WBE & WOSB Certified and E-Verify Company