Azure Data Architect with SAS

Posted today

Negotiable
Outside IR35
Remote
USA

Summary: The Azure Data Architect with SAS role requires a seasoned professional with 12-15 years of experience in Data Engineering and Architecture. The position focuses on designing and maintaining data management systems, developing data strategies, and collaborating with stakeholders to deliver effective technical solutions. The candidate will also be responsible for implementing data pipelines and ensuring the architecture aligns with business requirements. This role is fully remote and classified as outside IR35.

Key Responsibilities:

  • The candidate should have 12-15 years of experience in Data Engineering and Architecture.
  • Define the data platform, data migration, and data validation strategies.
  • Design, build, test, and maintain complete data management and processing systems.
  • Work closely with stakeholders.
  • Contribute to the platform roadmap, architecture, and solution design, including POCs, prototypes, technical evaluations for tech-stack selection, and guiding principles for best practices.
  • Understand business problems, then identify and propose the best technical solution.
  • Design and implement the data warehouse, data models, data pipelines, and analytics; handle non-functional requirements (NFRs) and benchmarking.
  • Create data models that reduce system complexity, increasing efficiency and reducing cost.
  • Introduce new data management tools and technologies to make the existing system more efficient.
  • Ensure the architecture meets business requirements.
  • Build highly scalable, robust, fault-tolerant systems.
  • Own the complete ETL process.
  • Must have working knowledge and hands-on experience with a real-time processing framework (Apache Spark), PySpark, and Azure.
  • Must have experience with SQL databases (e.g., MySQL, Oracle) and NoSQL databases (e.g., Cassandra, MongoDB).
  • Experience with Snowflake.
  • Discover data acquisition opportunities.
  • Find ways to extract value from existing data.
  • Improve the data quality, reliability, and efficiency of individual components and of the system as a whole.
  • Set and achieve individual as well as team goals.
  • Bring a problem-solving mindset to working in an agile environment.

Key Skills:

  • 12-15 years of experience in Data Engineering and Architecture
  • Data platform, data migration, and data validation strategy
  • Experience in designing, creating, testing, and maintaining data management systems
  • Knowledge of Apache Spark, PySpark, and Azure
  • Experience with SQL-based technologies (MySQL, Oracle DB) and NoSQL technologies (Cassandra, MongoDB)
  • Experience with Snowflake
  • Ability to create data models and implement data pipelines
  • Strong problem-solving skills in an agile environment
  • Experience in building scalable and fault-tolerant systems
  • Ability to improve data quality and efficiency

Salary (Rate): Negotiable

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT
