Summary: The Architect (Azure Data Warehouse Developer) role is a senior-level position focused on the design, development, and implementation of Azure-based data solutions. The candidate will collaborate with various stakeholders to ensure project objectives are met while leveraging advanced knowledge in data warehousing and programming. Key responsibilities include managing project timelines, conducting requirements gathering, and ensuring compliance with standards. The role requires significant hands-on experience with Azure technologies and data engineering practices.
Key Responsibilities:
- Design, develop, test, and implement data lakes, databases, extract-load-transform programs, applications, and reports.
- Manage assignments and track progress against agreed upon timelines.
- Plan, organize, prioritize, and manage work efforts, coordinating with the EDW and other teams.
- Participate in status reviews, process reviews, deliverable reviews, and software quality assurance work product reviews with the appropriate stakeholders.
- Participate in business and technical requirements gathering.
- Perform research on potential solutions and provide recommendations to the EDW and DOH.
- Develop and implement solutions that meet business and technical requirements.
- Participate in testing of implemented solution(s).
- Build and maintain relationships with key stakeholders and customer representatives.
- Give presentations for the EDW, other DOH offices, and agencies involved with this project.
- Develop and maintain processes and procedural documentation.
- Ensure project compliance with relevant federal and commonwealth standards and procedures.
- Conduct training and transfer of knowledge sessions for system and code maintenance.
- Complete weekly timesheet reporting in PeopleFluent/VectorVMS by COB each Friday.
- Complete weekly project status updates in Daptiv, if the project has been entered in Daptiv.
- Provide weekly personal status reporting by COB Friday submitted on SharePoint.
- Utilize a SharePoint site for project and operational documentation; review existing documentation.
- Design, develop, and implement data and ELT application infrastructure in Azure.
- The candidate must have significant, hands-on technical experience and expertise with Azure technologies.
- Experience producing ETL/ELT using SQL Server Integration Services and other tools.
- Experience with SQL Server, T-SQL, scripts, queries.
- Experience as an Azure DevOps CI/CD Pipeline Release Manager.
- Experience with data formatting, capture, search, retrieval, extraction, classification, quality control, cleansing, and information filtering techniques.
- Experience with data mining architecture, modeling standards, reporting and data analysis methodologies.
- Experience with data engineering, database file systems optimization, APIs, and analytics as a service.
- Experience analyzing and translating business requirements and use cases into optimized designs.
- Advanced knowledge of relational databases, dimensional databases, entity relationships, data warehousing, facts, dimensions, and star schema concepts and terminology.
- Create and maintain technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases.
- Ability to balance work across multiple projects; good organizational skills.
- Demonstrated ability to communicate and document clearly and concisely.
- Ability to work collaboratively and effectively with colleagues as a member of a team.
- Ability to present complex technical concepts and data to a varied audience effectively.
- More than 5 years of relevant experience.
- 4-year college degree in computer science or related field with advanced study preferred.
Key Skills:
- Advanced knowledge and experience in data warehousing, database, and programming concepts.
- Significant hands-on technical experience with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
- Expertise in SQL Server and Azure Synapse.
- Experience with SQL Server Integration Services for ETL/ELT processes.
- Experience with Azure DevOps CI/CD Pipeline management.
- Strong analytical and problem-solving skills.
- Excellent communication and documentation skills.
- Ability to work collaboratively in a team environment.
- Organizational skills to manage multiple projects.
- 4-year college degree in computer science or related field.
Salary (Rate): Negotiable
City: Harrisburg
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: Senior
Industry: IT
- The Architect is a senior-level resource with advanced, specialized knowledge and experience in data warehousing, database, and programming concepts and technology. The selected contractor must have proven experience in the development, maintenance, and testing of Azure production systems and projects. This position designs, develops, tests, and implements data lakes, databases, extract-load-transform programs, applications, and reports. This position will work with business analysts, application developers, DBAs, and network and system staff to achieve project objectives: delivery dates, cost objectives, quality objectives, and program area customer satisfaction objectives.
- The Architect can design, develop, and implement data and ELT application infrastructure in Azure to provide reliable and scalable applications and systems that meet the organization's objectives and requirements. The Architect is familiar with a variety of application and database technologies, environments, concepts, methodologies, practices, and procedures.
- The candidate must have significant, hands-on technical experience and expertise with Azure, Azure Delta Lake, Azure Databricks, Azure Data Factory, Pipelines, Apache Spark, and Python.
- Significant, hands-on technical experience and expertise with the design, implementation and maintenance of business intelligence and data warehouse solutions, with expertise in using SQL Server and Azure Synapse.
- Experience as an Azure DevOps CI/CD Pipeline Release Manager who can design, implement, and maintain robust and scalable CI/CD pipelines; automate the build, test, and deployment processes for various applications and services; and troubleshoot and resolve pipeline issues and bottlenecks. Experience with Monorepo-based CI/CD pipelines.
- Experience analyzing and translating business requirements and use cases into optimized designs and developing sound solutions.
- Create and maintain technical documentation, diagrams, flowcharts, instructions, manuals, test plans, and test cases. Follow established SDLC best practices, document code, and participate in peer code reviews.
- Ability to balance work across multiple projects with minimal or no direct supervision; good organizational skills.
- Experience working in the public health or healthcare industry with various health data sets.