Negotiable
Undetermined
Remote or Illinois
Primary Responsibilities
- Design, develop, and maintain enterprise ETL pipelines using Azure Data Factory (ADF), Informatica PowerCenter, and Python-based frameworks
- Build and optimize scalable data processing solutions using Python, Spark, and Databricks
- Support Medicaid analytics and federal reporting initiatives (e.g., T-MSIS, PERM, MARS, Quality of Care)
- Develop robust data validation, reconciliation, and audit-traceable data pipelines
- Write and optimize SQL and stored procedures across relational platforms such as Snowflake, Oracle, and SQL Server
- Participate in cloud migration and modernization initiatives within Azure-based architectures
- Collaborate with analysts, QA, and reporting teams to ensure data quality, accuracy, and timeliness
- Follow data engineering best practices for performance, reliability, reusability, and security
- Support production operations, incident resolution, and root-cause analysis
- Participate in code reviews, source control, and CI/CD processes using Azure DevOps and GitHub
Required Qualifications
- 5+ years of data engineering experience with a focus on enterprise data warehousing
- 5+ years of hands-on ETL development using Informatica PowerCenter, Azure Data Factory, or similar tools
- 5+ years of Python development for data engineering and automation
- 3+ years of experience with Spark-based processing frameworks (Databricks or equivalent)
- Strong SQL expertise and experience with relational databases (such as Teradata, Snowflake, Oracle, SQL Server)
- Experience with source control and DevOps practices (Azure DevOps, GitHub, CI/CD)
- Bachelor's degree or higher in Computer Science, Engineering, Analytics, or a related field
- Strong analytical, problem-solving, and troubleshooting skills
Preferred Qualifications
- Experience supporting State Medicaid EDW or MMIS analytics environments
- Healthcare or public-sector analytics experience (Medicaid / Medicare preferred)
- Data modeling experience in enterprise data warehouse environments
- Scripting experience (PowerShell, Bash) for automation and orchestration
- Experience designing or consuming APIs (REST) within data platforms
- Familiarity with data quality frameworks, reconciliation, and audit support
- Azure certifications related to data engineering or analytics