Summary: The Sr Python Data Engineer will lead the migration of enterprise batch processing workflows from AutoSys to Airflow, working closely with the Batch and Core Online teams to ensure a seamless transition while maintaining performance integrity and operational continuity. Key responsibilities include job analysis, conversion, validation, and deployment, along with collaboration with application owners and ongoing monitoring of migration progress. The position requires strong technical skills in Apache Airflow, Python, and AWS environments.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
We are seeking skilled engineers to lead the migration of enterprise batch processing workflows from AutoSys to Airflow. This role will support both the Batch and Core Online teams, ensuring seamless transition, performance integrity, and operational continuity across critical systems.
- Lead the end-to-end migration of production AutoSys jobs to Airflow, including job analysis, conversion, validation, and deployment.
- Collaborate with application owners to baseline existing AutoSys jobs and define Airflow DAGs aligned with business logic (a conversion sketch follows this list).
- Utilize and refine the internal conversion tool for AutoSys-to-Airflow job translation, ensuring accuracy and maintainability.
- Coordinate with the TU teams to access code repositories, AWS environments, and test data for regression and performance testing.
- Support the development and execution of test suites for TUBS, Phoenix, and EIR applications (a DAG validation sketch follows the Required Skills list).
- Monitor and report on migration progress, risks, and blockers, ensuring timely delivery and stakeholder alignment.
- Participate in code freeze planning and change control processes during the migration lifecycle.
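To make the target shape concrete, here is a minimal sketch of what a converted job could look like as an Airflow DAG (2.4+), assuming a hypothetical AutoSys box containing two command jobs with a daily start time and a success condition between them. The dag_id, owner, schedule, and script paths are illustrative assumptions, not actual TUBS, Phoenix, or EIR definitions, and the real mapping would follow whatever conventions the internal conversion tool enforces.

```python
# Minimal sketch (Airflow 2.4+) of a converted job, assuming a hypothetical
# AutoSys command job pair running inside a daily box. All names, the schedule,
# and the script paths are illustrative assumptions.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

default_args = {
    "owner": "batch-team",              # maps from the AutoSys 'owner' attribute
    "retries": 1,                       # maps from 'n_retrys'
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="tubs_daily_load",           # assumed convention: one DAG per AutoSys box
    description="Converted from a hypothetical AutoSys box (illustrative)",
    schedule="30 2 * * *",              # maps from 'start_times: "02:30"'
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args=default_args,
    tags=["autosys-migration"],
) as dag:
    # Each AutoSys command job becomes a task; its 'command' maps to bash_command.
    # The trailing space stops Jinja from treating the .sh path as a template file.
    extract = BashOperator(
        task_id="extract_feed",
        bash_command="/opt/batch/bin/extract_feed.sh ",
    )
    load = BashOperator(
        task_id="load_warehouse",
        bash_command="/opt/batch/bin/load_warehouse.sh ",
    )

    # An AutoSys 'condition: success(EXTRACT_FEED)' becomes an explicit dependency.
    extract >> load
```

In practice, each AutoSys box would typically map to one DAG and each command job within it to a task, with conditions expressed as task dependencies.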
Required Skills:
- Strong experience with Apache Airflow and AutoSys job scheduling.
- Proficiency in Python for DAG development and scripting.
- Familiarity with AWS environments and CI/CD pipelines.
- Experience with batch processing systems and performance testing.
- Excellent collaboration and communication skills across cross-functional teams.
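The regression test suites called out in the responsibilities often start from a DAG integrity check. Below is a minimal sketch of such a check, assuming pytest and a dags/ folder at the repository root; neither detail is taken from the actual TU repositories or the existing TUBS, Phoenix, or EIR test assets.

```python
# Minimal sketch of a DAG validation test, assuming pytest and a dags/ folder;
# both are assumptions rather than details of the actual repositories.
import pytest
from airflow.models import DagBag


@pytest.fixture(scope="session")
def dag_bag():
    # Parse every DAG file once per test session; skip Airflow's bundled examples.
    return DagBag(dag_folder="dags/", include_examples=False)


def test_no_import_errors(dag_bag):
    # Any converted job that fails to parse (bad imports, cycles) shows up here.
    assert dag_bag.import_errors == {}, f"DAG import errors: {dag_bag.import_errors}"


def test_every_dag_defines_tasks(dag_bag):
    for dag_id, dag in dag_bag.dags.items():
        assert dag.tasks, f"{dag_id} defines no tasks"
```

Application-specific assertions (expected schedules, task counts per converted box, SLA settings) would be layered on top per team.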
Preferred Qualifications:
- Prior experience in large-scale job migration or modernization projects.
- Exposure to enterprise data platforms and regulatory environments.
- Knowledge of tools like Git, Linux, and Harness is a plus.
Required Knowledge (in recommended order):
- JSAAS UI (Airflow)
- AutoSys
- Harness (general knowledge)
- Git (general knowledge)
- Python (general knowledge)
- Linux (general knowledge)
- SoapUI (general knowledge)
- SRE1 Server Access (pacman)