Python Architect

Posted 1 day ago by Magicforce

Negotiable
Undetermined
Remote

Summary: The Python Architect will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Key Responsibilities:

  • Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.
  • Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability.
  • Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
  • Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.
  • Implement secure coding best practices and design patterns throughout the development lifecycle.
  • Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
  • Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
  • Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
  • Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.
  • Cross-train team members outside the project team (e.g., operations support) to ensure full knowledge coverage.
  • Minimum of 7 years of overall IT experience.
  • Experienced in waterfall, iterative, and agile methodologies.

Key Skills:

  • Expertise in Python, PySpark, and Airflow.
  • Experience with CI/CD processes and tools such as GitHub and Docker.
  • Strong understanding of data engineering principles and practices.
  • Knowledge of test-driven development (TDD) and automation of tests.
  • Ability to implement secure coding practices.
  • Experience in collaborating with cross-functional teams.
  • Strong documentation skills.
  • Mentorship and leadership abilities.
  • Problem-solving skills for troubleshooting technical issues.
  • Minimum of 7 years of overall IT experience.
  • Familiarity with waterfall, iterative, and agile methodologies.

Salary (Rate): undetermined

City: undetermined

Country: undetermined

Working Arrangements: remote

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

The Python Architect will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Key Responsibilities
1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads (see the PySpark sketch at the end of this description).
2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (see the Airflow sketch at the end of this description).
3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.
4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines (see the pytest sketch at the end of this description).
5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.
6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.
7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.
8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.
9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.

10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) to ensure full knowledge coverage.

Includes all of the above skills, plus the following:
  • Minimum of 7 years of overall IT experience.
  • Experienced in waterfall, iterative, and agile methodologies.
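
For illustration only (not part of the employer's brief), a minimal sketch of the kind of PySpark batch pipeline described in item 1; the paths, column names, and aggregation are hypothetical placeholders:

  # Minimal PySpark batch pipeline sketch. Paths, column names, and the
  # aggregation are hypothetical, not taken from the job description.
  from pyspark.sql import SparkSession, functions as F

  def run_pipeline(input_path: str, output_path: str) -> None:
      spark = SparkSession.builder.appName("example-batch-pipeline").getOrCreate()

      # Read raw events, drop rows missing key fields, aggregate per customer per day.
      events = spark.read.parquet(input_path)
      cleaned = events.dropna(subset=["customer_id", "amount"])
      daily_totals = (
          cleaned
          .groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
          .agg(F.sum("amount").alias("daily_amount"))
      )

      # Partition output by date so downstream jobs can prune partitions.
      daily_totals.write.mode("overwrite").partitionBy("event_date").parquet(output_path)
      spark.stop()

  if __name__ == "__main__":
      run_pipeline("s3://example-bucket/raw/events/", "s3://example-bucket/curated/daily_totals/")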
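
Similarly, a minimal Airflow DAG sketch for item 2, assuming Airflow 2.x and hypothetical task callables:

  # Minimal Airflow 2.x DAG sketch. The DAG id, schedule, and task bodies are
  # hypothetical placeholders.
  from datetime import datetime, timedelta

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def extract():
      print("extract raw data")

  def transform():
      print("transform and validate data")

  def load():
      print("load curated data")

  default_args = {
      "owner": "data-engineering",
      "retries": 2,
      "retry_delay": timedelta(minutes=5),
  }

  with DAG(
      dag_id="example_daily_pipeline",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",  # Airflow 2.4+; older releases use schedule_interval
      catchup=False,
      default_args=default_args,
  ) as dag:
      extract_task = PythonOperator(task_id="extract", python_callable=extract)
      transform_task = PythonOperator(task_id="transform", python_callable=transform)
      load_task = PythonOperator(task_id="load", python_callable=load)

      # Linear dependency chain: extract -> transform -> load.
      extract_task >> transform_task >> load_task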
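
Finally, a minimal test-first sketch for item 4, using pytest with a local SparkSession; the transformation under test and its column names are hypothetical:

  # Minimal TDD-style unit test sketch (pytest + local Spark). The function
  # under test and its schema are hypothetical placeholders.
  import pytest
  from pyspark.sql import SparkSession, functions as F

  def add_daily_amount(df):
      """Aggregate raw events into one row per customer per day."""
      return (
          df.groupBy("customer_id", F.to_date("event_ts").alias("event_date"))
            .agg(F.sum("amount").alias("daily_amount"))
      )

  @pytest.fixture(scope="session")
  def spark():
      session = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
      yield session
      session.stop()

  def test_daily_amount_is_summed_per_customer_per_day(spark):
      rows = [
          ("c1", "2024-01-01 09:00:00", 10.0),
          ("c1", "2024-01-01 17:00:00", 5.0),
          ("c2", "2024-01-01 12:00:00", 7.0),
      ]
      df = spark.createDataFrame(rows, ["customer_id", "event_ts", "amount"])

      result = {
          (row["customer_id"], str(row["event_date"])): row["daily_amount"]
          for row in add_daily_amount(df).collect()
      }

      assert result[("c1", "2024-01-01")] == 15.0
      assert result[("c2", "2024-01-01")] == 7.0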