Summary: The DataOps Engineer/ETL Azure DevOps Engineer role focuses on ensuring platform efficiency and managing data operations without building models. Key responsibilities include troubleshooting, optimizing data pipelines, and supporting data connectivity while utilizing SQL and Python. The position requires strong technical skills in data engineering tools and a commitment to the company's values. Candidates must have a background in computer science or engineering and relevant experience in data roles.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Role Focus
- Not responsible for building models (unlike Bart/Nathan's team)
- Responsibilities include:
- Ensuring platform efficiency
- Handling production alerts
- Data monitoring and operations
- Troubleshooting and optimization
- SQL and Python scripting
- Supporting data connectivity (e.g., firewalls, third-party integrations)
- Automating and streamlining inefficient manual processes
- Working with IT security and governance
Technical Skills Required
- Must-Have:
- SQL
- Python
- Snowflake (data engineering and database administration)
- Monte Carlo (pipeline monitoring)
- Azure Cloud, Databricks, DBT, Git
- ELT tools (Azure Data Factory, Airflow)
- Preferred:
- Performance tuning
- Cloud cost optimization
- Experience migrating to Snowflake
- Airline industry experience (a plus, not required)
Responsibilities
- Manage and optimize data pipelines
- Troubleshoot failures and performance issues
- Collaborate with teams to ensure data connectivity and governance
- Conduct code reviews and unit testing
- Automate manual processes
- Present findings and solutions to large groups
- Support end-users with data access and training
Qualifications
- Bachelor's in Computer Science/Engineering or equivalent experience
- 5+ years in data roles
- Strong communication and problem-solving skills
- Must be legally eligible to work in the U.S.
- Must pass a background check and drug test
- Available for occasional travel (10%)
Culture Fit and Expectations
- Must align with JetBlue's values: Safety, Caring, Integrity, Passion, Fun
- May need to work flexible hours (nights/weekends)
- Expected to assist with light aircraft cleaning when traveling on JetBlue flights
- Use of ChatGPT or similar tools during the interview process will disqualify candidates
With Regards,
Saroj Duhan | US IT Recruiter
Naztec International Group LLC.
263 N Jog Road, West Palm Beach, FL 33413