Salary: Negotiable
IR35: Outside
Location: Remote
Country: USA
Summary: U.S. Bank is seeking a skilled Java/Apache Flink Engineer to help modernize its data movement and transformation systems, with a focus on migrating legacy applications to a custom-built Apache Flink framework. The role involves hands-on engineering work within a small team, emphasizing real-time stream processing and software engineering best practices. Candidates must have solid Apache Flink experience. The position is remote, but candidates must reside in one of the specified U.S. Bank hub cities.
Key Responsibilities:
- Develop, enhance, and maintain a custom Flink-based CDC framework
- Migrate legacy ETL/CDC pipelines to Flink-based data streams
- Integrate new data sources into existing Flink pipelines
- Collaborate with an FTE engineer and a contractor on tooling and pipeline modernization
- Support CI/CD automation and code quality efforts
- Participate in planning and design discussions for data streaming architecture
Key Skills:
- Apache Flink: solid experience with Flink SQL and/or the DataStream API
- Java: professional experience in Java-based data engineering
- Real-time data pipeline development
- Cloud experience (Azure preferred)
- CI/CD pipeline experience (Azure DevOps preferred)
- GitHub for version control
- Strong engineering fundamentals
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Duration: 12+ Months
Must-Have: Hands-on experience with Apache Flink
- Apache Kafka: experience working with Kafka topics and stream management
- Python expertise, especially if Java experience is limited but Python skills are very strong
- Experience building AI-based solutions or agents (e.g., schema generation or metadata classification using LLMs like Azure OpenAI)
- Interest or experience with metadata management, schema registry, and Kafka topic lineage tracking
- Must be currently located in one of the listed hub cities (Minneapolis, Atlanta, Chicago, San Francisco, Dallas); no exceptions
- This is a remote role, but candidates must be local to one of the hub cities
- Flink experience is mandatory; candidates without it will not be considered
- The team is building a Center of Excellence (CoE) around CDC pipelines using Flink, offering an opportunity to contribute to enterprise-scale systems