Summary: The Senior Databricks Developer role requires extensive experience in Java and Spark, with a focus on developing and optimizing data processing solutions. Candidates must hold an active IRS MBI clearance and have a strong background in big data technologies and programming. The position is remote and emphasizes collaboration in Agile environments. A bachelor's degree in a related field and more than eight years of professional experience are essential.
Key Responsibilities:
- Develop and optimize data processing solutions using Java and Spark.
- Build ETL pipelines and ensure data cleansing, transformation, and enrichment.
- Utilize Spark UI for troubleshooting and performance tuning.
- Integrate with big data technologies such as Kafka, HDFS, and Azure Data Lake.
- Collaborate in Agile/Scrum environments and maintain strong documentation practices.
Key Skills:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 8+ years of professional experience applying the relevant technical skills.
- Strong expertise in Java 8 or higher and functional programming.
- Proficient in Apache Spark Core, Spark SQL, and DataFrame/Dataset APIs.
- Experience with big data ecosystem tools like HDFS, Hive, and Kafka.
- Familiarity with CI/CD tools and deployment technologies like Kubernetes.
- Proficient in version control with Git and build tools like Maven or Gradle.
- Experience with unit testing frameworks such as JUnit or TestNG.
- Strong documentation and troubleshooting skills.
Salary (Rate): undetermined
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
TITLE- Senior Databricks Developer with Java and Spark Experience
LOCATION- remote
DURATION- long term
INTERVIEW- video
We need candidates with an active IRS MBI clearance.
Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 8+ years of professional experience demonstrating the technical skills and responsibilities listed below:
Programming Language Proficiency
- Strong expertise in Java 8 or higher
- Experience with functional programming (Streams API, Lambdas)
- Familiarity with object-oriented design patterns and best practices
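As a rough illustration of the functional-style Java this bullet list describes (Streams API, lambdas, method references), consider the following self-contained sketch; the class name and data are hypothetical and not part of the posting:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamDemo {
    // Group non-empty words by length using Streams, lambdas, and
    // method references (Java 8+).
    static Map<Integer, List<String>> groupByLength(List<String> words) {
        return words.stream()
                .filter(w -> !w.isEmpty())                // cleanse: drop empty entries
                .map(String::toLowerCase)                 // normalize case
                .collect(Collectors.groupingBy(String::length));
    }

    public static void main(String[] args) {
        System.out.println(groupByLength(List.of("Spark", "Java", "ETL", "")));
    }
}
```

The same filter/map/collect shape carries over directly to Spark's Dataset API, which is one reason fluency with Streams is commonly expected alongside Spark.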
Apache Spark
- Proficient in Spark Core, Spark SQL, and DataFrame/Dataset APIs
- Understanding of RDDs and when to use them
- Experience with Spark Streaming or Structured Streaming
- Skilled in performance tuning and Spark job optimization
- Ability to use Spark UI for troubleshooting stages and tasks
Big Data Ecosystem
- Familiarity with HDFS, Hive, or HBase
- Experience integrating with Kafka, S3, or Azure Data Lake
- Comfort with Parquet, Avro, or ORC file formats
Data Processing and ETL
- Strong understanding of batch and real-time data processing paradigms
- Experience building ETL pipelines with Spark
- Proficient in data cleansing, transformation, and enrichment
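The cleanse/transform/enrich steps above would normally run on Spark DataFrames, but the same pipeline shape can be sketched in plain Java with the Streams API; the record type, country normalization, and 10% US surcharge here are hypothetical examples, not requirements from the posting:

```java
import java.util.List;
import java.util.stream.Collectors;

public class EtlSketch {
    // Minimal immutable record type for the sketch (Java 8 compatible).
    static final class Rec {
        final String id; final String country; final double amount;
        Rec(String id, String country, double amount) {
            this.id = id; this.country = country; this.amount = amount;
        }
    }

    // Cleanse (drop blank ids), transform (normalize country codes),
    // enrich (apply a hypothetical 10% surcharge to US records).
    static List<Rec> pipeline(List<Rec> raw) {
        return raw.stream()
                .filter(r -> r.id != null && !r.id.trim().isEmpty())
                .map(r -> new Rec(r.id, r.country.trim().toUpperCase(), r.amount))
                .map(r -> r.country.equals("US")
                        ? new Rec(r.id, r.country, r.amount * 1.10)
                        : r)
                .collect(Collectors.toList());
    }
}
```

In a real Spark job each stage would be a `filter`/`withColumn` transformation on a DataFrame so the work distributes across the cluster, but the logical decomposition is the same.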
DevOps / Deployment
- Experience with YARN, Kubernetes, or EMR for Spark deployment
- Familiarity with CI/CD tools like Jenkins or GitHub Actions
- Monitoring experience with Grafana, Prometheus, Datadog, or Spark UI logs
Version Control & Build Tools
- Proficient in Git
- Experience with Maven or Gradle
Testing
- Unit testing with JUnit or TestNG
- Experience with Mockito or similar mocking frameworks
- Data validation and regression testing for Spark jobs
Soft Skills / Engineering Practices
- Experience working in Agile/Scrum environments
- Strong documentation skills (Markdown, Confluence, etc.)
- Ability to debug and troubleshoot production issues effectively