Spark-Scala Developer

Posted Today by Infoplus Technologies UK Limited

Negotiable · Inside IR35 · Hybrid · London Area, United Kingdom

Summary: The Spark-Scala Developer role requires a seasoned professional with at least eight years of experience in Scala programming and Apache Spark technologies. The position involves designing and developing large-scale data processing pipelines and collaborating in an Agile environment. The developer will also translate project requirements into technical solutions and ensure adherence to quality standards. This hybrid role is based in London, UK, with onsite presence required three days a week.

Key Responsibilities:

  • Write clean, maintainable, and efficient Scala code following best practices.
  • Design and develop large-scale, distributed data processing pipelines using Apache Spark.
  • Collaborate with diverse stakeholders to identify, troubleshoot, and resolve data issues.
  • Participate in Agile practices including daily standups, sprint planning, and retrospectives.
  • Translate project requirements into technical solutions that meet quality standards.
  • Stay updated with new technologies and industry trends in development.
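Spark's Dataset and RDD APIs deliberately mirror Scala's standard collection operations, so the kind of pipeline work described above can be sketched without a cluster. The following is a minimal, hedged pure-Scala illustration (no Spark dependency; the `Event` type and its field names are hypothetical, not taken from the employer's codebase):

```scala
// Hypothetical record type: one raw event flowing through a pipeline.
case class Event(userId: String, action: String, durationMs: Long)

object PipelineSketch {
  // The same filter/groupBy/map chain would apply to a Spark Dataset[Event];
  // here plain immutable Scala collections stand in for the distributed data.
  def totalDurationByUser(events: Seq[Event]): Map[String, Long] =
    events
      .filter(_.action == "view")   // keep only the events of interest
      .groupBy(_.userId)            // grouping step (a shuffle, in Spark terms)
      .map { case (user, evts) => user -> evts.map(_.durationMs).sum }

  def main(args: Array[String]): Unit = {
    val events = Seq(
      Event("alice", "view", 120),
      Event("alice", "click", 30),
      Event("bob", "view", 200),
      Event("alice", "view", 80)
    )
    // alice's two "view" events sum to 200; the "click" is filtered out.
    println(totalDurationByUser(events))
  }
}
```

Because the collection API and the Dataset API share these combinators, logic written and unit-tested this way ports to Spark with few changes.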

Key Skills:

  • 8+ years of experience in Scala programming and Apache Spark technologies.
  • Strong knowledge of Data Structures and their usage.
  • Experience with Hadoop, HDFS, Hive, and other Big Data technologies.
  • Familiarity with Data warehousing and ETL concepts.
  • Expertise in SQL/NoSQL operations and database concepts.
  • UNIX shell scripting skills.
  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication skills.
  • Experience in a Global delivery environment.

Salary (Rate): Negotiable

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: inside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

  • Job Title: Spark-Scala Developer
  • Location: London, UK (Hybrid - 3 days a week onsite)
  • Duration: 12 months (Inside IR35)

Job Description:

Must have skills:

  • Spark & Scala

Nice to have skills:

  • Spark Streaming
  • Hadoop
  • Hive
  • SQL
  • Sqoop
  • Impala

Detailed Job Description:

  • At least 8 years of experience and strong knowledge of the Scala programming language.
  • Able to write clean, maintainable, and efficient Scala code following best practices.
  • Good knowledge of fundamental data structures and their usage.
  • At least 8 years of experience designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies.
  • Expertise in Spark Core, Spark SQL, and Spark Streaming.
  • Experience with Hadoop, HDFS, Hive, and other Big Data technologies.
  • Familiarity with data warehousing and ETL concepts and techniques.
  • Expertise in database concepts and SQL/NoSQL operations.
  • UNIX shell scripting for scheduling and running application jobs is an added advantage.
  • At least 8 years of experience in project development life-cycle activities and maintenance/support projects.
  • Ability to work in an Agile environment and participate in daily scrum standups, sprint planning, reviews, and retrospectives.
  • Understand project requirements and translate them into technical solutions that meet the project's quality standards.
  • Ability to work in a team in a diverse, multi-stakeholder environment and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues.
  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication skills.
  • Experience in, and desire to work in, a Global delivery environment.
  • Stay up to date with new technologies and industry trends in development.
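As a rough illustration of the SQL/ETL side of the role: an inner join plus aggregation such as `SELECT c.name, SUM(o.amount) FROM orders o JOIN customers c ON o.customerId = c.id GROUP BY c.name` maps directly onto Scala (and Spark) collection operations. This is a hedged pure-Scala sketch, with hypothetical table and column names, not code from the employer:

```scala
// Hypothetical row types standing in for `customers` and `orders` tables.
case class Customer(id: Int, name: String)
case class Order(customerId: Int, amount: BigDecimal)

object JoinSketch {
  // Equivalent of:
  //   SELECT c.name, SUM(o.amount)
  //   FROM orders o JOIN customers c ON o.customerId = c.id
  //   GROUP BY c.name
  def spendByCustomer(customers: Seq[Customer],
                      orders: Seq[Order]): Map[String, BigDecimal] = {
    // Build side of the join, analogous to a broadcast hash join in Spark.
    val names = customers.map(c => c.id -> c.name).toMap
    orders
      .flatMap(o => names.get(o.customerId).map(n => n -> o.amount)) // inner join: orphan orders dropped
      .groupBy(_._1)
      .map { case (name, rows) => name -> rows.map(_._2).sum }
  }

  def main(args: Array[String]): Unit = {
    val customers = Seq(Customer(1, "Acme"), Customer(2, "Globex"))
    val orders = Seq(Order(1, BigDecimal(10)), Order(1, BigDecimal(5)), Order(3, BigDecimal(99)))
    // Order with customerId 3 has no matching customer, so it is excluded.
    println(spendByCustomer(customers, orders))
  }
}
```

In Spark SQL the same result would come from `orders.join(customers, ...).groupBy("name").agg(sum("amount"))`; the collection-level version above is useful for unit-testing the join logic locally.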