Spark-Scala Developer

Posted Today by BlueRose Technologies

Negotiable
Undetermined
Hybrid
London Area, United Kingdom

Summary: The Spark-Scala Developer role is focused on designing and developing large-scale, distributed data processing pipelines using Apache Spark and Scala. The position requires strong programming skills in Scala, along with experience in various Big Data technologies. The developer will work in an Agile environment, collaborating with multiple stakeholders to deliver technical solutions that meet project quality standards. This is a 12-month contract based in London with a hybrid working arrangement.

Key Responsibilities:

  • Write clean, maintainable, and efficient Scala code following best practices.
  • Design and develop large-scale, distributed data processing pipelines using Apache Spark (a minimal sketch follows this list).
  • Utilize Spark Core, Spark SQL, and Spark Streaming effectively.
  • Work with Hadoop, HDFS, Hive, and other Big Data technologies.
  • Apply data warehousing and ETL concepts and techniques.
  • Perform SQL/NoSQL operations and understand database concepts.
  • Engage in project development life cycle activities and maintenance/support projects.
  • Participate in Agile practices including daily standups, sprint planning, and retrospectives.
  • Translate project requirements into technical solutions that meet quality standards.
  • Collaborate with diverse stakeholders to identify, troubleshoot, and resolve data issues.
  • Stay updated with new technologies and industry trends in development.
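
For illustration only, a minimal sketch of the kind of batch pipeline described above, written with Spark's DataFrame/Spark SQL API in Scala. The paths, table, and column names (trades, amount, trade_date, desk) are hypothetical assumptions, not taken from the posting:

    import org.apache.spark.sql.{SparkSession, functions => F}

    object DailyTradesEtl {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("daily-trades-etl")
          .enableHiveSupport()              // assumes a Hive metastore is available
          .getOrCreate()

        // Extract: read raw CSV files landed on HDFS (hypothetical path).
        val raw = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("hdfs:///data/raw/trades/")

        // Transform: filter and aggregate with the DataFrame API.
        val daily = raw
          .filter(F.col("amount").isNotNull)
          .groupBy(F.col("trade_date"), F.col("desk"))
          .agg(F.sum("amount").as("total_amount"), F.count(F.lit(1)).as("trade_count"))

        // Load: write partitioned Parquet for downstream consumers.
        daily.write
          .mode("overwrite")
          .partitionBy("trade_date")
          .parquet("hdfs:///data/curated/daily_trades/")

        spark.stop()
      }
    }

The same structure runs unchanged from local testing to a YARN or Kubernetes cluster via spark-submit.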

Key Skills:

  • Strong knowledge of Scala programming language.
  • Experience with Apache Spark and related technologies.
  • Familiarity with Hadoop, HDFS, Hive, and other Big Data technologies (see the Hive query sketch after this list).
  • Understanding of Data Structures and their usage.
  • Knowledge of Data warehousing and ETL concepts.
  • Experience with SQL/NoSQL operations.
  • UNIX shell scripting skills.
  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication skills.
  • Experience in a global delivery environment.
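
As a sketch of the Hive and SQL skills listed above, a Spark job can query Hive-managed tables directly through Spark SQL. The database and table (curated.daily_trades) and its columns are hypothetical, chosen only for illustration:

    import org.apache.spark.sql.SparkSession

    object TopDesksReport {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("top-desks-report")
          .enableHiveSupport()   // reads table metadata from the Hive metastore
          .getOrCreate()

        // Query a hypothetical Hive table for yesterday's top desks by volume.
        val topDesks = spark.sql(
          """SELECT desk, SUM(total_amount) AS total_amount
            |FROM curated.daily_trades
            |WHERE trade_date = date_sub(current_date(), 1)
            |GROUP BY desk
            |ORDER BY total_amount DESC
            |LIMIT 10""".stripMargin)

        topDesks.show(truncate = false)
        spark.stop()
      }
    }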

Salary (Rate): undetermined

City: London

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Job Title: Spark-Scala Developer

Location: London, UK (hybrid, 3 days a week)

Job Type: 12 Months Contract

Must-have skills: Spark & Scala

Nice-to-have skills: Spark Streaming, Hadoop, Hive, SQL, Sqoop, Impala

Detailed Job Description:

  • Experience and strong knowledge of the Scala programming language.
  • Able to write clean, maintainable, and efficient Scala code following best practices.
  • Good knowledge of the fundamental data structures and their usage.
  • Experience in designing and developing large-scale, distributed data processing pipelines using Apache Spark and related technologies.
  • Expertise in Spark Core, Spark SQL, and Spark Streaming (a streaming sketch follows this list).
  • Experience with Hadoop, HDFS, Hive, and other Big Data technologies.
  • Familiarity with data warehousing and ETL concepts and techniques.
  • Expertise in database concepts and SQL/NoSQL operations.
  • UNIX shell scripting is an added advantage for scheduling and running application jobs.
  • Experience with project development life cycle activities and maintenance/support projects.
  • Work in an Agile environment, participating in daily scrum standups, sprint planning, reviews, and retrospectives.
  • Understand project requirements and translate them into technical solutions that meet the project quality standards.
  • Ability to work in a team in a diverse, multi-stakeholder environment and collaborate with upstream/downstream functional teams to identify, troubleshoot, and resolve data issues.
  • Strong problem-solving and analytical skills.
  • Excellent verbal and written communication skills.
  • Experience and desire to work in a global delivery environment.
  • Stay up to date with new technologies and industry trends in development.
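
To illustrate the Spark Streaming expertise mentioned above, a minimal Structured Streaming sketch in Scala. It assumes the spark-sql-kafka connector is on the classpath; the broker address, topic name, and JSON field are hypothetical:

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object TradeStreamJob {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("trade-stream")
          .getOrCreate()
        import spark.implicits._

        // Read a stream from a hypothetical Kafka topic.
        val raw = spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "trades")
          .load()

        // Parse the message value and count events per desk in 1-minute windows.
        val counts = raw
          .selectExpr("CAST(value AS STRING) AS json", "timestamp")
          .select(get_json_object($"json", "$.desk").as("desk"), $"timestamp")
          .withWatermark("timestamp", "5 minutes")
          .groupBy(window($"timestamp", "1 minute"), $"desk")
          .count()

        // Console sink used here for brevity; production jobs would write to
        // a durable sink such as Kafka, HDFS, or a Hive table.
        val query = counts.writeStream
          .outputMode("update")
          .format("console")
          .option("truncate", "false")
          .start()

        query.awaitTermination()
      }
    }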