Automation Quality Engineer - Test Automation, EHR, Selenium, SQL, Hive, PySpark, Snowflake, Kafka, Python, R, Spark, Scala, BDD, TDD - Remote - Client round F2F interview required


Posted 3 days ago

Negotiable
Outside
Remote
USA

Summary: The Automation Quality Engineer role focuses on test automation using Selenium, SQL, and Python, with a strong emphasis on expert-level Python scripting. The position requires extensive experience with the Hadoop ecosystem and related tools and is fully remote. Candidates must be prepared for a face-to-face interview round with the client.

Key Responsibilities:

  • Develop and implement test automation frameworks using Selenium and Python (see the illustrative sketch after this list).
  • Utilize Hadoop ecosystem components such as Hive, PySpark, and Spark for data processing.
  • Write and optimize complex SQL queries and Hive/Impala queries.
  • Work with AWS services including Lambda and EMR for data pipelines.
  • Collaborate with cross-functional teams to ensure quality in software development.
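
For illustration only, a minimal pytest-plus-Selenium sketch of the kind of automated UI check this role describes; the URL and element ID below are hypothetical placeholders, not details from this posting.

import pytest
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

@pytest.fixture
def driver():
    # Headless Chrome keeps the check runnable in a CI pipeline.
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    drv = webdriver.Chrome(options=options)
    yield drv
    drv.quit()

def test_login_page_loads(driver):
    # Hypothetical EHR login page and element ID, used purely for illustration.
    driver.get("https://example-ehr.test/login")
    username = WebDriverWait(driver, 10).until(
        EC.presence_of_element_located((By.ID, "username"))
    )
    assert username.is_displayed()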

Key Skills:

  • 10+ years of experience in automation testing and Python scripting.
  • Strong knowledge of Hadoop ecosystem components (Hive, PySpark, HDFS, Spark) along with Scala and Snowflake.
  • Experience with streaming technologies such as Kinesis and Kafka (see the sketch after this list).
  • Proficient in writing complex SQL queries.
  • Familiarity with AWS services and data pipeline management.
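
As a hedged illustration of the streaming skills listed above, a minimal Kafka smoke check using the kafka-python client; the topic name and broker address are assumptions, not details from this posting.

from kafka import KafkaConsumer

# Hypothetical topic and broker; a basic QA smoke check that messages arrive.
consumer = KafkaConsumer(
    "hl7-events",                        # assumed topic name
    bootstrap_servers="localhost:9092",  # assumed broker address
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,            # stop iterating after 5 s of silence
)

messages = [record.value for record in consumer]
consumer.close()

assert messages, "expected at least one message on the topic"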

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Automation Quality Engineer - Test Automation, Selenium, SQL, Hive, PySpark, Snowflake, Kafka, Python, R, Spark, Scala, BDD, TDD - Remote - Client round F2F interview required

Automation Quality Engineer - Test Automation, Selenium, SQL, Python, R, Spark, Scala, BDD, TDD

10+ years of experience; expert-level Python scripting is a must.

Required Skills:

  • Experience with Hadoop ecosystem components: Hive, PySpark, HDFS, Spark, Scala, Snowflake
  • Streaming (Kinesis, Kafka)
  • Strong experience in PySpark and Python development
  • Proficient in writing Hive and Impala queries (see the sketch after this list)
  • Ability to write complex SQL queries
  • Experience with AWS Lambda, EMR, clusters, partitions, and data pipelines
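
For illustration, a minimal PySpark sketch of the Hive/SQL data-quality checks implied by the skills above; the database, table, and column names are hypothetical placeholders, not details from this posting.

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("qa-data-validation")
    .enableHiveSupport()  # lets spark.sql() query Hive-managed tables
    .getOrCreate()
)

# A typical data-quality check: count rows that violate a not-null expectation.
bad_rows = spark.sql(
    """
    SELECT COUNT(*) AS bad_rows
    FROM claims_db.encounters   -- hypothetical Hive table
    WHERE patient_id IS NULL
    """
).collect()[0]["bad_rows"]

assert bad_rows == 0, f"{bad_rows} rows with NULL patient_id"

spark.stop()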