Summary: This role is for a Contract Data Engineer/Backend Engineer based remotely or in Los Angeles, CA, focusing on cloud migration projects. The position requires a strong technical background in data engineering and backend development, particularly with AWS and Google Cloud Platform. Candidates should possess excellent communication skills and a proactive attitude, with a minimum of 3 years of relevant experience. The contract duration is expected to be over 12 months, with a pay rate of $60 per hour on W2.
Key Responsibilities:
- Software development, design, coding, debugging
- Migration of Search reporting tools to Cloud AWS/Google Cloud Platform
- Successful migration to cloud and deprecating on-prem tools
- Ensure parity of Cloud Tools with on-prem Tools supporting all reporting use cases with no business impact
Key Skills:
- AWS EMR, Airflow, dbt, Google Cloud Platform, BigQuery, BQ-ETL, Python, Spark, Java, Git, Unix/Linux, CI/CD tools
- Strong proficiency in Python
- Experience with distributed processing tools like Apache Spark
- Solid SQL skills and experience with both relational and NoSQL databases
- Hands-on with Google Cloud Platform or AWS and relevant cloud APIs
- Familiarity with Git, CI/CD, and testing frameworks
- Strong communication and time management; can operate independently
- 3+ years of experience in data engineering or backend development
Salary (Rate): £48.00 hourly
City: Los Angeles
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: Mid-Level
Industry: IT
We have a contract Data Engineer/Backend Engineer (Remote) role with our client in Los Angeles, CA. Please let me know if you or anyone you know would be interested in this position.
Position Details: Data Engineer/Backend Engineer, Remote (Los Angeles, CA)
- Location: Remote
- Project Duration: 12+ months contract
- Pay Rate: $60/hr on W2
Must-have skills/qualifications (technical, soft skills, certifications, tools):
- Technical: AWS EMR, Airflow, dbt, Google Cloud Platform, BigQuery, BQ-ETL, Python, Spark, Java, Git, Unix/Linux, CI/CD tools
- Soft Skills: Communication, Proactive Attitude, Quick Learner
- Ideal experience level: 3+ years
- Preferred industry background: Internet, Technology, Consumer Products
- Desired personality or work style: Analytical Thinker, Independent and Accountable, Collaborative, Agile and Flexible
- Key attributes or values sought in the candidate: Integrity, Ownership, Good Communication
Minimum Qualifications:
- 3+ years of experience in data engineering or backend development
- Strong proficiency in Python
- Experience with distributed processing tools like Apache Spark
- Solid SQL skills and experience with both relational and NoSQL databases
- Hands-on with Google Cloud Platform or AWS and relevant cloud APIs
- Familiarity with Git, CI/CD, and testing frameworks
- Strong communication and time management; can operate independently
Role Responsibilities and Success Metrics:
- Primary responsibilities (daily/weekly): Software development, design, coding, debugging
- Key projects or initiatives for the role: Migration of Search reporting tools to Cloud AWS/Google Cloud Platform
- Success metrics or KPIs for this role: Successful migration to the cloud and deprecation of on-prem tools
- How is success measured? Parity of cloud tools with on-prem tools, supporting all reporting use cases with no business impact.