
Azure Databricks Data Engineer - 100% Remote - Full-time Permanent Hire
Posted 1 day ago
Summary: The Azure Databricks Data Engineer role is a full-time permanent position focused on data engineering with a strong emphasis on Databricks and cloud technologies. The ideal candidate will have over 14 years of experience and will work 100% remotely. Key responsibilities include developing ETL/ELT pipelines and optimizing data workflows. Strong analytical and communication skills are essential for collaborating with diverse teams.
Key Responsibilities:
- Develop and optimize ETL/ELT pipelines using Databricks and cloud technologies.
- Perform data modeling and automation of data workflows.
- Tune performance of data pipelines and workflows.
- Utilize Azure cloud components such as Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Work with Delta Lake, Delta Live Tables, Autoloader, and Unity Catalog (see the sketch after this list).
- Collaborate with teams to analyze and disseminate data effectively.
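As an illustration of this stack, here is a minimal sketch of the kind of ETL pipeline the responsibilities describe: Databricks Autoloader incrementally ingesting files from Azure Data Lake into a Delta table. It assumes a PySpark notebook on Databricks; the paths, schema location, and table name are hypothetical, not taken from the posting.

    from pyspark.sql import functions as F

    # Incrementally ingest new JSON files from a landing zone (Autoloader source).
    # `spark` is predefined in a Databricks notebook; all paths here are hypothetical.
    raw = (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", "/mnt/lake/_schemas/claims")
        .load("/mnt/lake/landing/claims")
    )

    # Light transformation, typical of a bronze-to-silver ELT hop.
    cleaned = raw.withColumn("ingested_at", F.current_timestamp())

    # Write to a Delta table; the checkpoint makes the stream restartable, and
    # availableNow processes the backlog then stops (batch-style incremental run).
    (
        cleaned.writeStream.format("delta")
        .option("checkpointLocation", "/mnt/lake/_checkpoints/claims")
        .trigger(availableNow=True)
        .toTable("main.insurance.silver_claims")  # hypothetical Unity Catalog name
    )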
Key Skills:
- 7-12 years of experience in a Data Engineering role.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong proficiency in PySpark, Python, and SQL.
- Experience with data modeling, ETL processes, and data warehousing.
- Knowledge of the insurance industry and its data requirements is preferred.
- Excellent communication and problem-solving skills.
- Ability to work under tight deadlines.
Salary (Rate): Negotiable
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
Job Title: Azure Databricks Data Engineer
Duration: Full-time Permanent Hire
Position: 100% Remote
Note:
We are looking for a candidate with 14+ years of experience as a Data Engineer who can join as a full-time permanent hire.
Candidate Profile:
- 7-12 years of experience in a Data Engineering role working with Databricks and cloud technologies.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Strong proficiency in PySpark, Python, and SQL.
- Strong experience in data modeling, ETL/ELT pipeline development, and automation.
- Hands-on experience with performance tuning of data pipelines and workflows.
- Proficient in working with Azure cloud components such as Azure Data Factory, Azure Databricks, and Azure Data Lake.
- Experience with data modeling, ETL processes, Delta Lake, and data warehousing.
- Experience with Delta Live Tables, Autoloader, and Unity Catalog (illustrated in the sketch after this list).
- Preferred: Knowledge of the insurance industry and its data requirements.
- Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Excellent communication and problem-solving skills to work effectively with diverse teams.
- Ability to work under tight deadlines.
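As a companion to the sketch above, here is a minimal, hypothetical Delta Live Tables pipeline combining Autoloader ingestion with a data-quality expectation, illustrating the skills named in this profile. The table names, landing path, and insurance-flavored policy_id column are assumptions for illustration, not the employer's actual pipeline.

    import dlt
    from pyspark.sql import functions as F

    # Bronze: raw policy files ingested with Autoloader. `spark` is provided by
    # the Delta Live Tables runtime; the landing path is hypothetical.
    @dlt.table(comment="Raw policy records ingested with Autoloader")
    def bronze_policies():
        return (
            spark.readStream.format("cloudFiles")
            .option("cloudFiles.format", "csv")
            .option("cloudFiles.inferColumnTypes", "true")
            .load("/mnt/lake/landing/policies")
        )

    # Silver: a basic quality gate drops rows missing policy_id
    # (a hypothetical column used for illustration).
    @dlt.table(comment="Cleaned policies with a basic quality gate")
    @dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")
    def silver_policies():
        return dlt.read_stream("bronze_policies").withColumn(
            "processed_at", F.current_timestamp()
        )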