Negotiable
Outside
Remote
USA
Summary: The client is seeking experienced Azure Data Engineers and Architects to design, develop, and maintain data architectural constructs, particularly focusing on Azure Data Factory, Azure Data Lake, and Azure Databricks. Candidates should possess extensive experience in data warehousing, cloud data architecture, and hands-on expertise with various Azure technologies. The role emphasizes the ability to create complex ETL data pipelines and develop data solutions for analytical applications. This position is remote and classified as outside IR35.
Key Responsibilities:
- Design, develop, and maintain core data architectural constructs, with a special focus on the Azure Data Factory, Azure Data Lake, and Azure Databricks platforms.
Key Skills:
- Bachelor's or master's degree in Computer Science, Computer Engineering, or Information Systems.
- Minimum of 10-15 years of experience in Data Warehousing, Data Engineering, and Analytics.
- Minimum of 8 years of cloud data architecture experience, with at least 4 years on Azure.
- 5-7 years of data experience, including delivery of one large program end to end.
- Proven experience as a Data Architect designing and implementing new data solutions on the Microsoft Azure platform.
- A minimum of 5 years of experience designing and developing data solutions in both Azure and Databricks.
- Hands-on experience in Azure Data Factory, Azure Synapse, Azure Databricks, Azure SQL Database, Azure Functions, and Logic Apps.
- High proficiency in developing complex ETL data pipelines using Databricks with PySpark, and in managing and maintaining Data Lake and Lakehouse environments, including CDC and SCD handling (an illustrative SCD sketch follows this list).
- Excellent understanding of data warehousing concepts and technologies (e.g., dimensional modeling, star schema, data vault, medallion architecture; an illustrative medallion sketch also follows this list).
- Experience with designing and developing data solutions using technologies such as Databricks, Snowflake, Dremio, Spark, Hadoop or Kafka.
- Strong proficiency in SQL and experience with relational databases (e.g., Oracle, PostgreSQL, SQL Server).
- Expert knowledge of ETL/ELT tools and frameworks, covering batch, real-time, and near-real-time loads.
- Expertise in programming with Python and Spark.
- Familiarity with designing data products for consumption by .NET applications, APIs, or analytical solutions such as Power BI.
- Proven ability to leverage components such as Azure Automation, Azure PowerShell scripting, AzCopy, Azure DevOps, and ARM templates.
- Ability to visualize and create high-level models (rigorous, information-rich diagrams).
- Experience with Unity Catalog is preferred.
- Experience integrating with Power BI and defining the approach for data self-service; proven Power BI experience, including semantic model development and DAX.
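For candidates gauging the expected depth, the following is a minimal, illustrative sketch of an SCD Type 2 upsert on Databricks using PySpark and Delta Lake. All names here (dim_customer, customer_id, address, the staging path) are hypothetical assumptions for illustration, not details of the client's environment.

```python
# Minimal sketch of an SCD Type 2 upsert with PySpark and Delta Lake.
# All names (dim_customer, customer_id, address, the staging path) are
# hypothetical; a real pipeline would parameterize them.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# Incoming CDC batch: one row per new or changed customer (assumed layout).
updates = spark.read.format("delta").load("/lake/staging/customer_changes")

dim = DeltaTable.forName(spark, "dim_customer")

# New versions of customers whose tracked attribute changed. They get a NULL
# merge key so the MERGE below routes them to the INSERT branch.
new_versions = (
    updates.alias("u")
    .join(dim.toDF().alias("d"), "customer_id")
    .where("d.is_current = true AND u.address <> d.address")
    .selectExpr("CAST(NULL AS BIGINT) AS merge_key", "u.*")  # cast matches key type
)

# Keyed copies expire their matching current rows; brand-new customers
# simply fall through to the INSERT branch.
staged = updates.selectExpr("customer_id AS merge_key", "*").unionByName(new_versions)

(
    dim.alias("d")
    .merge(staged.alias("s"), "d.customer_id = s.merge_key AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.address <> s.address",  # only expire rows that actually changed
        set={"is_current": "false", "end_date": "current_date()"},
    )
    .whenNotMatchedInsert(
        values={
            "customer_id": "s.customer_id",
            "address": "s.address",
            "start_date": "current_date()",
            "end_date": "NULL",
            "is_current": "true",
        }
    )
    .execute()
)
```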
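Likewise, a minimal batch sketch of the medallion (bronze/silver/gold) pattern named above; every path, table, and column here (orders, order_id, amount) is an illustrative assumption.

```python
# Minimal batch sketch of a medallion (bronze/silver/gold) flow on Databricks.
# Assumes the bronze/silver/gold schemas already exist; all names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land raw data as-is, with ingest metadata for auditability.
raw = spark.read.json("/landing/orders/")
(
    raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta")
    .mode("append")
    .saveAsTable("bronze.orders")
)

# Silver: cleanse, type, and deduplicate.
silver = (
    spark.table("bronze.orders")
    .filter(F.col("order_id").isNotNull())
    .dropDuplicates(["order_id"])
    .withColumn("order_ts", F.to_timestamp("order_ts"))
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: business-level aggregate ready for Power BI or other consumers.
gold = (
    spark.table("silver.orders")
    .groupBy(F.to_date("order_ts").alias("order_date"))
    .agg(F.sum("amount").alias("daily_revenue"))
)
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")
```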
Salary (Rate): undetermined
City: undetermined
Country: USA
Working Arrangements: remote
IR35 Status: outside IR35
Seniority Level: undetermined
Industry: IT
If interested, please reply with the requested details and an updated CV.