£450 per day
Undetermined
Remote
England, UK
Summary: We're looking for a DevOps Data Engineer for a fully remote contract position focused on developing an automated data platform for real-time data ingestion, validation, and reporting. The role requires strong expertise in Azure-based data engineering tools, Python scripting, and scalable pipeline design. The ideal candidate will contribute to the development and maintenance of ETL/ELT pipelines and automated reporting integrations. This position emphasizes collaboration and quality within an agile environment.
Salary (Rate): £450.00 per day
City: undetermined
Country: UK
Working Arrangements: remote
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
Overview:
We're seeking a skilled DevOps Data Engineer for a fully remote contract role. You'll support the development of an automated data platform enabling real-time ingestion, validation, and reporting. The ideal candidate will have strong experience with Azure-based data engineering tools, Python scripting, and scalable pipeline design.
Key Responsibilities:
- Develop and maintain ETL/ELT pipelines for ingesting benchmark data (e.g., CSVs from Qualtrics).
- Implement automated data quality checks across pipeline stages (a short sketch follows this list).
- Build event-driven workflows using Azure Data Factory and Databricks.
- Support automated reporting integrations (Power BI and PowerPoint).
- Optimize storage and processing within Azure Data Lake and SQL-based systems.
- Collaborate on data modelling (star/snowflake schemas) with architects and analysts.
- Monitor and troubleshoot data platform components using Azure Monitor.
- Contribute to CI/CD practices and documentation for long-term maintainability.
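As a rough illustration of the ingestion and validation work described above, here is a minimal sketch assuming a pandas-based pipeline stage; the file paths, column names, and checks are hypothetical and stand in for whatever the actual Qualtrics exports contain.

```python
import pandas as pd

# Hypothetical schema for a benchmark CSV export (illustrative only)
REQUIRED_COLUMNS = {"response_id", "survey_id", "completed_at", "score"}

def ingest_benchmark_csv(path: str) -> pd.DataFrame:
    """Read a benchmark CSV and apply basic data quality checks."""
    df = pd.read_csv(path)

    # Check 1: all expected columns are present
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        raise ValueError(f"Missing required columns: {sorted(missing)}")

    # Check 2: key identifiers must not be null
    if df["response_id"].isna().any():
        raise ValueError("Null response_id values found")

    # Check 3: guard against empty or truncated extracts
    if df.empty:
        raise ValueError("Empty extract received")

    # Normalise timestamps before handing off to downstream stages
    df["completed_at"] = pd.to_datetime(df["completed_at"], errors="coerce")
    return df

if __name__ == "__main__":
    validated = ingest_benchmark_csv("qualtrics_export.csv")  # hypothetical file name
    validated.to_parquet("validated/benchmark.parquet", index=False)
```

In practice a stage like this would be orchestrated by Azure Data Factory or a Databricks job rather than run standalone; the point is only to show the shape of the ingestion-plus-validation step.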
Essential Skills:
- Advanced Python scripting and data manipulation.
- Strong SQL for querying and transformation.
- Hands-on with Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure SQL.
- Understanding of data modelling techniques and governance (a star-schema sketch follows this list).
- Experience with Azure Monitor, Key Vault, and managed identities.
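To make the data modelling point concrete, below is a minimal PySpark sketch of the kind that might run in an Azure Databricks notebook, splitting a validated dataset into one dimension table and one fact table; all table and column names are assumptions for illustration, not the platform's actual model.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("star-schema-sketch").getOrCreate()

# Hypothetical validated output from an earlier pipeline stage
responses = spark.read.parquet("validated/benchmark.parquet")

# Dimension table: one row per survey
dim_survey = (
    responses
    .select("survey_id", "survey_name")
    .dropDuplicates(["survey_id"])
)

# Fact table: one row per response, keyed to the dimension by survey_id
fact_response = (
    responses
    .select("response_id", "survey_id", "completed_at", "score")
    .withColumn("completed_date", F.to_date("completed_at"))
)

# Persist both tables for downstream reporting (e.g., Power BI)
dim_survey.write.mode("overwrite").parquet("warehouse/dim_survey")
fact_response.write.mode("overwrite").parquet("warehouse/fact_response")
```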
Desirable:
- Familiarity with AI/ML data patterns (e.g., vector databases, RAG).
- Automated Power BI or PowerPoint reporting experience.
- Exposure to DevOps tools (CI/CD, Git, infrastructure as code).
Environment:
- Agile, delivery-focused culture with rapid feedback loops.
- Strong focus on quality, automation, and cross-functional collaboration.
- High-impact data platform supporting analytics and automation initiatives.
GCS is acting as an Employment Business in relation to this vacancy.