Glasgow, Scotland, United Kingdom
Summary: The AWS Data Engineer - UI Development role focuses on delivering production-grade data engineering solutions with a strong emphasis on SQL and programming languages, particularly Python and PySpark. The position requires experience with orchestration tools and the ability to design and optimize complex data pipelines while collaborating with data scientists and stakeholders. The role also involves contributing to modernization efforts by migrating workloads from AWS to Databricks. This position is based in Glasgow with a hybrid working arrangement.
Key Responsibilities:
- Deliver production-grade data engineering solutions.
- Apply strong SQL and programming skills (Python/PySpark preferred) in day-to-day development.
- Utilize orchestration tools such as Airflow and AWS Step Functions.
- Design and optimize complex data pipelines independently.
- Develop and maintain data warehouses and data lakes.
- Collaborate with data scientists to build and deploy machine learning models.
- Translate business requirements into technical design and implementation.
- Contribute to modernization efforts by migrating workloads from AWS to Databricks.
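As a rough illustration of the pipeline work described above, here is a minimal extract-transform-load sketch in plain Python. It uses an in-memory sqlite3 database as a stand-in for a data warehouse, and all table, column, and function names are hypothetical; a production pipeline of the kind this role describes would use PySpark against a warehouse or lake instead.

```python
import sqlite3

def run_pipeline(raw_rows):
    """Hypothetical ETL step: load raw events, aggregate them per user.

    raw_rows: list of (user_id, amount_pence) tuples.
    Returns a dict mapping user_id to total amount in pounds.
    """
    conn = sqlite3.connect(":memory:")  # stands in for a real warehouse
    conn.execute("CREATE TABLE raw_events (user_id INTEGER, amount_pence INTEGER)")
    conn.executemany("INSERT INTO raw_events VALUES (?, ?)", raw_rows)

    # Transform + load: aggregate per user, converting pence to pounds.
    conn.execute("""
        CREATE TABLE user_totals AS
        SELECT user_id, SUM(amount_pence) / 100.0 AS total_pounds
        FROM raw_events
        GROUP BY user_id
    """)
    return dict(conn.execute("SELECT user_id, total_pounds FROM user_totals"))

totals = run_pipeline([(1, 250), (1, 750), (2, 100)])
print(totals)  # {1: 10.0, 2: 1.0}
```

The same shape (ingest raw data, apply SQL transforms, persist curated tables) scales up in PySpark, where the orchestration of such steps would typically be handled by a tool like Airflow or AWS Step Functions.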
Key Skills:
- AWS
- Python
- PySpark
- Airflow
- Data lakes
- Banking
- Financial Service
- Databricks
Salary (Rate): undetermined
City: Glasgow
Country: United Kingdom
Working Arrangements: hybrid
IR35 Status: undetermined
Seniority Level: undetermined
Industry: IT
I am hiring for an AWS Data Engineer - UI Development position.
Location: Glasgow - Hybrid / 2-3 days Per week in Office
Requirements and responsibilities:
- Strong hands-on experience delivering production-grade data engineering solutions.
- Proficiency in SQL and programming languages (Python/PySpark strongly preferred).
- Experience with orchestration tools such as Airflow, AWS Step Functions, etc.
- Demonstrated ability to independently design and optimize complex data pipelines.
- Develop and maintain data warehouses and data lakes aligned with volume, velocity, and security requirements.
- Collaborate with data scientists to build and deploy machine learning models.
- Engage with stakeholders to translate business requirements into technical design and implementation.
- Contribute to modernization efforts by migrating workloads from AWS to Databricks.
Key Skills: AWS / Python / PySpark / Airflow / Data lakes / Banking / Financial Service / Databricks