Senior Data Engineer

Posted 7 days ago by Ubique Systems

Negotiable
Glasgow, Scotland, United Kingdom

Summary: The Senior Data Engineer role involves designing and implementing efficient ETL processes using Python and Databricks, while collaborating with cross-functional teams to meet data requirements. The position requires ownership of the entire engineering lifecycle, from data extraction to loading, ensuring accuracy and performance. The engineer will also be responsible for maintaining documentation and participating in agile practices such as sprint planning and code reviews.

Key Responsibilities:

  • Collaborating with cross-functional teams to understand data requirements and design efficient ETL processes.
  • Developing and deploying ETL jobs that extract and transform data from various sources.
  • Taking ownership of the end-to-end engineering lifecycle, ensuring accuracy and consistency.
  • Creating and managing data pipelines with proper error handling and performance optimizations.
  • Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
  • Conducting code reviews and enforcing coding standards.
  • Developing and maintaining tooling and automation scripts.
  • Implementing testing methodologies to ensure reliability of ETL processes.
  • Utilizing REST APIs and other integration techniques to connect data sources.
  • Maintaining documentation, including data flow diagrams and technical specifications.

Key Skills:

  • Proficiency in Python programming.
  • Hands-on experience with Databricks for scalable data pipelines.
  • Proficiency in Snowflake or similar cloud-based data warehousing solutions.
  • Solid understanding of ETL principles and data integration best practices.
  • Familiarity with agile methodologies.
  • Experience with code versioning tools (e.g., Git).
  • Meticulous attention to detail and problem-solving skills.
  • Knowledge of Linux operating systems.
  • Familiarity with REST APIs and integration techniques.
  • Familiarity with data visualization tools (e.g., Power BI) is a plus.
  • Background in database administration or performance tuning is a plus.
  • Familiarity with data orchestration tools, such as Apache Airflow, is a plus.
  • Previous exposure to big data technologies (e.g., Hadoop, Spark) is a plus.
  • Experience with ServiceNow integration is a plus.

Salary (Rate): undetermined

City: Glasgow

Country: United Kingdom

Working Arrangements: undetermined

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Role Responsibilities

You will be responsible for:

  • Collaborating with cross-functional teams to understand data requirements and design efficient, scalable, and reliable ETL processes using Python and Databricks.
  • Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs.
  • Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency.
  • Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimizations.
  • Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
  • Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality.
  • Developing and maintaining tooling and automation scripts to streamline repetitive tasks.
  • Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes.
  • Utilizing REST APIs and other integration techniques to connect various data sources.
  • Maintaining documentation, including data flow diagrams, technical specifications, and processes.

You Have:

  • Proficiency in Python programming, including experience in writing efficient and maintainable code.
  • Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines.
  • Proficiency in working with Snowflake or similar cloud-based data warehousing solutions.
  • Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices.
  • Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment.
  • Experience with code versioning tools (e.g., Git).
  • Meticulous attention to detail and a passion for problem solving.
  • Knowledge of Linux operating systems.
  • Familiarity with REST APIs and integration techniques.

You might also have:

  • Familiarity with data visualization tools and libraries (e.g., Power BI)
  • Background in database administration or performance tuning
  • Familiarity with data orchestration tools, such as Apache Airflow
  • Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing
  • Experience with ServiceNow integration