Data Engineer

Posted 1 week ago by NLB Services

Negotiable
Undetermined
Hybrid
Glasgow, Scotland, United Kingdom

Summary: The Data Engineer role based in Glasgow involves developing data pipelines and data warehousing solutions, primarily using Python and cloud services like Databricks. The position is contract-based for 6 to 12 months and requires extensive experience in handling complex data environments. The role also emphasizes familiarity with various data tools and technologies, including Snowflake and data visualization libraries.

Key Responsibilities:

  • Develop data pipelines and data warehousing solutions using Python and relevant libraries.
  • Utilize cloud services, particularly Databricks, for scalable data pipeline management.
  • Work with Snowflake or similar cloud-based data warehousing solutions.
  • Handle data development in complex environments with large data volumes.
  • Use code versioning tools such as Git for source control.
  • Work in Linux environments and integrate systems via REST APIs.
  • Utilize data visualization tools and libraries such as Power BI.
  • Engage in database administration and performance tuning.
  • Use data orchestration tools like Apache Airflow.
  • Apply big data technologies (e.g., Hadoop, Spark) for large-scale data processing.
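
As an illustration of the pipeline work described above, a minimal PySpark sketch of a batch step of the kind typically run on Databricks: read raw data, apply a simple transformation, and write a curated Delta table. The paths, column names, and table name are hypothetical.

  # Minimal PySpark batch step (illustrative; paths, columns, and table names are hypothetical)
  from pyspark.sql import SparkSession, functions as F

  spark = SparkSession.builder.appName("orders_curation").getOrCreate()

  # Read raw data landed by an upstream process
  raw = spark.read.parquet("/mnt/raw/orders/")

  # Deduplicate, derive a date column, and drop invalid rows
  curated = (
      raw.dropDuplicates(["order_id"])
         .withColumn("order_date", F.to_date("order_ts"))
         .filter(F.col("amount") > 0)
  )

  # Write as a managed Delta table, the common table format on Databricks
  curated.write.format("delta").mode("overwrite").saveAsTable("analytics.orders_curated")

On Databricks a step like this would typically run as a scheduled job or a Workflows task rather than interactively.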

Key Skills:

  • 4+ years of experience in developing data pipelines and data warehousing solutions using Python.
  • 3+ years of hands-on experience with cloud services, especially Databricks.
  • 3+ years of proficiency in Snowflake or similar cloud-based data warehousing solutions.
  • 3+ years of experience in complex data environments with large data volumes.
  • Experience with code versioning tools (e.g., Git).
  • Knowledge of Linux operating systems.
  • Familiarity with REST APIs and integration techniques.
  • Experience with data visualization tools (e.g., Power BI).
  • Background in database administration or performance tuning.
  • Familiarity with data orchestration tools, such as Apache Airflow.
  • Exposure to big data technologies (e.g., Hadoop, Spark).

Salary (Rate): undetermined

City: Glasgow

Country: United Kingdom

Working Arrangements: hybrid

IR35 Status: undetermined

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

Data Engineer

Location - Glasgow (hybrid, 3 days per week on-site)

Contract role (6 to 12 months)

Skills / Qualifications:

  • 4+ years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc.
  • 3+ years of hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines
  • 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions
  • 3+ years of experience in data development and solutions in highly complex data environments with large data volumes.
  • Experience with code versioning tools (e.g., Git)
  • Knowledge of Linux operating systems
  • Familiarity with REST APIs and integration techniques
  • Familiarity with data visualization tools and libraries (e.g., Power BI)
  • Background in database administration or performance tuning
  • Familiarity with data orchestration tools, such as Apache Airflow
  • Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing
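
As an illustration of the orchestration referenced above, a minimal Apache Airflow sketch (assuming Airflow 2.4+): one daily DAG with a single Python task. The DAG id, schedule, and callable are hypothetical; in practice the task would typically trigger a Databricks job or a PySpark script.

  # Minimal Airflow DAG (illustrative; ids and the callable are hypothetical)
  from datetime import datetime

  from airflow import DAG
  from airflow.operators.python import PythonOperator

  def run_orders_pipeline():
      # Placeholder for the real work, e.g. launching a Databricks job
      print("running daily orders pipeline")

  with DAG(
      dag_id="daily_orders_pipeline",
      start_date=datetime(2024, 1, 1),
      schedule="@daily",
      catchup=False,
  ) as dag:
      PythonOperator(
          task_id="run_orders_pipeline",
          python_callable=run_orders_pipeline,
      )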