Sr. Data Engineer

Summary: The Sr. Data Engineer will play a crucial role in developing a cloud-based Big Data platform that supports advanced analytics and machine learning capabilities. The position involves handling large data volumes, integrating systems, and collaborating with an agile team to maintain high-quality documentation and code. Candidates joining at the senior level will also mentor the team, so prior team-leadership experience is expected. The ideal candidate will have expertise in data engineering, cloud technologies, and CI/CD frameworks.

Key Responsibilities:

  • Assist in building a world-class Big Data platform for processing data streams and enabling machine learning.
  • Handle large volumes of data and integrate the platform with internal and external systems.
  • Work with an agile team alongside business, testers, architects, and project managers.
  • Focus on the development of complex logic integrations.
  • Maintain and evaluate the quality of documentation, code, and business logic.
  • Prioritize non-functional requirements by maintaining code and supporting, restoring, and monitoring performance for every delivery.
  • Design, implement, and extend core data systems for reporting and data visualizations.
  • Manage data integrations within the company's technology stack.
  • Provide runtime and automation solutions for public cloud workloads.
  • Maintain and support all data workflows.
  • Design and support CI/CD frameworks and public cloud infrastructure.
  • Produce and maintain complex data workflows meeting quality requirements.
  • Document database architecture and maintain operational data stores.
  • Handle data ingestion and extraction using MDM tools.
  • Build database schemas, tables, procedures, and permissions.
  • Develop database utilities and automated reporting.
  • Prepare documentation for activities and information conveyance.
  • Perform full-stack design and operation of the core data stack, including the data lake and warehouse.
  • Build data flows for data acquisition and modeling.
  • Work with public cloud providers (AWS, Google Cloud Platform, Azure).
  • Build and manage CI/CD pipelines.
  • Create and manage Kubernetes clusters.
  • Work with containerized workflows and applications.
  • Build ingestion and ETL data pipelines using code-oriented systems.
  • Operate in secure networking environments.
  • Apply expertise in data engineering languages such as Python, Java, Scala, and SQL.
  • Use data visualization tools such as Power BI and Tableau.
  • Create business requirements documents and related application systems documents.

Key Skills:

  • Expertise in data engineering languages (Python, Java, Scala, SQL).
  • Experience with cloud technologies (AWS, Google Cloud Platform, Azure).
  • Knowledge of CI/CD frameworks and container solutions.
  • Familiarity with data warehousing and MDM tools (Informatica, Amperity).
  • Experience in building and managing Kubernetes clusters.
  • Proficiency in data visualization tools (Power BI, Tableau).
  • Ability to design and document database architecture.
  • Experience with ETL data pipelines and data workflows.
  • Strong understanding of agile methodologies and team collaboration.
  • Experience in creating business requirements documents.

Salary (Rate): undetermined

City: undetermined

Country: USA

Working Arrangements: remote

IR35 Status: outside IR35

Seniority Level: undetermined

Industry: IT

Detailed Description From Employer:

W2 Only.

What You Will Do:

Assist in building a world-class Big Data platform that gives us the power to process streams of data as well as enable machine learning and advanced analytics capabilities. Everything is cloud-based for scalability and speed to market.

Handle large volumes of data and integrate our platform with a range of internal and external systems.

Understand new tech and how it can be applied to data management.

Work with an agile team alongside business, testers, architects and project managers.

Focus on the development of complex logic integrations.

Maintain and evaluate the quality of documentation, code, business logic, and non-functional requirements.

Keep non-functional requirements (NFRs) a priority by maintaining code and by supporting, restoring, and monitoring performance for every delivery.

*If you join us as a Senior, you will mentor a team, and you will need to bring previous experience of doing so.

What You Need:

Design, implement, and extend core data systems that enable reporting and data visualizations

Manage data integrations within the company's domain technology stack

Provide runtime and automation solutions that empower developers to migrate and run workloads in the public cloud

Responsible for maintaining and supporting all data workflows

Design, implement, enhance, and support CI/CD frameworks, container solutions, runtime environments, and the supporting public cloud infrastructure

Produce and maintain complex data workflows to meet all the quality requirements of the data management policy

Design and document database architecture

Responsible for creating and maintaining the operational data store

Responsible for ingestion and extraction of data using MDM tools such as Informatica and Amperity

Expertise in Data Warehousing and familiarity with cloud offerings for warehouses.

Create and maintain diagnostics, alerting, and monitoring code.

Build database schemas, tables, procedures, and permissions

Develop database utilities and automated reporting
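
To illustrate the kind of database utility and automated reporting described above, here is a minimal sketch using Python's built-in sqlite3 module; the table, columns, and report are hypothetical stand-ins, and a real utility would target the team's actual warehouse engine.

```python
import sqlite3

# Connect to a local database file (illustrative; a production utility
# would point at the operational data store or warehouse).
conn = sqlite3.connect("analytics.db")
cur = conn.cursor()

# Build a schema: one table plus an index. Permissions would be GRANT
# statements in a server-based engine; SQLite has no GRANT, so that
# step is omitted here.
cur.execute("""
    CREATE TABLE IF NOT EXISTS order_events (
        order_id   INTEGER PRIMARY KEY,
        customer   TEXT NOT NULL,
        amount_usd REAL NOT NULL,
        event_date TEXT NOT NULL
    )
""")
cur.execute("CREATE INDEX IF NOT EXISTS idx_event_date ON order_events(event_date)")

def daily_revenue_report():
    """A simple automated report: order counts and revenue per day."""
    return cur.execute("""
        SELECT event_date, COUNT(*) AS orders, SUM(amount_usd) AS revenue
        FROM order_events
        GROUP BY event_date
        ORDER BY event_date
    """).fetchall()

for row in daily_revenue_report():
    print(row)

conn.commit()
conn.close()
```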

Prepare written materials to document activities, provide written reference, and/or convey information

Full-stack design, development, deployment, and operation of the core data stack, including the data lake, data warehouse, and data pipelines

Experience building data flows for data acquisition, aggregation, and modeling, using both batch and streaming paradigms
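
As a sketch of the batch versus streaming distinction in a single framework, the example below uses PySpark (an assumption; Spark is one of the systems named later in this posting). The paths and column names are hypothetical, and the streaming side uses a file source to keep the sketch self-contained.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-vs-streaming").getOrCreate()

# Batch paradigm: read a static dataset once and aggregate it.
batch_df = spark.read.json("s3://example-bucket/events/")  # illustrative path
daily = batch_df.groupBy("event_date").agg(F.sum("amount").alias("revenue"))
daily.write.mode("overwrite").parquet("s3://example-bucket/reports/daily/")

# Streaming paradigm: the same aggregation over files arriving continuously.
stream_df = (
    spark.readStream
         .schema(batch_df.schema)   # streaming sources need an explicit schema
         .json("s3://example-bucket/events/")
)
running = stream_df.groupBy("event_date").agg(F.sum("amount").alias("revenue"))
query = (
    running.writeStream
           .outputMode("complete")  # emit the full updated aggregate each trigger
           .format("console")
           .start()
)
query.awaitTermination()  # blocks; a real job would manage lifecycle explicitly
```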

Experience working with public cloud providers (AWS, Google Cloud Platform, Azure)

Experience building and managing CI/CD pipelines

Experience creating and managing Kubernetes clusters in different types of environments

Familiarity with access controls, secrets management, monitoring, and service discovery in Kubernetes clusters
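
For the Kubernetes points above, here is a minimal sketch using the official kubernetes Python client (an assumption; kubectl or Helm would be equally valid) to enumerate secrets and discover services in a namespace. The namespace name is hypothetical, and the code expects a working kubeconfig.

```python
from kubernetes import client, config

# Load credentials from ~/.kube/config; inside a pod you would call
# config.load_incluster_config() instead.
config.load_kube_config()
v1 = client.CoreV1Api()

NAMESPACE = "data-platform"  # hypothetical namespace

# Secrets management in practice: list secret names only, never their data.
for secret in v1.list_namespaced_secret(NAMESPACE).items:
    print("secret:", secret.metadata.name)

# Service discovery: enumerate Services and their cluster IPs.
for svc in v1.list_namespaced_service(NAMESPACE).items:
    print("service:", svc.metadata.name, "->", svc.spec.cluster_ip)
```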

Experience working with containerized workflows and applications, and driving container adoption among developers and teams

Experience building ingestion and ETL data pipelines, especially via code-oriented systems such as Spark, Airflow, Luigi, or similar, and with varied data formats
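
To make the pipeline requirement concrete, here is a minimal Airflow DAG sketch (assuming Airflow 2.x; the DAG name and task bodies are placeholder stubs) wiring an extract-transform-load sequence:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Placeholder task bodies; a real pipeline would call ingestion,
# transformation, and load logic here.
def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and aggregate the raw data")

def load():
    print("write results to the warehouse")

with DAG(
    dag_id="example_etl",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # Airflow 2.4+ spelling of schedule_interval
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Dependencies: extract, then transform, then load.
    t_extract >> t_transform >> t_load
```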

Experience operating in a secure networking environment (e.g., behind a corporate proxy) is a plus

Expertise in data engineering languages such as Python, Java, Scala, and SQL

Familiarity with visualizing data using Power BI, Tableau, and similar tools

Experience creating business requirements documents and/or other documents related to application systems