Data Engineer

Posted 1 week ago by Trust In SODA

Negotiable
Undetermined
Remote
Madrid

Job description
Job Title: Data Engineer (Splunk-to-Elastic Migration)
Location: Remote (must be based in the EU)
Contract Type: 12-month contract (extension expected)

About the Role:
Trust In SODA is partnering with a prominent Swiss financial exchange to support a major infrastructure transformation project. The client is seeking experienced Data Engineers to join their team and play a key role in migrating their data platform from Splunk to Elastic. This is a unique opportunity to work on a high-impact project in the financial services sector, helping to build cutting-edge data solutions for a global organization.
As a Data Engineer, you will be responsible for designing, building, and optimizing scalable data pipelines and architectures to facilitate the smooth migration of data and analytics platforms. You will work closely with cross-functional teams, including data scientists, DevOps engineers, and business analysts, to ensure the transition from Splunk to Elastic is seamless and effective.
Key Responsibilities:
  • Migration Expertise: Lead and support the migration of data pipelines, dashboards, and alerting rules from Splunk to Elastic.
  • Data Architecture & Engineering: Design and implement scalable data architectures to ensure efficient data processing and storage in Elastic.
  • Data Integration: Integrate data from multiple sources, ensuring it is structured optimally for analytics and querying in Elastic.
  • Performance Optimization: Optimize data queries, search performance, and overall platform stability within the Elastic stack.
  • Collaboration: Work closely with various internal teams (Data Science, IT, DevOps) to deliver the migration on time and with high quality.
  • Troubleshooting & Support: Provide technical expertise in troubleshooting issues during and after the migration process.
Required Skills & Experience:
  • Hands-on Data Engineering experience in designing and building robust data pipelines.
  • Experience with Splunk and Elastic (ELK Stack) is essential, with a deep understanding of how data is indexed, stored, and queried in both systems.
  • Proficiency in data processing languages such as Python, Scala, or Java.
  • Experience with cloud platforms (AWS, GCP, or Azure) for building scalable solutions.
  • Familiarity with data modeling, ETL processes, and data warehousing.
  • Experience working in the financial services industry or similar highly regulated sectors is a plus.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues.
  • Ability to work independently and collaboratively in an agile, fast-paced environment.
  • Fluency in English (written and spoken); knowledge of German, French, or Italian is a plus.