Talent-Relay


Staff Data Engineer

Location: Luxembourg


Job Type: Permanent


Working Setup: 4 days per week onsite, 1 day remote working


Language: English


Industry: New Space, Earth Observation, Satellite Imagery


Package: €110,000 - €130,000 + stock options + relocation package

Introduction:

Talent-Relay has partnered with an expanding Earth Observation and Data Analytics client to onboard an experienced Staff Data Engineer into their growing Data Engineering team.

Our partner leverages satellite imagery, thermal infrared, and multi-spectral data to deliver valuable insights to agriculture, climate, and commercial clients.

The team is responsible for designing and implementing high-quality data pipelines using modern cloud technologies (AWS, Airflow, Dagster, Kubernetes).

We require an experienced, hands-on Data Engineer who can lead, guide, and mentor the team, working closely with the Data Engineering Manager.

Key Responsibilities:

As a Staff Data Engineer, you will:

  • Design and implement scalable data pipelines using modern cloud technologies.
  • Develop, optimize, and maintain ETL workflows for large-scale data extraction, transformation, and loading.
  • Lead architectural decisions and ensure best practices in cloud infrastructure.
  • Enhance automation and monitoring using orchestration frameworks like Apache Airflow or Dagster.
  • Drive performance and scalability improvements across data systems.
  • Mentor and support junior engineers through code reviews and knowledge-sharing.
  • Collaborate closely with Science, Product, and Customer Success teams to align technical solutions with business needs.

Skills & Requirements:

  • 10+ years of experience in data engineering
  • Master’s degree in Engineering, Computer Science, Applied Mathematics, or a related field
  • Solid Python and programming skills
  • Expertise in designing efficient and maintainable data pipelines
  • Strong experience in ETL workflow development
  • Proficiency in cloud platforms (AWS preferred), Kubernetes, and Docker
  • Experience with CI/CD pipelines and infrastructure automation
  • Hands-on experience with workflow orchestration (e.g., Apache Airflow, Dagster)
  • Strong problem-solving skills and ability to debug complex technical issues
  • Passion for mentoring and supporting team members

Nice to have:

  • Experience with setting up and maintaining cloud infrastructure (preferably AWS).
  • Experience with provisioning and maintaining Kubernetes clusters.
  • Understanding of MLOps best practices for monitoring, versioning, and maintaining models in production.
  • Experience in Earth observation, AI, geospatial, imaging science, or computer vision fields.