Senior Data Engineer - Paris/Montpellier

Permanent contract (CDI)
Salary: Not specified
Frequent remote work
Experience: > 5 years

Swile

The Position

Job Description

At Swile, we believe that good products can help reduce friction in daily professional life and boost employee satisfaction. Today, we provide innovative solutions in various areas such as Fintech, Travel, HR, and Employee Benefits to more than 5.5 million users in 85,000 companies in France and Brazil.

As a Senior Data Engineer, your primary role is to design and build impactful software data products. This involves driving the technical direction of projects, improving processes, and ensuring the successful delivery of goals and roadmaps. Additionally, you’ll contribute to creating a sustainable engineering environment that supports long-term growth and innovation.

As a Senior Data Engineer, your mission will be to:

  • Contribute to defining and evangelizing the data architecture vision, empowering engineers to process data for both batch and real-time consumption (EDA & APIs).

  • Introduce patterns and associated best practices, and support innovation teams in implementing them pragmatically.

  • Participate in technical decision-making

  • Develop and maintain a robust Data Platform to enable the ingestion and availability of data

  • Manage our SaaS tools (Stitch, Snowflake, etc.) and contribute to our self-hosted K8S infrastructure (Airflow, Kafka, etc.)

  • Ensure the reliability of the data processing pipeline

  • Apply best coding practices to support codebase growth without increasing technical debt (version control, testing, refactoring, etc.)

  • Document code and processes to share knowledge within the Engineering team

⚒️ Our tech stack

You do not need to be familiar with our technical stack or any specific functional area, but we expect a strong willingness to learn and adapt quickly.

  • Ruby/Rails, TypeScript/React/Node.js

  • Android (Kotlin), iOS (Swift)

  • AWS/Kubernetes, PostgreSQL, Kafka, Redis, Snowflake

💡 What’s in it for you?

  • An opportunity to join a dynamic team of talented engineers.

  • A collaborative work environment that values innovation and creativity.

  • Competitive salary and benefits package.

  • Professional development and career growth opportunities.


Desired Profile

✨ It will be a perfect match if you:

  • Have a solid background in software engineering

  • Have experience with Kafka and event-driven architectures, including designing, deploying, and managing real-time data streaming to enable scalable and responsive systems

  • Have extensive experience with modern data and cloud technologies, including Snowflake, Terraform, Airflow, and dbt (data build tool)

  • Have a proven ability to design, deploy, and optimize robust data pipelines using these tools to drive data-driven decision-making

  • Are a future responsible Swiler: you share our commitment to the environment, diversity, fairness, and inclusion, and are prepared to work every day to improve individual and collective performance.

📓 One thing worth mentioning

  • We welcome individuals with entrepreneurial backgrounds as well as those from established organizations. At Swile, we believe that delivering impactful products requires engineers to understand the needs of users and clients as well as the code itself.

The Recruitment Process

  • Meet with one of our Tech Recruiters (30 to 45 minutes)

  • Meet with your future manager (1h)

  • Live coding interview with one of our Staff Engineers (1h)

  • Live architecture design interview with another of our Staff Engineers (1h)

  • Interview with our VP of Engineering or CTO (1h)
