Database Engineer (Talent Pool)

Permanent contract (CDI)
Paris
Salary: Not specified
Frequent remote work

Descartes & Mauss

The position

Job description

At D&M, we’re always on the lookout for exceptional talent to join our dynamic team. This posting is part of our talent pool initiative, where we invite passionate professionals interested in technology and AI to connect with us. By submitting your application, you’ll be considered for future openings that align with your skills and aspirations. Join us in shaping the future of innovation!


We are seeking a skilled Data/Database Engineer to join our team and take ownership of managing, optimizing, and monitoring our data infrastructure. The ideal candidate will have experience working with both SQL and NoSQL databases, building and maintaining ETL pipelines, and ensuring seamless data operations. This role requires expertise in cloud platforms (preferably Google Cloud), containerization, and continuous integration/continuous deployment (CI/CD) practices. The engineer will also ensure data security and compliance with regulations such as GDPR/CCPA. 

Your responsibilities include:

Data Pipeline Management: 

  • Design, implement, and maintain scalable ETL processes using PySpark and SparkNLP (a minimal sketch follows this list)

  • Manage data pipelines using GCP Workflows for scheduling and orchestrating jobs

  • Ensure seamless integration and management of data systems to maintain continuous operation. 
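
For a sense of what the ETL work above can look like in practice, here is a minimal, illustrative PySpark sketch. The bucket paths, column names, and Delta configuration are assumptions made for the example, not details of D&M's actual pipelines; a real job would add SparkNLP stages and be triggered by GCP Workflows rather than run by hand.

```python
# Minimal PySpark ETL sketch (paths and columns are illustrative assumptions).
from pyspark.sql import SparkSession, functions as F

# Writing Delta output typically needs the delta-spark package and these configs.
spark = (
    SparkSession.builder
    .appName("daily-etl-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Extract: read raw JSON events from a hypothetical GCS landing bucket.
raw = spark.read.json("gs://example-landing-zone/events/*.json")

# Transform: de-duplicate and type the timestamp; NLP enrichment would go here.
clean = (
    raw.dropDuplicates(["event_id"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .filter(F.col("event_ts").isNotNull())
)

# Load: append into a Delta table partitioned by ingestion date.
(
    clean.withColumn("ingest_date", F.current_date())
         .write.format("delta")
         .mode("append")
         .partitionBy("ingest_date")
         .save("gs://example-warehouse/delta/events")
)

spark.stop()
```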

Database Management: 

1/ SQL Databases: 

  • Manage and optimize PostgreSQL databases for transactional data and relational database management. 

  • Regularly optimize queries and indexes to ensure high-performance operations (illustrated in the sketch below). 

  • Implement automated backup and recovery solutions for PostgreSQL to prevent data loss. 
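
As a hedged illustration of the query and index work above, the sketch below uses psycopg2 to inspect a slow query plan and add a supporting index; the DSN, table, and column names are placeholders. Automated backups, by contrast, are usually handled outside application code (scheduled pg_dump or pgBackRest runs, or managed Cloud SQL backups).

```python
# Illustrative PostgreSQL tuning sketch (DSN, table, and column names are assumptions).
import psycopg2

DSN = "host=localhost dbname=example_db user=example_user"  # placeholder connection string

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        # Inspect the plan of a frequently-run query before deciding on an index.
        cur.execute(
            "EXPLAIN ANALYZE SELECT * FROM orders WHERE customer_id = %s", (42,)
        )
        for (plan_line,) in cur.fetchall():
            print(plan_line)

        # Add an index if the plan shows a sequential scan on customer_id.
        cur.execute(
            "CREATE INDEX IF NOT EXISTS idx_orders_customer_id ON orders (customer_id)"
        )
# Exiting the connection context manager commits the transaction.
```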

2/ NoSQL Databases: 

  • Manage and optimize NoSQL datasets using Delta Lake for large-scale data (see the maintenance sketch after this list). 

  • Ensure NoSQL infrastructure scalability to handle increasing data volumes. 
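
For the Delta Lake side, routine maintenance such as compacting small files and vacuuming stale ones can be scripted as below. This is a sketch that assumes a delta-spark installation and a placeholder table path, not a description of D&M's actual tables.

```python
# Routine Delta Lake maintenance sketch (table path is an illustrative assumption).
from delta import configure_spark_with_delta_pip
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-maintenance-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

table = DeltaTable.forPath(spark, "gs://example-warehouse/delta/events")

# Compact many small files into fewer larger ones so reads stay fast as volumes grow.
table.optimize().executeCompaction()

# Remove data files no longer referenced by the table history (default retention: 7 days).
table.vacuum()

spark.stop()
```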

Infrastructure & Deployment: 

  • Deploy data applications on cloud platforms like Google Cloud. 

  • Use Docker to containerize workloads and ensure consistency across development, testing, and production environments. 

  • Leverage GCP services for deployment, scaling, and monitoring of data applications. 

  • Set up and manage CI/CD pipelines using GitHub Actions to automate testing, deployment, and version control. 

Monitoring & Performance Optimization: 

  • Monitor data processing systems for latency, throughput, and error rates to ensure optimal performance. 

  • Ensure data quality by regularly checking for consistency, completeness, and accuracy across databases and pipelines. 

  • Implement centralized logging using Google Cloud Logging to aggregate logs from multiple sources. 
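
On the centralized-logging point just above, the google-cloud-logging client library can attach a handler to Python's standard logging module so that logs from multiple pipeline components are aggregated in Google Cloud Logging; the logger name below is hypothetical and credentials are assumed to come from the runtime environment.

```python
# Centralized logging sketch using the google-cloud-logging client library.
import logging
from google.cloud import logging as cloud_logging

# Attach a Cloud Logging handler to the root logger; credentials come from the
# environment (GOOGLE_APPLICATION_CREDENTIALS or the GCP metadata server).
client = cloud_logging.Client()
client.setup_logging(log_level=logging.INFO)

log = logging.getLogger("etl.events")  # hypothetical logger name

# Standard logging calls are now shipped to Cloud Logging, where they can be
# filtered and aggregated alongside logs from other services.
log.info("ETL run started")
log.warning("Late-arriving records detected")
```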

Security & Compliance: 

  • Ensure the encryption of data both at rest and in transit (a small connection example follows this list). 

  • Implement role-based access control (RBAC) to secure data and model endpoints. 

  • Maintain compliance with regulations such as GDPR and CCPA, including detailed audit logging for model training and data access. 
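
As a small illustration of the in-transit part of the encryption bullet above, a Python client can insist on a verified TLS connection to PostgreSQL as sketched below; the host, database, and certificate path are placeholders, and at-rest encryption, RBAC, and audit logging are normally enforced at the database, IAM, and logging layers rather than in application code.

```python
# Require a verified TLS connection to PostgreSQL (connection details are placeholders).
import psycopg2

conn = psycopg2.connect(
    host="db.example.internal",
    dbname="example_db",
    user="etl_service",
    # Credentials are expected from PGPASSWORD or ~/.pgpass rather than hard-coded.
    sslmode="verify-full",                        # refuse non-TLS connections
    sslrootcert="/etc/ssl/certs/example-ca.pem",  # CA bundle used to verify the server
)
conn.close()
```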

Documentation & Communication: 

  • Document API endpoints and data pipelines using tools like Swagger for ease of maintenance and onboarding (see the example after this list). 

  • Provide data flow diagrams, ETL process documentation, and data schema explanations. 

  • Set up alerts using Google Cloud Monitoring and Slack for real-time issue notifications. 

  • Generate and share performance reports to keep stakeholders informed and facilitate data-driven decision-making.
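
On the Swagger point above, one common Python pattern is to expose data-service endpoints through FastAPI, which generates OpenAPI/Swagger documentation automatically; the service, model, and endpoint below are purely illustrative and not part of D&M's actual API.

```python
# Illustrative FastAPI service; interactive Swagger docs are served at /docs.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Example Data API", version="0.1.0")  # hypothetical service


class PipelineRun(BaseModel):
    """Shape of a pipeline-run record returned by the API (illustrative)."""
    run_id: str
    status: str
    rows_processed: int


@app.get("/runs/{run_id}", response_model=PipelineRun, summary="Fetch one ETL run")
def get_run(run_id: str) -> PipelineRun:
    # A real service would query PostgreSQL here; this returns a stub record.
    return PipelineRun(run_id=run_id, status="succeeded", rows_processed=0)
```

Served with uvicorn, this exposes the interactive Swagger UI at /docs and the raw OpenAPI schema at /openapi.json.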


Desired profile

Required Skills & Qualifications: 

  • At least 1 year of experience as a data/database engineer or in a related role.

  • Strong experience in managing both SQL (PostgreSQL) and NoSQL (Delta Lake, Firestore, MongoDB) databases. 

  • Hands-on experience with cloud platforms, preferably Google Cloud. 

  • Proficient in PySpark, SparkNLP, and data pipeline orchestration tools (e.g., GCP Workflows).

  • Expertise in containerization (Docker) and CI/CD pipelines (GitHub Actions).

  • Knowledge of performance metrics (latency, throughput, error rates) and data quality checks (consistency, completeness, accuracy). 

  • Understanding of data encryption, access control (RBAC), and compliance with GDPR/CCPA. 

  • API development (REST/GraphQL) and ML pipeline integration.

  • Strong scripting skills (Python/Bash) and experience with automation tools (Terraform, Ansible).

  • Familiarity with monitoring tools (Prometheus, Grafana, ELK stack) and big data frameworks.

  • Excellent communication skills and the ability to document and report on technical processes. 

D&M believes diversity drives innovation and is committed to creating an inclusive environment for all employees. We welcome candidates of all backgrounds, genders, and abilities to apply. Even if you don’t meet every requirement, if you’re excited about the role, we encourage you to go for it—you could be exactly who we need to help us create something amazing together!


Interview process

  • 30 mins chat with Julie, Talent Acquisition Specialist

  • 30 mins chat with Harsha, Engineering Lead

  • Meet with Omar and Sebastien from the Tech team

What we offer:

Flexible working arrangements:

  • A flexible hybrid approach to work;

  • Work from home (or wherever you perform best) up to 3 days per week;

  • 2 days per week in the office with your colleagues (WeWork spaces in Paris 9ème);

  • Access to WeWork offices worldwide

An international, diverse, and inclusive work environment:

  • Join a team that already has 18 nationalities and continues to grow;

  • Regular team building activities & drinks after work

  • International company offsite (the most recent one in Tuscany in 2024).

Company benefits:

  • 50% of transportation costs reimbursed;

  • Swile card (meal vouchers);

  • 25 days off + RTT + your birthday off 🎊

  • Mutuelle (complementary health insurance)

Continuous learning and development:

  • A culture that promotes professional growth and well-being, guided by managers and peers;

  • Access to training and development resources.
