Data Engineer - Permanent Contract (M/F)

Job summary
Permanent contract (CDI)
Paris
Salary: Not specified
Frequent remote work
Skills & expertise
Task management
Ability to motivate others
Adaptability
Version control systems
Collaboration and teamwork

Veesion

The position

Job description

Please apply directly on our career site right here.
Applications submitted via Welcome will not be processed.

Veesion is at the forefront of in-store theft detection solutions, transforming how retailers protect their products and optimize their operations. Our technology combines advanced video analysis, AI-driven insights, and intuitive user experiences to deliver real-time prevention and actionable intelligence. As we expand, we’re seeking a Senior Data Engineer to play a critical role in scaling our data capabilities and empowering teams across the organization.

As a Senior Data Engineer, you will play a pivotal role in shaping the future of our data ecosystem. You’ll lead the redesign of our BigQuery Data Warehouse and refine how data is leveraged across the company. This is an opportunity to rethink and elevate our existing architecture, ensuring it aligns seamlessly with evolving business needs. You’ll have the freedom to challenge current technical decisions, introduce best practices, and build scalable solutions from the ground up—making a lasting impact on our growth and success.

Reporting to the Head of Data, you will collaborate with key stakeholders across the organization:

  1. Chief Executive Officer: Aligning on business objectives and strategic priorities.

  2. Business Teams: Partnering with data analysts from finance, revenue operations, sales, customer success, and marketing to deliver actionable insights.

  3. Tech Team: Working alongside 10+ software engineers to integrate data from Veesion’s systems into our decision-making processes.

Tasks

  • Design and refine scalable data models: Ensure data models meet current and future business needs, optimizing for performance and maintainability.

  • Maintain and scale our BigQuery Data Warehouse: Monitor and enhance warehouse performance to support growing data volumes and usage.

  • Build and orchestrate efficient data pipelines: Use modern tools to create, schedule, and monitor data workflows that ensure timely and accurate data delivery.

  • Collaborate with business teams: Work closely with analysts and stakeholders to identify, prioritize, and deliver datasets that drive decision-making.

  • Foster a self-service data culture: Provide training and tools to enable teams to explore data independently while ensuring governance and consistency.

  • Influence and evolve the tech stack: Our current stack includes BigQuery, dbt, Airbyte, Dagster, Hightouch, and n8n—but we welcome your ideas to enhance and expand it (see the short illustrative sketch below).
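
To make the kind of pipeline work described above concrete, here is a minimal, purely illustrative sketch in Python of a Dagster asset that rebuilds a BigQuery table. This is not Veesion's code: the project, dataset, and table names are hypothetical placeholders, and the assumption is only that a Dagster asset wraps a BigQuery job.

    # Illustrative sketch only, not Veesion's pipeline. Project and table
    # names are hypothetical; in practice the SQL would likely live in dbt.
    from dagster import asset, materialize
    from google.cloud import bigquery

    BQ_PROJECT = "example-gcp-project"         # hypothetical GCP project id
    RAW_TABLE = "example_raw.store_events"     # hypothetical source table
    MART_TABLE = "example_marts.daily_events"  # hypothetical destination table


    @asset
    def daily_events() -> None:
        """Rebuild a small daily aggregate table from raw events."""
        client = bigquery.Client(project=BQ_PROJECT)
        client.query(
            f"""
            CREATE OR REPLACE TABLE `{MART_TABLE}` AS
            SELECT DATE(event_ts) AS event_date, COUNT(*) AS n_events
            FROM `{RAW_TABLE}`
            GROUP BY event_date
            """
        ).result()  # wait for the BigQuery job to finish


    if __name__ == "__main__":
        # Materialize locally; in production Dagster schedules and monitors runs.
        materialize([daily_events])

In the stack listed above, Airbyte would typically land the raw data, a dbt model would own the transformation, and Dagster would schedule and observe the run; the sketch only illustrates the shape of that workflow.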


Ideal profile

  • Education: A Master’s degree or equivalent in Data Engineering, Computer Science, or a related field.

  • Experience: At least 4 years of professional data engineering experience (excluding internships), with a proven track record of leveraging modern data stack tools to deliver high-quality analytics solutions.

Skills

Required Skills:

  • SQL Mastery: Deep knowledge of SQL for designing efficient queries, optimizing databases, and transforming raw data into actionable insights.

  • Python Expertise: Strong Python programming skills, with experience in data manipulation, pipeline creation, and automation.

  • BigQuery & dbt Experience: Demonstrated ability to build, optimize, and maintain solutions using BigQuery and dbt.

  • Data Modeling Proficiency: Solid foundation in data modeling principles, including star and snowflake schemas, and applying them to scalable solutions.

  • Version Control: Proficiency with Git for versioning and collaborative development in data workflows.

  • Problem-Solving Skills: Proven ability to tackle complex technical challenges with structured, innovative solutions.

Preferred Skills:

  • Cloud Expertise: Familiarity with AWS services such as S3, Lambda, or Redshift.

  • Pipeline Orchestration: Experience with tools like Dagster, Apache Airflow, or Prefect for workflow automation.

  • Advanced Visualization: Experience with advanced Metabase features, including custom SQL queries, embedding, and API integration for dynamic visualizations.

  • DevOps/MLOps Practices: Experience with Docker and CI/CD workflows (e.g., GitHub Actions) to support automated testing and deployment.

  • Software Engineering Mindset: Applies software engineering best practices, including clean code, modular design, and testing strategies, to data engineering projects.

Personal Qualities

  • Autonomy: Capable of independently initiating and driving projects from concept to implementation.

  • Adaptability: Thrives in fast-paced, dynamic environments, quickly adjusting to shifting priorities and requirements.

  • Pragmatism: Delivers practical, actionable solutions that balance technical excellence with business needs.

  • Curiosity: Continuously seeks out innovative methods, tools, and practices to improve processes and outcomes.

  • Team Player: Collaborates effectively with cross-functional teams and fosters a culture of shared success and mutual respect.


Interview Process

We strive for an inclusive, structured interview process designed to highlight your technical and problem-solving abilities while providing transparency at each stage.

1. Initial Screening (45 minutes, remote): A discussion to review your experience, motivations, and alignment with the role.
2. Technical Interview (1.5 hours, remote): Interview with the Head of Data and Chief Technical Officer to dive deeper into your technical expertise and approach to solving real-world challenges.
3. Final Interview (1 hour, onsite): Meet with our CEO and the rest of the Data team to discuss cultural fit, expectations, and long-term goals.

Benefits

  • Competitive Compensation: Receive a salary that reflects your skills and contributions.

  • Swile Meal Voucher Card: Enjoy meal benefits to make your day easier.

  • Transportation Subsidy: 50% coverage of your transportation costs.

  • Comprehensive Health Insurance: Coverage from day one to ensure your well-being.

  • Inclusive and Supportive Culture: Join a company that values diversity, equity, and inclusion.

  • Dynamic, Motivating Team: Work with passionate colleagues in an exciting, fast-paced environment.

  • Rapid Skill Growth: Take on increasing responsibilities and expand your skillset quickly.

  • Prime Location: Office located in the heart of Paris (Beaubourg).

  • Flexible Work Policy: Work-from-home arrangement (2-3 days per week).

Please apply directly on our career site right here.
Applications submitted via Welcome will not be processed.
