
Mid Analytics Engineer (All genders)

Job summary
Permanent contract
Paris
Salary: Not specified
No remote work
Experience: > 4 years
Skills & expertise
Business acumen
Adaptability
Collaboration and teamwork
Java
Kubernetes

Dailymotion

Job description

Our team context:

Dailymotion is seeking an Analytics Engineer to join its Data Engineering teams. This craft is responsible for most of the data products at Dailymotion; your work will have an impact across Dailymotion's business and help drive data-driven decisions on product and strategy.

You'll join the Data Engineering craft at Dailymotion to create and maintain data products. The Analytics Engineering team focuses on providing reliable data for analysis company-wide. This includes building and managing our multi-petabyte data warehouse, highly scalable client-facing analytics, data ingestion and distribution, and data synchronization systems. As an Analytics Engineer, you'll blend software engineering and data skills to maintain code and model data effectively. If you're eager to solve challenging business problems, this role offers broad impact across all of Dailymotion's businesses.

Responsibilities:

Our stack runs almost exclusively on Google Cloud Platform. You will work in an environment made up of data lakes (BigQuery, etc.), data streaming platforms (Beam/Dataflow, Flink, etc.), orchestration and scheduling platforms (Airflow), container-based deployment and management platforms (Docker, Kubernetes, Jenkins X), SQL, and data quality tools (dbt, Sifflet). You will also take part in data modeling and in the design of data flows, from implementation through to support in production.

  • Ingest extensive volumes of raw data from both internal and external sources using batch and streaming methods.
  • Expose the data through various means such as APIs, datamarts, and flat files for both internal and external users.
  • Build complex and efficient SQL queries to transform data within our data lake into reliable business entities and reporting aggregates. Identify and manage dependencies for these transformations, scheduling them using tools like Airflow.
  • Investigate discrepancies and quality issues in the data, as well as address performance issues.
  • Design optimized and cost-efficient data models in BQ while addressing business use cases.
  • Design datasets (preferably in Druid) tailored for external consumers, prioritizing speed, consistency, cost-effectiveness, and efficiency.
  • Ensure data cleanliness, consistency, and availability by performing data quality checks and implementing monitoring.
  • Catalog and document various aspects of the data, including business entities, datamarts, dimensions, metrics, and business rules.
  • Serve as a subject matter expert on business entities and datamarts, providing training to users on SQL and analytics best practices (collaboration with Business Insight).
  • Innovate by proposing new tools, processes, documentation, and exploring emerging technologies during designated cool-down periods.
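As an illustration of the dependency management described above, here is a minimal sketch in plain Python of how transformations can be ordered so that each table is built only after all of its upstream dependencies. The table names and graph are purely hypothetical; in practice this ordering is what an Airflow DAG encodes for you.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each transformation lists the
# upstream tables it reads from (names are illustrative only).
dependencies = {
    "raw_events": [],
    "sessions": ["raw_events"],
    "daily_active_users": ["sessions"],
    "revenue_report": ["sessions", "raw_events"],
}

def run_order(deps):
    """Return an execution order in which every table is built
    only after all of its upstream dependencies."""
    return list(TopologicalSorter(deps).static_order())

order = run_order(dependencies)
print(order)
```

A scheduler like Airflow does the same resolution across hundreds of SQL models, retrying and backfilling tasks as needed.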
Additional information

What we offer you:

  • Additional opportunities as we grow and learn together.
  • An open, collaborative culture.
  • Exciting, dynamic projects to work on.
  • Flexibility.

For the France offices 

  • 🏡 Hybrid work framework (four remote-work setups: full office / flex office (1-2 days remote) / flex remote (1-2 days at the office) / full remote, plus the ability to work 3 months abroad)
  • 💰 Saving Plan Vivendi 
  • 🍼 Extended paternity or co-parental leave

  • 🕶️ A living employee culture (events / trainings / parties / all-hands / Dailymotion traditions…)
  • 🚀 Career development support (training / internal mobility / compensation cycles / quarterly 360° feedback reviews…)
  • 🏥  High-end Health Insurance and Personal Services Vouchers (CESU)
  • ⛱️ Paid time off – RTT and a time savings plan (CET)
  • ✅ Meal vouchers – public transport and bike-commuting refunds
  • 🎡 Social and Economic Committee (CSE) benefits (sport memberships / cinema vouchers / gift vouchers / discounts)


Preferred experience

Required skills:

  • Minimum of 2-3 years of experience in Data Analytics.
  • Fluent in both English (professional level) and French.
  • Strong team player, actively contributing to continuous improvement and fostering effective collaboration.
  • Proactive approach to continuous technical exploration, with a keen interest in implementing new technologies and innovative solutions, including prototyping.
  • Ability to draft technical specifications.
  • Hands-on experience in building and managing analytics pipelines.
  • Capable of making informed decisions based on business opportunities.
  • Motivated to work on building batch and streaming data pipelines to handle a high volume of events.
  • Proficiency in programming languages such as Python, Go (preferred), and Java, with a focus on meeting development standards to ensure the delivery of reusable, high-quality code.
  • Hands-on experience with different types of databases and strong SQL knowledge.
  • Experience in establishing and maintaining unit, integration, and end-to-end tests.
  • Experience in automating deployments with tools such as Docker and Kubernetes, and in observability tools such as Datadog.
  • Proficiency in monitoring systems and tools for ensuring the health and performance of data pipelines.
  • Solid knowledge of how to analyze, design, and improve the efficiency, scalability, and stability of data collection, storage, and retrieval processes.
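As a sketch of what the testing and data-quality expectations above can look like in practice, here is a minimal, hypothetical pair of row-level checks in plain Python; real pipelines would typically run equivalents inside BigQuery or through a tool like dbt, and the sample records are purely illustrative.

```python
def check_no_nulls(rows, column):
    """Return the indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return the values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

# Illustrative sample of ingested records.
events = [
    {"event_id": "a1", "user_id": "u1"},
    {"event_id": "a2", "user_id": None},
    {"event_id": "a1", "user_id": "u2"},
]

print(check_no_nulls(events, "user_id"))  # rows with a missing user_id
print(check_unique(events, "event_id"))   # duplicated event ids
```

Checks like these are the kind that get wired into unit tests and production monitoring so quality issues surface before consumers see the data.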

If you meet these qualifications and are passionate about leveraging data to drive insights and innovation, we encourage you to apply.
