SENIOR DIRECTOR DATA ENGINEERING

Job summary
Permanent contract
Paris
Salary: not specified
No remote work
Experience: > 7 years
Skills and expertise
Java
Kubernetes
NoSQL
Scala
Terraform

Publicis France

The position

Job description

As Director of Data Engineering, you will be responsible for:

  • executing data architecture assessments
  • defining and implementing data transformation strategies for clients
  • designing and delivering modern, enterprise-grade target architectures for data platforms that meet expectations for performance, reliability, security, and scalability
  • shaping a cloud-native landing zone that supports modern data architecture patterns (lakehouse, medallion architecture, Data as a Service, etc.)
  • contributing to the definition and execution of Data Governance models

The role requires a hands-on technologist with expertise in delivering data platforms at scale across a broad range of use cases (AI as well as analytics), cloud-based as a must, who can give tactical direction to the team and to customers. They should have solid hands-on capabilities in data-related languages such as SQL, Python, and Scala, as well as in Spark-based computing and modern storage architectures built on market standards such as Parquet and Apache Iceberg.
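
For illustration only, here is a minimal PySpark sketch of the kind of work described above: reading raw Parquet from a landing zone, applying a medallion-style cleaning step, and writing the result as an Apache Iceberg table. All catalog, table, and path names are hypothetical, and the snippet assumes a Spark 3.x session already configured with an Iceberg catalog.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

    # Bronze layer: raw landing data kept as-is in Parquet (path is a placeholder)
    bronze = spark.read.parquet("s3://landing-zone/orders/")

    # Silver layer: deduplicated, typed, and filtered
    silver = (
        bronze
        .dropDuplicates(["order_id"])
        .withColumn("order_ts", F.to_timestamp("order_ts"))
        .filter(F.col("order_id").isNotNull())
    )

    # Iceberg adds ACID writes and schema evolution on top of Parquet files
    silver.writeTo("lakehouse.sales.orders_silver").using("iceberg").createOrReplace()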

As a data engineering practitioner, you should have a point of view on, and an understanding of, build vs. buy decisions, performance considerations, hosting, business intelligence, reporting & analytics, and team skills management and development, spanning software engineering, quality automation and testing, data modeling and design, and cloud and solution architecture.

Role & Responsibilities 

  • Provide input to define and execute the strategic roadmap for enterprise and data architecture by understanding the current landscape and setting up the framework for shaping the target architecture and executing its delivery roadmap.
  • Provide technical leadership and take a hands-on implementation role across data techniques including data ingestion, data transformation/processing, data quality, data modeling, and data visualization, covering the end-to-end life cycle with an agile/Scrum implementation approach.
  • Act as technical lead of a globally distributed team and manage functional and non-functional scope, quality, and timelines.
  • Help establish standard data practices and frameworks such as data governance, data quality, data validation, data security, data privacy, scheduling, monitoring, and logging/error management (a minimal quality-check sketch follows this list).
  • Exhibit thought leadership points of view and provide business and technical consulting to clients as well as mentorship/coaching to team members.
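
As referenced in the list above, here is a minimal sketch of the kind of data quality checks such a framework might standardize, again in PySpark; the table name and rules are illustrative assumptions, not a prescribed implementation.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("dq-checks").getOrCreate()

    # Hypothetical silver table; the rules below are placeholders
    df = spark.table("lakehouse.sales.orders_silver")

    checks = {
        "order_id_not_null": df.filter(F.col("order_id").isNull()).count() == 0,
        "amount_non_negative": df.filter(F.col("amount") < 0).count() == 0,
        "no_duplicate_orders": df.count() == df.dropDuplicates(["order_id"]).count(),
    }

    failed = [name for name, passed in checks.items() if not passed]
    if failed:
        # A real pipeline would alert via the team's monitoring/logging stack
        raise ValueError(f"Data quality checks failed: {failed}")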

Requirements

Mandatory Experience and Competencies 

  • 15 years of overall IT experience, with 8+ years in data-related technologies
  • 3+ years of expertise in cloud data services (AWS / Azure / GCP)
  • have led data audits/assessments, defined data strategies, and provided consulting to clients
  • have led technical architecture, design, and delivery of data solutions in production, at scale, based on modern solutions and stacks (AWS, Azure, GCP)
  • have set up best-practice design patterns, coding practices, code review processes, automation, and quality guidelines and processes
  • expert in data ingestion, distributed data processing (batch and streaming), and programming languages, preferably in Java/Scala and/or Python as a secondary language
  • lead proposals (RFPs) from a solution, architecture, estimation, and framework standpoint
  • experience with NoSQL databases and knowing which one suits which use cases
  • deep understanding of Data Governance, Data Security, Data Cataloging, and Data Lineage concepts, plus experience with tools in these areas
  • good understanding of Continuous Integration and Continuous Delivery (CI/CD) using cloud-based DevOps services or Jenkins/Bamboo, Maven, JUnit, SonarQube, and Terraform (one-click infrastructure setup)
  • exhibit thought leadership, e.g. writing blogs, creating points of view, tracking industry trends, attending/presenting at internal/external technical forums, mentorship, etc.
  • excellent communication, presentation, and collaboration skills
  • lead or participate in Data CoE initiatives, e.g. building accelerators, knowledge-sharing sessions, and coaching/mentoring team members

Preferred Experience and Knowledge

  • Python notebooks
  • data pipeline orchestration
  • knowledge of Kafka streaming (a minimal ingestion sketch follows this list)
  • Parquet storage and the Apache Iceberg table format
  • containers, Docker, and Kubernetes
  • distributed messaging queues (RabbitMQ, ActiveMQ)
  • data security: row-level security, multi-tenancy patterns, column-based masking, filtering, minimization, anonymization/pseudonymization techniques
  • experience with one or more traditional ETL tools (Informatica, Talend, etc.)
  • search/indexing technologies (Solr)
  • knowledge of Machine Learning
  • agile ways of working (Scrum / Kanban)
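
To make the Kafka streaming item concrete, here is a minimal Spark Structured Streaming sketch that ingests a topic into the landing zone; the broker, topic, and paths are placeholders, and it assumes the spark-sql-kafka-0-10 connector is available on the classpath.

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

    # Subscribe to a (hypothetical) orders topic
    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "orders")
        .load()
    )

    # Kafka delivers key/value as binary; cast the value before parsing downstream
    parsed = events.selectExpr("CAST(value AS STRING) AS json", "timestamp")

    # Land micro-batches as Parquet, with a checkpoint for fault-tolerant recovery
    query = (
        parsed.writeStream
        .format("parquet")
        .option("path", "s3://landing-zone/orders_stream/")
        .option("checkpointLocation", "s3://landing-zone/_checkpoints/orders_stream/")
        .trigger(processingTime="1 minute")
        .start()
    )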

Education 

Master’s / Bachelor’s Degree in Computer Engineering, Computer Science, or a related field.