Publicis Groupe
**Manager, Data Engineering**
**Company Description**
Publicis Sapient is a digital transformation partner helping established organizations get to their future, digitally-enabled state, both in the way they work and the way they serve their customers. We help unlock value through a start-up mindset and modern methods, fusing strategy, consulting and customer experience with agile engineering and problem-solving creativity. United by our core values and our purpose of helping people thrive in the brave pursuit of next, our 20,000+ people in 53 offices around the world combine experience across technology, data sciences, consulting, and customer obsession to accelerate our clients' businesses through designing the products and services their customers truly value.
**Overview**
As a Manager, Data Engineering at Publicis Sapient, you will guide clients through complex data challenges, architecting and implementing innovative solutions that drive digital transformation. You will focus on delivering high-quality solutions by independently driving design discussions related to Data Ingestion, Transformation & Consumption, Data Storage and Computation Frameworks, Performance Optimization, Infrastructure, Automation & Cloud Computing, and Data Governance & Security. The role requires a hands-on technologist with expertise in Big Data solution architecture and a strong programming background in Java, Scala, or Python.
You will work closely with clients across industries, helping them navigate their digital transformation journeys by delivering scalable, high-quality data solutions.
**Responsibilities**
**Your Impact:**
+ Provide technical leadership and take a hands-on implementation role in data engineering, including data ingestion, data access, modeling, data processing, visualization, design, and implementation.
+ Lead a team to deliver high-quality Big Data solutions on Azure Cloud, managing functional and non-functional scope and quality.
+ Help establish standard data practices such as governance, and address other non-functional concerns such as data security, privacy, and quality.
+ Manage and provide technical leadership for data program implementations, based on requirements, using agile methodologies.
+ Participate in workshops with clients and align client stakeholders to optimal solutions.
+ Provide consulting, soft skills, thought leadership, and mentorship.
+ Manage people and contribute to hiring and capability building.
**Qualifications**
**Your Skills & Experience:**
+ 8+ years of overall IT experience, with 3+ years in data-related technologies, including 1+ years of expertise in data-related Azure Cloud services and at least one project delivered as an architect.
+ Knowledge of Big Data architecture patterns and experience delivering end-to-end Big Data solutions on the cloud (Azure/AWS/GCP) is mandatory.
+ Expertise in programming languages such as Java or Scala; Python is good to have.
+ Expertise in at least one distributed data processing framework, such as Spark (Core, Streaming, SQL), Storm, or Flink.
+ Expertise in the Hadoop ecosystem on an Azure cloud distribution, with hands-on experience in one or more big data ingestion tools (Sqoop, Flume, NiFi, etc.) and distributed messaging and ingestion frameworks (Kafka, Pulsar, Pub/Sub, etc.); familiarity with traditional tools such as Informatica and Talend is good to have.
+ Hands-on experience with NoSQL solutions such as MongoDB, Cassandra, or HBase, or cloud-based NoSQL offerings such as DynamoDB or Bigtable.
+ Good exposure to development with CI/CD pipelines; knowledge of containerization, orchestration, and Kubernetes would be an added advantage.
+ Experience with Informatica (nice to have).
+ Basic knowledge of Gen AI (good to have).
**Additional Information**
**Set Yourself Apart With:**
+ Certification in GCP, AWS, or another cloud platform, or in big data technologies.
+ Strong analytical and problem-solving skills.
+ Excellent understanding of the data technologies landscape/ecosystem.
+ Experience with or exposure to ML/AI engineering.
+ Experience with containerization and associated microservice tooling such as Docker and Kubernetes.
+ Knowledge of data security (authentication, authorization, encryption for data at rest and in transit).
+ Understanding of monitoring and alerting tools for data environments.
+ Exposure to data governance, cataloging, and lineage tools.
+ Cloud or data technology certifications.
+ Active participation in the Data Engineering community (blogs, keynotes, POCs, hackathons).
+ Experience or exposure to working with software or platform engineering teams.
+ A Bachelor's or Master's degree in Computer Engineering, Computer Science, or a related field.