Pune, IND
73 days ago
Lead Application and Product Architect

ESSENTIAL DUTIES AND RESPONSIBILITIES

· Work with business users and stakeholders to define and analyze problems and provide optimal technical solutions.

· Translate business requirements to technical specifications.

· Design and architect scalable data lake solutions using Azure Data Lake Storage (ADLS Gen2), Delta Lake, and Apache Iceberg.

· Create and maintain designs for data ingestion, transformation, and storage strategies for real-time and batch workloads, including orchestration and the semantic layer.

· Provide solution design for real-time streaming solutions using Confluent Kafka and Flink.

· Design batch data pipelines using Azure Databricks and Delta Lake.

· Design data pipeline and data refresh process as per business requirements.

· Present architecture and solutions to executive-level stakeholders.

· Adhere to industry best practices in all phases of solution design and architecture.

· Provide guidance to ensure data governance, security, and compliance best practices in the architecture.

REQUIRED SKILLS & QUALIFICATIONS

TECHNICAL SKILLS:

· Cloud & Data Lake: Azure Data Lake (ADLS Gen2), Databricks, Delta Lake, Iceberg

· Reporting tools: Power BI, Tableau, or similar tools

· Streaming & Messaging: Confluent Kafka, Apache Flink, Azure Event Hubs

· Big Data Processing: Apache Spark, Databricks, Flink SQL, Delta Live Tables

· Programming: Python (PySpark, Pandas), SQL

· Storage & Formats: Parquet, Avro, ORC, JSON

· Data Modeling: Dimensional modeling, Data Vault, Lakehouse architecture

MINIMUM QUALIFICATIONS

· 8+ years of end-to-end design and architecture of enterprise-level data platforms and reporting/analytical solutions.

· 5+ years of expertise in real-time and batch reporting and analytical solution architecture.

· 4+ years of experience with Power BI, Tableau, or similar technologies.

· 3+ years of experience designing and architecting big data solutions.

· 3+ years of hands-on experience building enterprise-level streaming data solutions with Python, Kafka/Flink, and Iceberg.

ADDITIONAL QUALIFICATIONS

· 8+ years of experience with dimensional modeling and data lake design methodologies.

· 8+ years of experience with relational and non-relational databases (e.g., SQL Server, Cosmos DB).

· 3+ years of experience with readiness, provisioning, security, and best practices on the Azure data platform, including orchestration with Azure Data Factory.

· Experience working with business stakeholders on requirements and use-case analysis.

· Strong communication, collaboration, and creative problem-solving skills.

PREFERRED QUALIFICATIONS

· Bachelor's degree in computer science, or equivalent work experience.

· Experience with Agile/Scrum methodology.

· Experience in the tax and accounting domain a plus.

· Azure Data Engineer certification a plus.

Applicants may be required to appear onsite at a Wolters Kluwer office as part of the recruitment process.