Lead and manage end-to-end data pipeline projects leveraging Databricks and Azure Data Factory.
Collaborate with cross-functional teams including data engineers, analysts, and business stakeholders to gather requirements and deliver data products.
Ensure thorough unit and system testing of data pipelines to verify that all files and transactions are captured.
Monitor and troubleshoot data ingestion processes, ensuring accuracy and completeness of the data.
Facilitate Agile ceremonies (sprint planning, standups, retrospectives) and ensure the team adheres to Agile best practices.
Define, prioritize, and manage product backlogs based on stakeholder inputs and data insights.
Drive continuous improvement by evaluating emerging technologies and optimizing data integration workflows.
Must-Have Skills:
Databricks – Minimum 3 years of hands-on experience.
Azure Data Factory – Minimum 3 years of development and orchestration experience.
CQL (Cassandra Query Language) – Minimum 5 years of hands-on experience, with a strong understanding and practical application.
Strong verbal communication skills – Ability to collaborate effectively with technical and non-technical teams.
Deep understanding of the Agile development process.
Good-to-Have Skills:
Experience in Azure Data Lake, Synapse Analytics, or Power BI.
Familiarity with DevOps and CI/CD pipelines in Azure.
Exposure to Data Governance and Data Quality frameworks.
Experience with cloud security best practices related to data handling.
Background in Business Intelligence (BI) and reporting systems.