Bengaluru
Experience Range: 10 to 15 Years
Role Proficiency: Develop data-driven solutions for business challenges using advanced data warehousing techniques and Databricks. Analyze and interpret large data sets, implement data models, and deliver actionable insights to stakeholders.
Must Have Skills: Data Warehousing
Databricks architecture, deployment, and administration
Python
SQL
ETL/ELT processes
Data Modeling Techniques
Medallion Architecture
Data Security Principles
Agile Methodology
Good to Have Skills: Experience in the financial domain
Databricks certification(s)
Experience with cloud platforms (AWS, Azure, GCP)
Experience with data visualization tools such as Tableau or Qlik
Proficiency in Hadoop, Spark, Hive
Experience with version control tools (Git, Bitbucket)
Key Responsibilities: Design and implement data warehouse solutions leveraging Databricks and cloud platforms (AWS, Azure, GCP).
Develop scalable data models and architectures to meet business requirements.
Collaborate with data engineers, developers, and stakeholders to integrate Databricks with application software and data pipelines.
Optimize Databricks performance through efficient data processing, job tuning, and resource management.
Implement data security best practices to safeguard sensitive information.
Monitor and maintain the health of Databricks environments, ensuring proactive issue resolution.
Stay up to date with Databricks best practices, trends, and emerging technologies.
Measures of Outcomes: Number of business processes optimized through data analysis.
Accuracy and efficiency of data models developed.
Successful deployment of Databricks solutions within specified timelines.
Adherence to data security best practices.
Education Qualification: Bachelor’s degree in Computer Science, Information Technology, or a related field.