Hyderabad, Telangana, India
1 day ago
Software Engineer II - Databricks, PySpark, Cloud + ETL

You’re ready to gain the skills and experience needed to grow within your role and advance your career — and we have the perfect software engineering opportunity for you.

As a Software Engineer II at JPMorganChase within Corporate Technology, you are part of an agile team that works to enhance, design, and deliver the software components of the firm’s state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Job responsibilities

- Supports review of controls to ensure sufficient protection of enterprise data
- Advises on and makes custom configuration changes in one or two tools to generate a product at the business's or customer's request
- Updates logical or physical data models based on new use cases
- Frequently uses SQL and understands NoSQL databases and their niche in the marketplace
- Develops the data pipelines required to move data from on-premises systems to AWS/cloud platforms
- Performs user acceptance testing and delivers demos to stakeholders using SQL queries or Python scripts
- Performs data analysis to define and support model development, including metadata and data dictionary documentation that enables data analysis and analytical exploration
- Participates in strategic projects, contributing ideas on how to leverage quantitative analytics to generate actionable business insights and solutions that influence business strategies and identify growth opportunities
- Partners closely with business partners to identify impactful projects, influence key decisions with data, and ensure client satisfaction
- Adds to team culture of diversity, equity, inclusion, and respect
- Works on innovative solutions using modern technologies and products to enhance customer experiences
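The pipeline responsibilities above center on extract-transform-load steps. A minimal sketch of that shape in plain Python (the file format, field names, and filter rule here are hypothetical; an actual on-prem-to-AWS pipeline in this role would more likely use PySpark or an AWS service than the standard library):

```python
import csv
import io
import json

# Hypothetical on-prem extract: a CSV export of account balances.
RAW_CSV = """account_id,balance,currency
A-100,2500.00,USD
A-101,910.50,USD
A-102,13.75,EUR
"""

def extract(source: str) -> list[dict]:
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list[dict]) -> list[dict]:
    """Cast types and keep only USD accounts (an illustrative rule)."""
    return [
        {"account_id": r["account_id"], "balance": float(r["balance"])}
        for r in rows
        if r["currency"] == "USD"
    ]

def load(rows: list[dict]) -> str:
    """Serialize to JSON Lines, a common landing format for cloud targets."""
    return "\n".join(json.dumps(r) for r in rows)

pipeline_output = load(transform(extract(RAW_CSV)))
print(pipeline_output)
```

The same extract/transform/load split carries over directly once the stdlib calls are swapped for a distributed engine.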

 

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 2+ years of applied, hands-on development experience, including knowledge of the cloud, preferably AWS
- Hands-on experience migrating relational databases to NoSQL/big data platforms in the cloud
- Experience across the data lifecycle
- Advanced SQL skills (e.g., joins, aggregations, and SQL analytical functions)
- Hands-on experience handling JSON data in SQL
- Working understanding of NoSQL databases such as TigerGraph, MongoDB, or any other NoSQL DB
- Hands-on experience building big data warehouses
- Hands-on experience with cloud computing on AWS
- Experience with query processing and report tuning
- Experience with ETL and real-time data processing
- Experience with big data technologies such as PySpark
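Two of the SQL skills listed above — analytical (window) functions and handling JSON data in SQL — can be illustrated together. This sketch uses Python's built-in sqlite3 purely for self-containment (the table, columns, and figures are invented, and the role would more likely involve a warehouse or Spark SQL engine; `json_extract` and window functions as used here assume a reasonably recent SQLite):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE trades (trade_id INTEGER, desk TEXT, payload TEXT);
INSERT INTO trades VALUES
  (1, 'rates',  '{"notional": 500, "ccy": "USD"}'),
  (2, 'rates',  '{"notional": 300, "ccy": "USD"}'),
  (3, 'credit', '{"notional": 900, "ccy": "EUR"}');
""")

# json_extract pulls a field out of the JSON payload column;
# SUM(...) OVER (PARTITION BY ...) is an analytical (window) function
# that totals per desk without collapsing the rows.
rows = conn.execute("""
SELECT trade_id,
       desk,
       json_extract(payload, '$.notional')        AS notional,
       SUM(json_extract(payload, '$.notional'))
           OVER (PARTITION BY desk)               AS desk_total
FROM trades
ORDER BY trade_id
""").fetchall()

for row in rows:
    print(row)
```

Unlike a `GROUP BY`, the window function keeps one output row per trade while still exposing the per-desk aggregate.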

Preferred qualifications, capabilities, and skills

- 1-2 years of Databricks experience
- 3-4 years of PySpark experience
- 3-4 years of ETL, data warehouse, and lakehouse experience
