Bengaluru, Karnataka, India
Software Engineer III - ETL Informatica Developer (IDMC)

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.  
As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team that designs and delivers trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

*   Executes software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems
*   Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
*   Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
*   Gathers, analyzes, synthesizes, and develops visualizations and reporting for large, diverse data sets in service of continuous improvement of software applications and systems
*   Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
*   Contributes to software engineering communities of practice and events that explore new and emerging technologies
*   Adds to team culture of diversity, equity, inclusion, and respect

 

Required qualifications, capabilities, and skills

*   Formal training or certification on Unix and Oracle (both SQL and PL/SQL) concepts and 3+ years of applied experience
*   Expertise in Informatica PowerCenter and Informatica Intelligent Data Management Cloud (IDMC) for ETL and data integration
*   Knowledge of data warehousing concepts, schema design, and performance tuning for ETL processes
*   Strong expertise in databases, including SQL and PL/SQL skills for querying, data manipulation, and managing database objects
*   Proficiency with core AWS services (EC2, S3, Lambda, Glue) for data storage, orchestration, and processing
*   Solid understanding of the Databricks platform, including key features, workspace management, and best practices for scalable data analytics
*   Understanding of job scheduling tools (such as AutoSys) to automate and monitor ETL processes
*   Exposure to tools such as JIRA, Confluence, and ServiceNow
*   Strong problem-solving, effective communication, and documentation skills for collaboration with cross-functional teams
*   Python coding skills for data processing, automation, and custom transformations
*   Hands-on experience with Databricks and Spark for large-scale data processing and analytics, including writing and optimizing Spark SQL queries

 

Preferred qualifications, capabilities, and skills


*   Any exposure to Apache NiFi would be an added advantage

 

 
