Hyderabad, Telangana, India
Software Engineer III - PySpark, Databricks

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. 

As a Software Engineer III at JPMorgan Chase within Corporate Technology, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for implementing critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

- Executes software solutions, design, development, and technical troubleshooting with ability to think beyond routine or conventional approaches to build solutions or break down technical problems
- Creates secure and high-quality production code and maintains algorithms that run synchronously with appropriate systems
- Produces architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development
- Gathers, analyzes, synthesizes, and develops visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems
- Proactively identifies hidden problems and patterns in data and uses these insights to drive improvements to coding hygiene and system architecture
- Contributes to software engineering communities of practice and events that explore new and emerging technologies
- Adds to team culture of diversity, equity, inclusion, and respect

Required qualifications, capabilities, and skills

- Formal training or certification on software engineering concepts and 3+ years of applied experience
- Over 4 years of practical experience with Spark, SQL, Databricks, and the AWS cloud ecosystem
- Expertise in Apache NiFi, Lakehouse/Delta Lake architectures, system design, application development, testing, and operational stability
- Strong programming skills in PySpark and SparkSQL
- Proficiency in orchestration using Airflow
- In-depth knowledge of Big Data and data warehousing concepts
- Experience with CI/CD processes
- Solid understanding of agile methodologies, including DevOps practices, application resiliency, and security
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile)

Preferred qualifications, capabilities, and skills

- Familiarity with Snowflake, Terraform, and LLMs
- Exposure to cloud technologies such as AWS Glue, S3, SQS, SNS, and Lambda
- AWS certifications such as SAA (Solutions Architect Associate), Developer Associate, or Data Analytics Specialty, or a Databricks certification