Software Engineer II - Python/PySpark, SQL, AWS, Databricks
You're ready to gain the skills and experience needed to grow within your role and advance your career - and we have the perfect software engineering opportunity for you.
As a Software Engineer II at JPMorganChase within Employee Platforms, you are part of an agile team that works to enhance, design, and deliver the software components of the firm's state-of-the-art technology products in a secure, stable, and scalable way. As an emerging member of a software engineering team, you execute software solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.
Job responsibilities
- Executes standard software solutions, design, development, and technical troubleshooting
- Writes secure and high-quality code using the syntax of at least one programming language with limited guidance
- Designs, develops, codes, and troubleshoots with consideration of upstream and downstream systems and technical implications
- Applies knowledge of tools within the Software Development Life Cycle toolchain to improve the value realized by automation
- Applies technical troubleshooting to break down solutions and solve technical problems of basic complexity
- Gathers, analyzes, and draws conclusions from large, diverse data sets to identify problems and contribute to decision-making in service of secure, stable application development
- Learns and applies system processes, methodologies, and skills for the development of secure, stable code and systems
- Adds to team culture of diversity, opportunity, inclusion, and respect
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 3+ years of applied experience.
- Experience with Databricks: Proficient in building and managing data pipelines, notebooks, and workflows within the Databricks environment.
- Expertise in AWS Data Services: Hands-on experience with AWS data tools such as S3, Glue, Kinesis Firehose, ECS, Lambda, and related services for data storage, processing, and integration.
- Strong SQL Skills: Advanced ability to write and optimize SQL queries for data analysis and transformation within Databricks and AWS databases.
- Strong Python and PySpark Skills: Advanced ability to develop data processing solutions using Python and PySpark for large-scale data transformation and analysis (a brief illustrative sketch follows this list).
- Exposure to agile methodologies and practices such as CI/CD, Application Resiliency, and Security
- Emerging knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.)
- Experience with data modeling, ETL pipeline development, and data transformation.
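
For illustration only, the sketch below shows the general shape of the PySpark, SQL, and Databricks work these qualifications describe: reading raw data from S3, transforming it with PySpark, persisting it as a Delta table, and querying it with SQL. It is a minimal, hypothetical example; every bucket, path, table, and column name is a placeholder, not part of any actual JPMorganChase system.

```python
# Minimal, hypothetical sketch of the kind of PySpark/SQL ETL step described above.
# Bucket names, paths, table names, and columns are illustrative placeholders only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw JSON events landed in S3 (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/events/")

# Transform: drop malformed rows, derive a date column, aggregate per day.
daily_counts = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
       .groupBy("event_date", "event_type")
       .count()
)

# Persist as a Delta table so downstream Databricks notebooks and SQL can use it.
daily_counts.write.format("delta").mode("overwrite").saveAsTable(
    "analytics.daily_event_counts"
)

# The same result is then queryable with plain SQL inside Databricks.
spark.sql(
    "SELECT event_date, SUM(count) AS events "
    "FROM analytics.daily_event_counts GROUP BY event_date"
).show()
```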
Preferred qualifications, capabilities, and skills
- Knowledge of big data technologies such as Apache Spark and Kafka.
- Exposure to full-stack development, including front-end frameworks, is a plus.
Perks and Benefits
- Health and Wellness
- Parental Benefits
- Work Flexibility
- Office Life and Perks
- Vacation and Time Off
- Financial and Retirement
- Professional Development
- Diversity and Inclusion