
Software Engineer III - Python, LLM, AWS

Posted yesterday, Hyderabad, India

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.

As a Software Engineer III at JPMorganChase within Consumer & Community Banking, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for delivering critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

Job responsibilities

  • Work with large-scale datasets, performing advanced queries and calculations to extract, transform, and analyze data.
  • Design, develop, and optimize ETL pipelines using AWS Glue, PySpark, and Databricks.
  • Link and integrate processed data to downstream tables and systems via AWS Glue workflows.
  • Integrate AI/ML models (SageMaker or custom) into data workflows and production pipelines.
  • Utilize AWS services including S3, Lambda, Redshift, Athena, Step Functions, MSK, EKS, and Data Lake architectures.
  • Design and implement scalable data models, data warehousing, and data lake solutions.
  • Collaborate with data scientists, engineers, and business stakeholders to deliver high-quality data solutions.
  • Use version control (Git) and CI/CD pipelines for efficient development and deployment.
  • Leverage AI agents and tools (e.g., Co-Pilot) to enhance productivity, code quality, and problem-solving.
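The extract-transform-load work described above can be sketched in plain Python. This is an illustrative example only: in a real role these steps would run as AWS Glue jobs over PySpark DataFrames rather than Python lists, and the record shapes and helper names (`RAW_RECORDS`, `transform_record`, `run_pipeline`) are hypothetical, not drawn from the posting.

```python
import json

# Hypothetical, simplified stand-in for a Glue/PySpark ETL step:
# extract raw records, filter and normalize them, and emit rows
# shaped for a downstream table.

RAW_RECORDS = [
    {"account_id": "A1", "balance": "1200.50", "status": "active"},
    {"account_id": "A2", "balance": "85.00", "status": "closed"},
    {"account_id": "A3", "balance": "430.25", "status": "active"},
]

def transform_record(rec):
    """Keep only active accounts and cast balance to float."""
    if rec["status"] != "active":
        return None
    return {"account_id": rec["account_id"], "balance": float(rec["balance"])}

def run_pipeline(records):
    # Transform step: drop closed accounts, normalize types.
    return [row for rec in records
            if (row := transform_record(rec)) is not None]

rows = run_pipeline(RAW_RECORDS)
print(json.dumps(rows))
```

At Glue/PySpark scale, the same shape becomes a `filter` plus `withColumn` cast on a DataFrame, with the result written to a downstream table via a Glue workflow.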


Required qualifications, capabilities, and skills

  • Formal training or certification on software engineering concepts and 3+ years of applied experience.
  • Hands-on experience with AWS Glue (ETL jobs, crawlers, workflows), including linking data to downstream tables.
  • Advanced skills in writing and optimizing queries and calculations on large datasets.
  • Experience with PySpark and distributed data processing, plus strong programming skills in Python.
  • Familiarity with AWS services: S3, Lambda, Redshift, Athena, Step Functions.
  • Experience with version control (Git) and CI/CD pipelines.
  • Hands-on experience with AWS services and cloud-based data solutions, with strong coding capabilities in PySpark, Python, and Databricks.
  • Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).
  • Ability to leverage AI agents and tools, such as Co-Pilot, to enhance productivity and code quality.

Preferred qualifications, capabilities, and skills

  • UI/UX development experience with React.
  • Proficiency with the broader AWS ecosystem; AWS certification preferred.
  • Experience integrating AI/ML models (SageMaker or custom models) into data pipelines is a plus.

Client-provided location(s): Hyderabad, India
Job ID: JPMorgan-210691887
Employment Type: Full-time
Posted: 2025-12-23T19:04:38

Perks and Benefits

  • Health and Wellness
  • Parental Benefits
  • Work Flexibility
  • Office Life and Perks
  • Vacation and Time Off
  • Financial and Retirement
  • Professional Development
  • Diversity and Inclusion