Software Engineer, Payments Analytics - Apple Media Products

    • Cupertino, CA


Posted: Jun 10, 2020

Weekly Hours: 40

Role Number: 200167105

Imagine what you could do here. At Apple, new ideas have a way of becoming phenomenal products, services, and customer experiences very quickly. Bring passion and dedication to your job and there's no telling what you could accomplish. Are you interested in a career in data? Data drives the direction and strategy for Apple's growing Services business. As the Payments Analytics Engineering team, we collect, curate, and provide insights into payments data from Apple's Services, Online Store, and Retail organizations. This data plays a meaningful role in enabling business growth. We build data pipelines with maximum efficiency, scalability, and reliability so that domain-specific engineers can focus on their specialties. Are you passionate about presenting data effectively? Curious how data is manipulated in real time? Want to apply groundbreaking machine learning techniques to detect anomalies in a sea of data? This is the team to join. You'll be exposed to modern, open-source technologies that are standard in the big data industry, and you will work with data at a scale that few organizations in the world have access to. Our data is used to provide a meaningful customer experience when buying Apple products and to optimize various payment methods for Apple. We've even built data products and visualizations that are consumed by Apple's external payments partners.

Key Qualifications

  • Demonstrated strength in data management and automation in Spark, Hadoop, and HDFS environments
  • Experience designing and developing large-scale, real-time streaming pipelines using Kafka, Spark Streaming, or Flume
  • Experience building and deploying large-scale applications in cloud-based environments
  • Experience handling data in relational databases and developing ETL pipelines
  • Proficiency in Java and the Spring/Spring Boot frameworks, in other JVM languages such as Scala, or in a similar object-oriented language such as C#
  • Deep understanding of, and strong hands-on experience with, multithreading and networking (including non-blocking I/O)
  • Confidence with relational databases like Oracle and NoSQL databases like Cassandra
  • Experience driving product features, functional specifications, and development schedules, and representing the team and its technology
  • Passion for, and prior experience in, designing and implementing outstanding large distributed systems
  • Advocacy and drive for performance optimization, automation, and unit testing
  • Ability to pick up new technologies quickly
  • Excellent debugging, critical thinking, and interpersonal skills
  • Dedicated attention to detail
  • Proven documentation and technical writing skills


The Payments Analytics team is in charge of collecting, analyzing, and reporting on Payments, Apple Pay/Card, and Gift Cards data. From this data we generate insights into how customers interact with Payments products and services, and we use these insights to drive improvements to user-facing features. You will work with a dynamic team that values cooperation and brainstorming, with an emphasis on optimized design. You are accountable for developing systems, tools, and visualizations that make sense of the data. We are looking for a sharp engineer who also has a keen sense of how to build quality, scalable products. You are also a teammate -- ready to engage in lively design discussions, and able to give and receive constructive code reviews. Your curiosity drives you to explore new technologies and apply creative solutions to problems. The ideal candidate pays close attention to details, but also keeps sight of the bigger picture. We're a diverse collection of thinkers and doers, continually reimagining our products, systems, and practices to help people do what they love in new ways. This is a deeply collaborative place, where everything we build is the result of people in different roles and teams working together to make each other's ideas stronger. The same passion for innovation that goes into our products also applies to our practices, strengthening our dedication to leave the world better than we found it.

Education & Experience

BS or MS degree in Computer Science or a related field

Additional Requirements

  • Proficiency with source control systems (SVN, Git) and build tools such as Gradle, Maven, etc.
  • Proven, hands-on experience with the big data ecosystem (Spark, Hadoop, Hive, Pig, etc.)
  • Experience building and deploying large-scale data pipelines (e.g., Kafka, Spark, Storm)
  • Understanding of different data storage solutions and when to use them (e.g., RDBMS, Cassandra, Solr, Redis)
  • Experience implementing and administering logging, telemetry, and monitoring tools like Splunk is a plus
  • Experience with cluster management/orchestration software like Aurora or Ansible, and with tools such as Docker, is a plus
  • Passion for being part of a tight-knit, fast-moving big data team