Senior Data Engineer

Vancouver, Canada

About Skillz:

Skillz is the leading mobile games platform connecting players in fair, fun, and meaningful competition. 

The gaming industry is larger than movies, music, and books, with more than 2.7 billion gamers playing monthly and 10 million developers worldwide. Mobile is the fastest-growing segment of the gaming market, expected to increase from $86 billion last year to $161 billion in 2025. 

As the first publicly traded (NYSE: SKLZ) mobile esports platform, Skillz has pioneered the future of the gaming industry. The Skillz platform helps developers build multi-million dollar franchises by enabling social competition in their games. Leveraging its patented technology, Skillz hosts billions of casual esports tournaments for millions of mobile players worldwide, and distributes millions in prizes each month.

Through its philanthropic initiatives, Skillz has harnessed the power of its platform to transform the way nonprofits engage with donors, enabling anyone with a mobile device to support causes such as the American Red Cross, Susan G. Komen, American Cancer Society, and NAACP by playing in Skillz tournaments.

Skillz has also earned recognition as one of San Francisco Business Times’ Best Places to Work, one of Fast Company’s Most Innovative Companies, one of the Best Companies for Women to Advance, a two-time winner of CNBC’s Disruptor 50, one of Forbes’ Next Billion-Dollar Startups, and the #1 fastest-growing company in America on the Inc. 5000.


What you'll do:

Develop new data systems for Skillz’s online platform and support the data science team in developing and deploying new algorithms for matchmaking and for fraud and cheat detection.

Duties include:

  • Develop new systems that provide real-time streaming analytics and an event-processing pipeline based on a fast-data architecture, handling throughput of millions of events per second.
  • Develop an enterprise-grade data lake to support both business analytics needs and next-generation data infrastructure.
  • Develop a data integration toolkit to build and manage automated, efficient data pipelines that consume data from backend services into a data repository.
  • Support Skillz’s data science team in developing and deploying new algorithms for matchmaking, fraud and cheat detection.
  • Research technical solutions to move large data sets from a variety of sources to formats consumable by reporting systems and analysts.
  • Build infrastructure for monitoring and alerting on data integrity and data-system health.
  • Leverage and contribute to industry best practices, including proper use of source control, code reviews, data validation, and testing.
  • Support Skillz’s product development team in creating new events to measure/track business growth.
  • Create and implement engineering best practices and collaborate in the design of effective streamlined processes with partner teams.


Position Requirements: 

  • 5+ years of experience in data management, data engineering and/or software engineering working with Python, Scala or Java and Cloud technologies (AWS, GCP or Azure)
  • Experience with AWS data products (AWS Data Pipeline, Athena, Pinpoint, S3, etc.)
  • Experience deploying data infrastructure
  • Experience with recognized industry patterns, methodologies, and techniques


Nice To Have Experience:

  • Experience designing, implementing, automating, and maintaining large-scale ETL processes, with proficiency in ETL concepts such as idempotency, retry and backoff strategies, data parsing techniques, invalid-data handling, data staging, and code reuse.
  • Experience working with RDBMS technologies including MySQL.
  • Experience with data warehousing technologies (SQL Server, Snowflake and/or Redshift).
  • Experience building and managing pipeline authoring tools and running active data pipelines in production using Airflow.
  • Experience in developing containerized applications and hosting data systems in container orchestration platforms like Kubernetes.
  • Experience with event streaming and transformation from source to destination using distributed streaming systems such as Apache NiFi or Apache Flink.
  • Experience in ANSI SQL, and with at least two of the following: Python, Java, Scala.
  • Experience in rapidly prototyping a data product from scratch and leading the implementation in production.
  • Experience managing incidents outside business hours and acting autonomously in a production environment to address critical issues.


Job Site: Portland, OR, Las Vegas, NV, Vancouver, BC, Los Angeles, CA, Dallas, TX, Austin, TX, Phoenix, AZ

Please apply via Skillz, Inc.’s career website. This position is part of Skillz, Inc.’s employee referral program and is eligible for an employee referral incentive.

Job ID: 2164713