Parrot Analytics

Senior Software Engineer - API & Data Platform

Auckland, New Zealand

We love TV. We believe in the magic of content, and its impact on people's lives.

We are a data science company that empowers media companies, brands and agencies to understand global audience demand for television content. Wielding the world's largest audience behaviour data sets, the company has developed the world's only global cross-platform, country-specific Content Demand Measurement System.

About the Role:

As a Senior Software Engineer on the Data Platform team, you will build and maintain critical platform and infrastructure, predominantly on the AWS cloud, enabling our team to develop the next generation of product modules and features that define cutting-edge innovation. This will allow us to continue solving the TV industry's biggest challenge: accurate measurement of TV content in a digital world.

You will work collaboratively with a team of world-class engineers, data scientists and industry insights analysts to understand what they need to get their jobs done. You care deeply about the accuracy, availability and security of our core data platform, while being highly passionate about monitoring metrics and adopting new technologies relevant to the platform and business.

You are an ambitious leader in your own right, a self-starter and an organized teammate who thrives in a fast-growing, dynamic, results-driven environment. If you want to be known for playing a key role in building and maintaining the technology behind the global entertainment industry's first-ever measure of content popularity, then this role is the best fit for you.

We keep our culture at the core of what we do and why we do it. We operate on a high level of trust and look forward to welcoming your contribution to the team.

Requirements

  • Empathy for customers, teammates, and other stakeholders
  • Strong foundation in computer science with 8+ years of software development experience
  • Highly confident in object-oriented programming language(s). We code primarily in Java/RxJava, Scala and Python, but we believe that engineers with deep fundamentals can pick up new languages relatively quickly.
  • Extensive experience serving data via RESTful APIs using frameworks such as Spring Boot, Node.js or AWS Amplify/Lambda. Experience building high-throughput applications, both in number of calls and in volume of load processing, is a MUST
  • Highly competent in SQL databases (MySQL, PostgreSQL), including ORM frameworks and performance-tuning techniques
  • Experience working in cloud production environments such as GCP, AWS or Azure.
  • A hands-on problem-solver who loves working as part of a team, strives for continuous improvement and is fascinated by data (curious about extracting insights from it)
  • Excellent team player with strong communication skills: you need to be a great listener who understands the needs of the business, can convert them into clear, easily understood specifications, and can present these across teams

It will be advantageous to have:
    • Exposure to designing and/or consuming GraphQL APIs
    • Production experience with AWS services around big data, such as CloudWatch monitoring, DynamoDB, Elasticsearch, Redis or Memcached
    • Experience in building highly scalable architectures: designing, operating and supporting secure, high-performance, high-availability, large-scale, distributed systems
    • Understanding of and exposure to Big Data technologies such as Apache Hadoop, Spark, Kinesis, Data Pipeline, Elastic MapReduce (EMR), Athena, S3, EC2 and Auto Scaling
    • Production experience with Kubernetes