DevOps Engineer

The PulsePoint Data Engineering team plays a key role in a technology company that's experiencing exponential growth. Our data pipeline processes over 45 billion impressions a day (more than 18 TB of data, 160 TB uncompressed). This data is used to generate reports, update budgets, and drive our optimization engines. We do all this under extremely tight SLAs, providing stats and reports as close to real time as possible.

The most exciting part about working at PulsePoint is the enormous potential for personal and professional growth. We are always seeking new and better tools to meet our challenges, such as adopting proven open-source technologies to make our data infrastructure more nimble, scalable, and robust. Some of the cutting-edge technologies we have recently implemented are Kafka, Spark Streaming, and Mesos.

Role description:

As a DevOps Engineer at PulsePoint, you'll support data infrastructure services such as Hadoop, Vertica, Kafka, and relational databases, from installation and configuration through maintenance and curation. You will work closely with analysts, data scientists, and developers to make data processing transparent and to provide data services that help drive and support business goals.

Team Responsibilities:

  • Installation, maintenance, and monitoring of Kafka, Hadoop, Vertica, and RDBMSs
  • Ingest, validate, and process internal and third-party data
  • Create, maintain, and monitor data flows in Hive, SQL, and Vertica for consistency, accuracy, and lag time
  • Develop and maintain a framework for jobs (primarily aggregation jobs in Hive)
  • Build consumers for data in Kafka, such as Camus for Hadoop, Flume for Vertica, and Spark Streaming for near-real-time aggregation
  • Train Developers/Analysts on tools to pull data
  • Tool evaluation/selection/implementation
  • Backups/Retention/High Availability/Capacity Planning
  • Disaster recovery: we run all core data services in a second data center for complete business continuity
  • Review and approve database DDL, Hive framework jobs, and Spark Streaming code to ensure they follow our standards
  • 24x7 on-call rotation for production support
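To give a flavor of the near-real-time aggregation work mentioned above: the common pattern (which engines like Spark Streaming implement at scale) is to bucket incoming impression events into fixed time windows and emit counts per window. A minimal pure-Python sketch of the idea; the event fields, window size, and function name are illustrative assumptions, not PulsePoint's actual schema:

```python
from collections import defaultdict

WINDOW_SECONDS = 60  # tumbling window size (illustrative choice)

def aggregate_impressions(events):
    """Bucket (timestamp, campaign_id) impression events into fixed
    60-second windows and count impressions per (window, campaign).

    Returns {(window_start, campaign_id): count}.
    """
    counts = defaultdict(int)
    for ts, campaign_id in events:
        # Align the timestamp down to its window boundary.
        window_start = ts - (ts % WINDOW_SECONDS)
        counts[(window_start, campaign_id)] += 1
    return dict(counts)

# Two impressions for campaign "a" in the 60s window, one for "b" in the 180s window.
events = [(100, "a"), (110, "a"), (190, "b")]
print(aggregate_impressions(events))
# {(60, 'a'): 2, (180, 'b'): 1}
```

A streaming engine adds the hard parts this sketch omits: continuous ingestion from Kafka, late/out-of-order events, and fault-tolerant state.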

Technologies We Use:

Chronos - job scheduling

Docker - packaged container images with all dependencies

Graphite/Beacon - monitoring data flows

Hive - SQL data warehouse layer for data in HDFS

Impala - faster SQL layer on top of Hive

Kafka - distributed commit log storage

Marathon - cluster-wide init for Docker containers

Mesos - distributed cluster resource manager

Spark Streaming - near-real-time aggregation

SQL Server - reliable OLTP RDBMS

Sqoop - import/export of data to and from RDBMSs

Vertica - fast parallel data warehouse

Required Skills:

  • 3+ years of experience supporting critical applications on Linux (preferably Red Hat), including OS installation and upgrades, package management, volume management, security auditing, and performance tuning; at least 2 of those years with configuration management and containerization tools
  • Proficiency in at least one scripting language (e.g., Python, Perl, Ruby, Scala)
  • Familiarity with relational databases and SQL
  • Proficiency with configuration management tools such as Ansible, Puppet, or Chef
  • Experience with Docker and Mesos/Kubernetes is a big plus
  • Passion for engineering and computer science around data: you must be self-driven, inquisitive, and hungry to learn and improve
  • Willingness to participate in 24x7 on-call rotation

What you’ll get:

  • Sane work hours (with flexible scheduling)
  • Competitive Salary & 401K Plan Match
  • Generous paid vacation (we consider your birthday a holiday)
  • Sabbatical at 5 years of employment
  • Health & Wellness Fairs
  • The opportunity to partake in our Office Fitness Shape-Up Program
  • Professional training and development budget
  • Annual Company Retreat
  • Complimentary membership to local programs like NYC CitiBike
  • Corporate Discount to New York Sports Club (NYSC)
  • Free team lunches twice a month
  • Team happy hours and beer-o-clock Fridays
  • Awesome snacks: drink bar, coffee bar, ice cream bar, candy bar & fruit bar
  • The opportunity to join our Company Sports Teams
  • Indoor dart wars, PAC-MAN, Basketball and Ping-Pong Tournaments


Meet Some of PulsePoint's Employees

Susie S.

Business Analyst/Data Science

Susie analyzes enormous data sets (over 20 TB a day) to determine optimal adjustments for PulsePoint clients and to consult on new campaign strategies.

Lenny L.

Director, Exchange Team

Lenny ensures the PulsePoint advertising exchange functions quickly and efficiently. PulsePoint analyzes over 20 TB of data every day—over 100 billion ads each month—and Lenny keeps the system running 24/7.