Data Engineer

About Us

We all love what we do, and we all love doing it together, because we demand the best people to make the best Venmo.

Venmo was founded on the principle of breaking down the foreboding walls of financial transactions to make them intuitive and even fun with friends. And it worked: people love sending money with Venmo.

But we’re not done. We want to take that magic of sending money with Venmo and cascade it into every place you use your money. We want to connect the people of the world with their money, in an intuitive way, then connect them with each other in a genuine way.

All that’s going to take a lot of figuring out. Let’s figure it out together.

Engineering at Venmo

At Venmo, we are creating a product that people love. We strive to create a delightful user experience while connecting the world and empowering people through payments.  We are looking for intellectually curious people who want to be inspired and inspire others to change the world.

Engineering is a craft, and at Venmo we want the internals of our software to be as elegant as the end-user experience we are designing. We spend our days scaling our infrastructure and building new features to meet and exceed our users' needs and wants. We teach and learn from one another, and push each other to be at our creative and analytical best.

Data Engineer

As a Data Engineer at Venmo, you’ll be part of the Data & Analytics team, helping us leverage our rich and interesting dataset to make smarter decisions. You’ll build out ETL pipelines, interface with various SQL & NoSQL databases, and work on a modern stack. In this role you’ll be part of a forward-thinking team, learn a lot about user behavior and Venmo as a business, and have an impact on business and production decisions.

If you love solving problems at scale, designing elegant data models, implementing resilient data pipelines, and making an impact through data, come join our team!

Things you'll do as a Data Engineer at Venmo:

  • Work on next-gen data pipelining and ETL frameworks based on Luigi, Python and AWS
  • Communicate with business users to solve problems and build and automate custom reports
  • Develop and improve our internal dashboards in Looker, and evangelize data across the company
  • Develop a framework for data quality and outlier detection
  • Understand the mechanics of Amazon Redshift and maintain our database cluster
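The data-quality and outlier-detection work above could start as simply as flagging metrics that drift far from their recent history. Here is a minimal z-score sketch in Python using only the standard library; the function name, threshold, and sample numbers are illustrative assumptions, not Venmo's actual framework:

```python
import statistics

def find_outliers(values, z_threshold=2.0):
    """Flag values more than z_threshold standard deviations from the mean.

    A simple z-score check -- illustrative only. Real pipelines would
    typically use robust statistics (e.g. median/MAD) and per-metric
    thresholds, since a single large spike inflates the sample stdev.
    """
    mean = statistics.fmean(values)
    stdev = statistics.stdev(values)
    if stdev == 0:
        return []  # all values identical; nothing to flag
    return [v for v in values if abs(v - mean) / stdev > z_threshold]

# Hypothetical daily payment counts with one anomalous spike:
daily_counts = [1020, 998, 1011, 1005, 987, 5400, 1002]
print(find_outliers(daily_counts))  # prints [5400]
```

A check like this would typically run as the final task in an ETL pipeline (in Luigi, a downstream task depending on the load step), so bad data is flagged before reports are built on top of it.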

Experience

  • 2+ years of experience in data engineering or a related field
  • Bachelor's or Master’s degree in Computer Science or a related field of study
  • Fluency in SQL and Python or other scripting languages
  • Experience with Luigi, Airflow, or other ETL frameworks is a big plus
  • Passion for data-driven decision making
  • Strong communication skills with the ability to understand and explain technical issues to a non-technical audience

 

