Software Engineer, Calibration (Autonomy)

At Lyft, community is what we are and it’s what we do. It’s what makes us different. To create the best ride for all, we start in our own community by creating an open, inclusive, and diverse organization where all team members are recognized for what they bring.

From day one, Lyft’s mission has been to improve people’s lives with the world’s best transportation. And self-driving cars are critical to that mission: they can make our streets safer, cities greener, and traffic a thing of the past. That’s why we started Level 5, our self-driving division, where we’re building a self-driving system to operate on the Lyft network.

Level 5 is looking for doers and creative problem solvers to join us in developing the leading self-driving system for ridesharing. Our team members come from diverse backgrounds and areas of expertise, and each has the opportunity to have an outsized influence on the future of our technology. Our world-class software and hardware experts work in brand new garages and labs in Palo Alto, California, and offices in London, England and Munich, Germany. And we’re moving at an incredible pace: we’re currently servicing employee rides in our test vehicles on the Lyft app.

As an engineer on the Perception team, you will be responsible for ensuring that all sensors on our autonomous vehicles are calibrated. This involves developing algorithms to estimate the intrinsics and extrinsics of cameras, IMUs, LiDARs, and radars, and using them to calibrate the entire fleet of AVs. You will design calibration algorithms, develop robust calibration software, analyze calibration data, design and implement metrics, and triage calibration-related issues. For this position, we are looking for a software engineer with a general understanding of autonomous vehicles and deep expertise in computer vision, machine learning, 3D geometry, and estimation theory.


  • Work closely with Localization, Mapping, and the rest of the Perception team to drive the requirements on sensor calibration.

  • Design core computer vision algorithms to estimate the calibration of all sensors on our vehicles, including cameras, IMUs, LiDARs, and radars, in both controlled and natural environments.

  • Design and implement end-to-end data pipelines for sensor calibration, and the calibration software that gets deployed on all Lyft AVs.

  • Drive the understanding of calibration quality and its impact on downstream systems.


Experience & Skills:

  • Ability to produce production-quality C++ software.

  • Strong background in mathematics, linear algebra, numerical optimization, geometry, and statistics.

  • Bachelor's degree or higher in Computer Science, Electrical Engineering, Math, Physics or related fields.

  • Ability to work in a fast-paced environment and collaborate across teams and disciplines.

  • Openness to new and different ideas. Ability to evaluate multiple approaches and choose the best one based on first principles.

Nice to Have:

  • 3+ years of experience working in a related role.

  • 5+ years of experience developing in C++ / Python.

  • Hands-on experience applying computer vision, machine learning, or robotics theory to real products.

  • Experience with computer vision techniques such as structure from motion, RANSAC, camera calibration, pose estimation, point cloud registration, etc.

  • Experience with deep learning techniques on images, LiDAR/radar point clouds, etc.
