How Zoox Builds Autonomous Vehicles From the Wheels Up
Zoox designed its autonomous robotaxis to take human error out of the experience of hailing a cab.
April 18, 2022
The model of individually owned, human-driven vehicles has never been able to eliminate human error. At Zoox, where I oversee strategic data analysis, my team works to understand and improve the performance of Zoox’s robotaxis and to overcome that problem.
The facts support the need for a better model. In 2020, according to early estimates, about 40,000 people lost their lives on U.S. roadways. Globally, car crashes take the lives of about 1.3 million people every year, which is roughly 3,500 deaths a day, or more than two per minute.
A Future with Robotaxis
Zoox’s approach is to completely reinvent personal transportation. Our vision is an autonomous, fully electric fleet of robotaxis, built for the rider rather than the driver. We’ll handle all the driving (safely, at speeds of up to 75 mph), charging, maintenance, and upgrades for the fleet. Riders will simply use an app to hail a vehicle and provide their destination.
We’re hoping to achieve three things by completely reinventing personal transportation:
Create journeys that people can enjoy
Make the streets a safer place for everyone
Devise a world with fewer cars and less pollution, reducing the climate impact that humans and driving have on the planet
Zoox was one of the first companies to showcase a functioning, purpose-built taxi, and we’re on our way to delivering on our vision of an autonomous robotaxi service.
The Nuts and Bolts
The Zoox robotaxi is a new design from the wheels up. We approached the problem with a blank slate and started by asking ourselves, “What is the best way to keep humans from driving?” The design we came up with has several innovative features, but I want to highlight just four of them:
A custom sensor pod
Big sliding doors
A very powerful onboard computer embedded with AI
Four-wheel steering
Four-wheel steering gives the Zoox a very tight turning radius. The target market for robotaxis is urban environments, many of which have narrow streets and little room to maneuver. Four-wheel steering provides extra maneuverability, as well as more precision in following a trajectory, so the vehicle can work through narrower spaces.
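To see why steering all four wheels tightens the turn, consider a simple kinematic bicycle model. The wheelbase and steering angles below are illustrative assumptions, not Zoox’s actual geometry; the sketch only shows the general effect of counter-steering the rear wheels.

```python
import math

def turning_radius(wheelbase_m, front_steer_deg, rear_steer_deg=0.0):
    """Kinematic bicycle-model estimate of turning radius.

    A negative rear_steer_deg means the rear wheels counter-steer
    (turn opposite the front wheels), which tightens the turn.
    """
    d_f = math.radians(front_steer_deg)
    d_r = math.radians(rear_steer_deg)
    return wheelbase_m / (math.tan(d_f) - math.tan(d_r))

# Illustrative numbers only.
L = 2.5  # assumed wheelbase in meters
front_only = turning_radius(L, 30)        # front wheels steer alone
four_wheel = turning_radius(L, 30, -30)   # rear counter-steers equally
print(f"front-only: {front_only:.2f} m")  # ~4.33 m
print(f"four-wheel: {four_wheel:.2f} m")  # ~2.17 m
```

With equal and opposite rear steering, the radius is roughly halved, which is the maneuverability gain the article describes for narrow urban streets.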
The sensor pod has a unique architecture that combines cameras, radar, lidar, and long-wave infrared. All of that is custom, and we can place it on the vehicle wherever we like, because we created the vehicle design.
The vehicle is also equipped with next-generation hardware designed specifically to power the onboard AI driver. The Zoox robotaxi is also compact, which helps in navigating tight spaces.
Another feature is bidirectionality. The vehicle can pull into a spot and then pull right out, since there’s no real differentiation between front and back.
In designing the hardware, we’ve rethought what a vehicle can look like and how a robot interacts with the humans it encounters on the road, using both sound and light to communicate with other road users.
Keeping People Safe
Safety is essential in autonomous vehicles, and we take a proactive approach. More than 100 safety innovations have gone into the Zoox, including a novel airbag system. A Zoox has two seats on one side of the passenger compartment and two facing seats on the other side. This design necessitated a completely new airbag system that separates the passengers and keeps everyone protected in the event of a crash.
With our custom hardware, we can simplify some of the software design choices. We have overlapping fields of view for the sensors, resulting in redundancy around the vehicle. That means if one of the sensor pods fails, the vehicle will retain the ability to perceive the environment in full and can continue to drive safely.
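The redundancy claim can be sketched as a simple coverage check: if every bearing around the vehicle falls inside at least two sensors’ fields of view, losing any single sensor still leaves full 360-degree perception. The pod layout and field-of-view numbers below are hypothetical, not Zoox’s actual specification.

```python
def coverage_count(pods, bearing_deg):
    """Count how many sensor pods can see a given bearing.

    Each pod is a (center_deg, fov_deg) pair describing the
    direction it faces and the width of its field of view.
    """
    count = 0
    for center, fov in pods:
        # Angular distance from pod center to bearing, wrapped to [0, 180].
        diff = abs((bearing_deg - center + 180) % 360 - 180)
        if diff <= fov / 2:
            count += 1
    return count

# Hypothetical layout: four corner pods, each with a 270-degree FOV.
pods = [(45, 270), (135, 270), (225, 270), (315, 270)]

# Every bearing is seen by at least two pods, so any single pod
# can fail and the vehicle still perceives the full environment.
assert all(coverage_count(pods, b) >= 2 for b in range(360))
```

A check like this can run at design time over candidate sensor placements, which is one advantage of controlling the vehicle design and sensor layout together.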
The vehicle can see up to about 150 meters (164 yards) out and sometimes even around corners. And, of course, if we can predict a scenario occurring, we can react to it with more time and space than a traditional vehicle would have.
When people think of the AI stack, they think of the perception modules, the prediction modules, and the planning and control components, among many other things. There are also calibration, localization, mapping, simulation, and data pipelines; a lot happens on the software side behind the scenes.
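The core driving modules named above feed one another in a loop each control cycle. The toy sketch below illustrates that flow; the function bodies are placeholder stubs and the signatures are assumptions, not Zoox’s code.

```python
# Placeholder stubs standing in for the real modules (illustrative only).
def localize(frame, map_data):
    return {"x": 0.0, "y": 0.0, "heading": 0.0}

def perceive(frame):
    return [{"type": "pedestrian", "x": 12.0}]

def predict(objects):
    return [{**o, "future_x": o["x"] + 1.4} for o in objects]

def plan(pose, forecasts, map_data):
    return ["hold_lane", "slow_down"]

def control(trajectory):
    return {"steer_deg": 0.0, "brake": 0.2}

def drive_cycle(sensor_frame, map_data):
    pose = localize(sensor_frame, map_data)       # where is the vehicle?
    objects = perceive(sensor_frame)              # what is around it?
    forecasts = predict(objects)                  # where will each object go?
    trajectory = plan(pose, forecasts, map_data)  # what should the vehicle do?
    return control(trajectory)                    # low-level actuation commands

print(drive_cycle(sensor_frame={}, map_data={}))
```

Calibration, mapping, and the data pipelines sit around this loop, supplying the inputs each stage depends on.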
Help from a Tele-Operator
We developed something we call “tele-guidance,” which is our way of letting the AI system say, “I don't know how to handle this particular situation. I need some help.” The vehicle can place a call to human tele-operators, who can look at what the vehicle is seeing and suggest alternate paths for it to get itself unstuck from a situation.
And every time a Zoox calls tele-guidance, we can learn from that new experience and the AI stack can improve. As the AI stack improves, we will need tele-guidance less and less over the years.
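The escalation pattern described here, where the AI asks for help when it is unsure, can be sketched as a confidence-gated dispatch. This is a toy illustration of the idea only; the class, threshold, and return values are assumptions, not Zoox’s actual tele-guidance protocol.

```python
from dataclasses import dataclass, field

@dataclass
class TeleGuidanceClient:
    """Toy confidence-gated escalation loop (illustrative only)."""
    confidence_threshold: float = 0.8
    escalations: list = field(default_factory=list)

    def choose_path(self, scenario, planner_confidence):
        if planner_confidence >= self.confidence_threshold:
            return "autonomous_plan"
        # Below threshold: record the scenario (future training data)
        # and ask a human tele-operator to suggest an alternate path.
        self.escalations.append(scenario)
        return "request_human_guidance"

client = TeleGuidanceClient()
print(client.choose_path("clear_intersection", 0.95))   # autonomous_plan
print(client.choose_path("double_parked_truck", 0.40))  # request_human_guidance
```

The logged escalations are what makes the loop self-improving: each one becomes an example the stack can learn from, so the threshold is crossed less often over time.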
Simulation is used quite a bit because it’s the best way to test changes to the AI. The capabilities we’ve added to the simulation stack allow us to test variations of scenarios at scale.
We use a lot of simulation to validate and assert that the AI stack can perform within the safety margins that we’ve set for it.
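Testing "variations of scenarios at scale" typically means expanding one hand-authored scenario across a grid of parameters. The sketch below shows that pattern; the scenario fields and parameter names are hypothetical examples, not Zoox’s simulation format.

```python
import itertools

# Hypothetical base scenario and parameter sweep (illustrative only).
base_scenario = {"map": "downtown_grid", "event": "pedestrian_crossing"}
variations = {
    "pedestrian_speed_mps": [0.8, 1.4, 2.0],
    "time_of_day": ["noon", "dusk", "night"],
    "occlusion": [False, True],
}

def generate_scenarios(base, params):
    """Expand one scenario into every combination of parameter values."""
    keys = list(params)
    for values in itertools.product(*(params[k] for k in keys)):
        yield {**base, **dict(zip(keys, values))}

scenarios = list(generate_scenarios(base_scenario, variations))
print(len(scenarios))  # 3 * 3 * 2 = 18 variations from one scenario
```

Each generated variation can then be run against the AI stack and checked against the safety margins described above.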
The Big Picture
Over the next couple of years, we will continue testing our fleet in San Francisco, Seattle, and Las Vegas. We’ll also be scaling our simulation. At the heart of all of that is the data science and engineering team.
We believe the model of individually owned, human-driven vehicles is fundamentally broken and needs to be completely rethought. Our approach is to redesign the vehicle, and indeed the very concept of a vehicle, from the wheels up. Our vision is to provide a fleet of autonomous robotaxis that people can use in complete confidence about their safety.