What happens next?

Predicting what is about to happen is one of the toughest challenges to solve in autonomous transportation. Around the next bend, you might encounter a pothole, or a pickup truck with long pieces of timber hanging off the back, or a semi-trailer stuck in an intersection.

Some scenarios are common. Others will only come up very occasionally. It’s important that every driver on the road is ready to react appropriately to both. Zoox is no different.

We have two fundamental tools that help us prepare our autonomous vehicle for real-world driving scenarios: simulation and structured testing. Together, they form an iterative loop, enabling us to test and refine how our vehicle responds to just about everything.

Our structured testing process

Scenarios to test

Simulation is powerful because it’s fast and flexible. It allows us to exercise parts of the software stack in a highly scalable fashion very early in the development process, and it exposes potential safety risks before they can manifest in the real world. Engineers at Zoox use a web editor tool to create and then simulate countless driving scenarios.

Those scenarios generally fall into three categories:

  1. Engineered scenarios. Scenarios created by our engineers that we think could tell us something useful about our vehicle’s behavior.
  2. Log-based scenarios. Particular situations that our software may struggle with, based on data from our L3 test fleet that is currently driving autonomously around San Francisco and Las Vegas.
  3. System-generated scenarios. We also have systems that help us procedurally discover and test novel scenarios that our engineers may not have thought of or our vehicle logs may not have encountered.

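To make the taxonomy above concrete, here is a minimal sketch of how a scenario and its source might be represented. This is purely illustrative; the class names, fields, and `ScenarioSource` labels are assumptions for the sake of the example, not Zoox's actual scenario format.

```python
from dataclasses import dataclass, field
from enum import Enum

class ScenarioSource(Enum):
    ENGINEERED = "engineered"        # hand-authored by engineers
    LOG_BASED = "log_based"          # reconstructed from fleet driving logs
    SYSTEM_GENERATED = "generated"   # procedurally discovered by tooling

@dataclass
class Actor:
    kind: str            # e.g. "pedestrian", "semi_trailer"
    speed_mps: float     # initial speed in meters per second
    heading_deg: float   # initial heading in degrees

@dataclass
class Scenario:
    name: str
    source: ScenarioSource
    actors: list = field(default_factory=list)

# Example: an engineered scenario with a semi-trailer stuck in an intersection
scenario = Scenario(
    name="semi_trailer_blocking_intersection",
    source=ScenarioSource.ENGINEERED,
    actors=[Actor(kind="semi_trailer", speed_mps=0.0, heading_deg=90.0)],
)
print(scenario.source.value)  # -> engineered
```

Tagging each scenario with its source makes it easy to track which discovery channel (engineers, fleet logs, or procedural generation) is surfacing the most useful test cases.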
These scenarios cover a vast range of driving situations, from high-speed oncoming traffic to busy parking lots, burst fire hydrants, and garbage trucks scattering debris in the road. They are handed off to our Simulation team, which creates and runs the actual simulations.

Some of our Perception and Simulation crew at work.

Simulation gives us a glimpse of how our vehicle will perform in the real world—but it can never tell us exactly what will happen. That brings us to the next step in the process: rigorous structured testing on our private test track.

Testing on our track

Extensive structured testing is essential to delivering on our safety mission. That testing happens at Altamont, our private test track.

Our facility has three main test surfaces. The test track itself allows us to test how the vehicle behaves at higher speeds. There’s also the inner loop, which features a series of intersections that reflect typical city streets, and a test pad. The test pad is useful for scenarios that require space to explore safely—for example, tests that involve a lot of lateral movement.

A bird's-eye view of our private track.

Learning together

As you might imagine, operating an autonomous vehicle safely during a test comes with a lot of complexity. This process is led by our AlphaOps team. We’ll typically have a software operator monitoring vehicle health and AI behavior from inside the vehicle, as well as additional external operators who can bring the vehicle to a safe stop if they need to.

Each structured test presents a treasure trove of information. We can evaluate whether the vehicle behaves as expected, relative to both our simulations and our design and engineering intent. But we measure other things too: how the vehicle performs under acceleration or deceleration, how close it comes to other vehicles and obstacles, how it is affected by weather conditions, and so on. All of this data is collected and analyzed by our System Design and Mission Assurance (SDMA) team.
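As a rough illustration of the kind of post-test analysis described above, the sketch below computes two of the mentioned metrics—peak deceleration and closest approach to an obstacle—from a simplified run log. The log format and function are hypothetical, not SDMA's actual tooling.

```python
def summarize_run(log):
    """Summarize a test run.

    log: list of (t_seconds, speed_mps, nearest_obstacle_m) samples,
    ordered by time.
    """
    peak_decel = 0.0
    # Deceleration between consecutive samples: drop in speed over elapsed time
    for (t0, v0, _), (t1, v1, _) in zip(log, log[1:]):
        decel = (v0 - v1) / (t1 - t0)
        peak_decel = max(peak_decel, decel)
    # Closest the vehicle came to any tracked obstacle during the run
    closest = min(d for _, _, d in log)
    return {"peak_decel_mps2": peak_decel, "closest_approach_m": closest}

# A short, made-up run: braking from 10 m/s to a stop near an obstacle
run = [(0.0, 10.0, 12.0), (1.0, 8.0, 8.5), (2.0, 4.0, 6.0), (3.0, 0.0, 5.5)]
print(summarize_run(run))
# -> {'peak_decel_mps2': 4.0, 'closest_approach_m': 5.5}
```

Summaries like this can be compared directly against the corresponding simulation's predictions, which is what closes the loop between simulated and real-world behavior.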

This information helps us define further simulations and, in turn, further structured tests. It’s an iterative loop that enables us to improve our software and, ultimately, to get our vehicle ready for public roads.

The process requires deep collaboration between different teams at Zoox. Our Simulation team works with engineers on our Perception team to define scenarios. Our Program Management Office (PMO) team schedules track tests and organizes logistics. The SDMA team runs our track tests and analyzes the results, while AlphaOps operates the test vehicles. Everything we learn is fed back to our software teams, who iterate their code accordingly; new code is carefully checked by Quality Assurance before it’s deployed in the next version of the vehicle software. Then, the process begins again.

If this process seems meticulous, there’s a good reason for that. Safety is foundational to all that we do at Zoox, and when it comes to safety, there are no shortcuts. Our structured testing process has been carefully designed to set a new bar in safety—for our crew today, and for our riders tomorrow.

Prepping the soft car for the track.