Halfway Through the 2030 Self-Driving Car Bet

April 16, 2026

In March 2022, software developers Jeff Atwood and John Carmack made a $10k charity bet on the following proposition:

By January 1st, 2030, completely autonomous self-driving cars meeting SAE J3016 level 5 will be commercially available for passenger use in major cities.

Carmack bet that we would have level 5 cars by 2030. This means cars that are fully autonomous and able to drive everywhere in all conditions.

Atwood bet against, writing:

To be clear, I am betting against because I think everyone is underestimating how difficult fully autonomous driving really is. I am by no means against self driving vehicles in any way!

Four years later, just over halfway through the bet, it looks like Atwood has a good chance of winning.

The Blackout

Driverless taxis currently operate at SAE level 4. Their service is restricted to specific areas within certain cities, and they aren't fully autonomous. Whenever a Waymo robotaxi gets into a situation it can't figure out, the car requests "remote assistance", where a human call center agent advises the vehicle on how to proceed. There are roughly 40 cars for every human assistant, and half of these remote workers are located in the Philippines.

In December, a power outage in San Francisco caused Waymo's fleet of robotaxis to stop mid-traffic for hours, blocking roads and obstructing emergency vehicles. Waymo's post-mortem indicated that while their cars can navigate intersections with disabled traffic lights, they sometimes request confirmation before proceeding. During the outage, a backlog of confirmation requests left many cars frozen in place.

This problem could have been much worse if there had been more driverless taxis on the road—there are currently only about 800 to 1,000 Waymos in operation in San Francisco.

"Edge Cases"

Self-driving vehicles continue to fail in numerous other ways.

Last September in Phoenix, Waymos got stuck in flooded streets. Of course, some human drivers get stuck in floods too, but you can warn people about dangerous flooding, and most are able to avoid problem areas. Because of the nature of machine learning, you can't just tell the robotaxis to watch out for flooding. Waymo's solution was to temporarily suspend service for all cars in the Phoenix area.

In December, a Waymo drove into an active crime scene in LA. Several police cruisers had their lights flashing, police had weapons drawn, and officers were yelling at the Waymo to leave the area as it slowly drove within a few feet of a suspect who was lying on the ground.

In January, a Waymo drove down the light rail tracks in Phoenix while a light rail car was approaching. And in March, in two separate instances in Austin, a Waymo stopped within railroad crossing gates, just a few feet away from a passing train.

Last month in Austin, a Waymo got stuck after trying to make a u-turn and ended up blocking an ambulance that was responding to a mass shooting.

The cars have also gotten stuck driving in circles, taken unexplained detours into parking garages, and trapped passengers inside.

These types of situations are often referred to as "edge cases". An edge case is a set of extreme conditions that can cause unexpected failure in a system. That isn't a very fitting description, though, since obeying police officers and not driving on train tracks are fundamental driving requirements. So is understanding what a school bus is.

School Bus Failures

Waymo has been struggling for months to get its cars to stop illegally passing school buses that have their stop arms extended and lights flashing.

The National Highway Traffic Safety Administration launched a probe into Waymo last October after one of its cars passed a stopped school bus in Atlanta. Waymo issued a recall and stated that it had repaired affected vehicles in November. But the problem persisted, and as of early December there had been 20 similar incidents at the Austin Independent School District. The school district hosted a half-day event in mid-December to help Waymo collect data to fix the issue. But by mid-January at least four more school-bus-passing incidents had taken place, and another occurred in March.

There is also an open investigation for a separate incident in which a Waymo struck a child near an elementary school in January.

If a human driver racked up this many repeat offenses for failing to stop for school buses, their license would be suspended. This hasn't happened to Waymo yet, perhaps because people believe these cars will eventually be safer than humans.

Safety

Waymo claims to be "already making roads safer." Their data show a reduction in injury-causing crashes and fewer airbag deployments when compared to human drivers in similar driving areas.

But even though Waymo's cars have driven more than 100 million miles, that's not enough to draw statistically meaningful conclusions about fatal crash rates. According to a RAND Corporation study:

... fully autonomous vehicles would have to be driven hundreds of millions of miles and sometimes hundreds of billions of miles to demonstrate their safety in terms of fatalities and injuries.
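
To get a feel for why the numbers are so large, here is a rough back-of-the-envelope sketch (my own, not RAND's or Waymo's methodology) using the Poisson "rule of three". Assuming a human baseline of roughly 1.1 fatalities per 100 million miles (an approximate US average, used here only for illustration), an AV fleet would need on the order of 270 million fatality-free miles before the upper confidence bound on its fatality rate even drops below the human average:

```python
# Back-of-the-envelope sketch (not RAND's exact methodology): how many
# fatality-free miles are needed before we can say, with 95% confidence,
# that an AV fleet's fatality rate is below the human baseline?
import math

# Assumed human-driven baseline: ~1.1 fatalities per 100 million miles
# (approximate US average; treat as an illustrative number).
HUMAN_FATALITY_RATE = 1.1e-8  # fatalities per mile

def miles_needed_zero_events(baseline_rate: float, confidence: float = 0.95) -> float:
    """Miles with zero fatalities needed so the one-sided Poisson upper
    confidence bound on the fatality rate falls below the baseline."""
    # With zero events observed over N miles, the upper bound on the rate
    # is -ln(1 - confidence) / N. Solve for N at the baseline rate.
    return -math.log(1.0 - confidence) / baseline_rate

print(f"~{miles_needed_zero_events(HUMAN_FATALITY_RATE) / 1e6:.0f} million miles")
# Prints roughly "~272 million miles" -- and that is the best case, with zero
# fatalities observed. Demonstrating a statistically significant improvement
# over humans takes billions of miles, consistent with the RAND estimate above.
```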

So far, most of the miles driven by AVs have been at low speeds on urban roads in moderate climates. Waymo has only recently started offering freeway service. Occupant fatalities are more likely at higher speeds, so a new edge case on freeways could quickly negate their safety gains.

Current safety numbers ignore all of the recent close calls. The next time a Waymo goes beyond a railroad crossing gate, it could derail a train. The next time a Waymo drives past a stopped school bus, it could run over a child. The next time there's a major power outage, Waymos stuck in traffic could prevent a fire truck from getting to a fire. If an event like this occurs, it would be tragic but unsurprising.

AVs also introduce unique dangers. Remote assistance is a potential attack vector. Waymo's remote human assistants aren't able to directly control vehicles, which limits risk. But other companies like Tesla allow remote control. Will malicious actors ever be able to hack the remote assist functionality? Or the over-the-air software updates?

The Future

I'm skeptical that level 5 self-driving cars will be available by 2030. There are currently too many unhandled situations that require remote assistance. Waymo is under multiple federal investigations. And a particularly bad accident could cause a major setback. It's hard to picture how we could possibly get to fully autonomous vehicles in less than four years.

