High-speed crash due to red-light runner AND Cruise stopping inappropriately

Incident

On 8/18/2023 at 12:19am at 26th & Mission St, a Cruise robotaxi named “Shamrock” collided with a Dodge Charger driven by a human. Both vehicles were severely damaged. The Cruise had no passengers. One person in the Charger was injured but was not taken to a hospital.

Reported information

“Shamrock” detects approaching car and stops in its path

The video, picture, and initial report of the crash came from Eleni Balakrishnan, a Mission Local reporter, on Xitter. It was Eleni who obtained the only visual evidence we have of the crash by getting a copy of the surveillance footage. We appreciate our local reporters!

Also of note is that there was no passenger in the Cruise vehicle. It was simply putting on miles without providing service.

Why didn’t Cruise provide more info?

Another frustrating aspect of Cruise running dangerous experiments on our city streets is that they never provide clear evidence of what went wrong. Sure, it appears the other car went through a red light, but Cruise should let everyone know what really happened and show the video from their car. They never published a blog post, as they have done to explain other crashes, and they never tweeted about this incident. We expect more transparency.

Apparently, Cruise made specific claims via Xitter to Eleni, the reporter who broke the story. But Cruise never provided clear information to the public.

Report to the DMV

But Cruise did actually report this crash to the DMV, where it can be looked up. The pertinent information is shown below:

A Cruise autonomous vehicle (“AV”), operating in driverless autonomous mode, was at a complete stop in response to a red light on eastbound 26th Street at the intersection with Mission Street. As the AV was proceeding after the light turned green, a Dodge Charger traveling southbound on Mission Street failed to stop for the red light, entered the intersection, and made contact with the AV, damaging the front and rear driver side and rear bumper fascia of the AV. Police and Emergency Medical Services (EMS) were called to the scene. The AV was towed from the scene. The passenger of the Dodge Charger was assessed by medical personnel at the scene and was not transported by EMS.

https://www.dmv.ca.gov/portal/file/cruise_08182023-pdf/

It is interesting that when Cruise reported the incident to the DMV, they did not claim that the other vehicle was speeding, only that it ran the red light. It is disconcerting that Cruise says one thing on Xitter but another to the DMV.

Cruise also did not mention to the DMV that the Cruise vehicle detected the other vehicle but stopped directly in the path of the red-light-running vehicle instead of avoiding the collision by simply continuing to move forward.

The reporting of this incident clearly lacks transparency.

What went wrong

The Cruise vehicle clearly detected that the Charger might intersect with its path and took action. The security video shows that the Cruise reacted quickly and stopped suddenly, so the possibility of a collision was correctly identified. At that point the Cruise ADS should have determined the best way to avoid the collision, which would have been to simply continue moving forward. Instead, the vehicle wrongly went into a “panic mode,” stopping as quickly as possible, which actually maximized the chance of a dangerous collision.
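The geometry here is simple enough to sketch. The following toy model (illustrative numbers only; we do not know the actual speeds or distances in this incident, and this is not Cruise's code) compares the two policies: if the AV keeps moving, it can exit the conflict zone before the red-light runner arrives, but if it hard-stops in the zone, it can never clear it in time.

```python
# Toy model of the stop-vs-continue choice when a red-light runner
# approaches the conflict zone the AV currently occupies.
# All numbers below are hypothetical, chosen only to illustrate the point.

def time_to_clear(distance_to_clear_m: float, speed_mps: float) -> float:
    """Seconds for the AV to exit the conflict zone at constant speed."""
    if speed_mps <= 0:
        return float("inf")  # a stopped AV never clears the zone
    return distance_to_clear_m / speed_mps

def collision_possible(av_speed_mps: float, runner_eta_s: float,
                       distance_to_clear_m: float = 6.0) -> bool:
    """True if the AV is still in the conflict zone when the runner arrives."""
    return time_to_clear(distance_to_clear_m, av_speed_mps) > runner_eta_s

# Hypothetical scenario: runner ~40 m away at ~18 m/s, so ETA ~2.2 s;
# the AV needs ~6 m to clear the conflict zone.
runner_eta = 2.2

print(collision_possible(av_speed_mps=5.0, runner_eta_s=runner_eta))  # keep moving -> False
print(collision_possible(av_speed_mps=0.0, runner_eta_s=runner_eta))  # hard stop   -> True
```

Under these assumed numbers, continuing at 5 m/s clears the 6 m zone in 1.2 seconds, well before the runner's 2.2-second arrival, while the hard stop guarantees the AV is still sitting in the runner's path. The exact values don't matter; the point is that "stop instantly" is the one policy that freezes the vehicle inside the danger zone.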

Software update had recently been made to handle this exact situation

Disturbingly, Cruise had recently rolled out a software update to better handle red-light runners. But this touted change might have actually led to the crash by doing the most dangerous thing possible: stopping right in the path of the red-light runner. Instead of determining the safest course of action, which would have been to simply continue moving forward, the vehicle did what it was pre-programmed to do: stop suddenly.

  • Shipped STA-V v26 that further improves the AV’s response to more challenging scenarios where other drivers blow through intersections illegally at high speeds.
07.02 | Software Release as announced by Cruise on 8/2/2023

Having a serious crash immediately after a software update that was specifically intended to address this type of situation indicates that Cruise has serious software quality-control problems. The software change did not achieve its primary goal of reducing dangerous collisions. Notably, Cruise did not mention the recent, failed software update in its report to the DMV.

Additional report of similar problem

On 10/19/23, eminent robotics and AI expert Rodney Brooks described firsthand how Cruise vehicles can create a dangerous situation by stopping in an intersection.

…last Thursday night I had a moment where I experienced real fear, where for half a second I thought I might be involved in an extremely bad accident.

It was at night and we were crossing Divisadero, heading west, on Filbert. Left is a steep uphill few blocks on Divisadero. There was a car coming down the hill quite fast, as we crossed Divisadero. My Cruise, with nothing at all in front of it, braked hard, really hard, right in the middle of the intersection, harder than I had ever experienced a Cruise taxi braking. That brought us (me and my taxi) to almost a complete stop right in the path of the oncoming vehicle. Fortunately the other vehicle started to slow down and then the Cruise moved on out of its way.

This, above, is my recollection of what happened. When it braked hard a real pang of fear shot through my body. When I saw the car heading right at us a conscious version of that fear kicked in.

A human driver in that situation would most likely continue to drive and not brake at all. Braking was the best possible way to cause a collision. Not a good choice.

In previous accidents that have resulted in collisions Cruise vehicles have been at a stop. My interpretation, and I have no knowledge of whether this is true or not, was that rather than take the risk of hitting another vehicle while moving, the algorithms were set to freeze when there was an imminent collision, as better than running into someone else. A weird hard-wired trolley problem solution which does not protect the Cruise vehicle, but unfortunately for a rider does not protect them either. And in many cases increases the likelihood of a collision rather than reduces it.

Rodney Brooks – Autonomous Vehicles 2023, Part II

But wasn’t the human driver fully at fault?

Though Cruise never provided the easily obtainable video evidence from its vehicle, it appears that the human driver was possibly speeding, most certainly ran a red light, and was legally at fault. But safety is not about determining legal fault; it is about harm reduction. We need collisions to be avoided, no matter who is at fault, and Cruise absolutely failed at that. Our streets are complicated. Bad things happen. Defensive driving is key. If the Cruise car had had a human driver, that driver would not have stopped in the intersection and caused the collision. Instead, a human driver would have continued forward, easily avoiding the collision.

The Cruise automated driving system had multiple options. Only one of them, stopping directly in the path of the other vehicle, was dangerous, yet that was the option it pursued. It is as if Cruise created its own interpretation of the Trolley Problem: instead of making the least problematic, least dangerous choice, it chose the most problematic one. The ultimate anti-pattern.

And the argument that robotaxis will make all of these traffic-safety problems go away is simply not true. The human driver of the other car had the opportunity to take a robotaxi: both Cruise and Waymo vehicles were readily available, and note that “Shamrock,” the Cruise car, was empty at the time. Uber, Lyft, and taxis were also available. Plus, BART runs directly under where the collision occurred, and numerous Muni buses run along Mission Street. Reckless human drivers typically have multiple options, but they choose to be reckless. The availability of robotaxis did not change the outcome, and it is the outcome that matters most.

The Untruths by Cruise

Cruise placed the entire fault for the collision on the human-driven vehicle. But the collision occurred because the Cruise vehicle detected the other vehicle and did exactly the wrong thing: stopping directly in front of it. Cruise neglected to report to the DMV that the Cruise vehicle's own driving was an integral cause of the crash, an egregious lack of transparency. This matters because it means the regulator, the DMV, could not make sure that this dangerous behavior was corrected.

Conclusion

Their “deer in the headlights” algorithm of stopping in front of danger hasn’t worked out well for deer, and, as yet another Cruise crash shows, it doesn’t work for robotaxis either.

And while it appears that the human driver was both speeding and running the red light, and was clearly legally at fault, the crash would not have occurred if Cruise vehicles had reasonable defensive-driving capability.

The root problem was not reported to the regulators and was therefore never fixed.

“Deer in headlights” is a bad algorithm!
