On Tuesday, the California Department of Motor Vehicles abruptly rescinded autonomous vehicle company Cruise’s license to operate driverless cars in San Francisco, “effective immediately.” This, to borrow Joe Biden’s inelegant phrasing, was a big f*****g deal, but it only grew bigger when the DMV revealed its rationale.
First, the DMV found that on Oct. 2, after a Cruise vehicle ran over and pinned a San Francisco pedestrian who’d been struck by a hit-and-run human driver, the Cruise vehicle went on to “perform a pullover maneuver” — and blithely dragged the woman it was unable to detect some 20 feet, at 7 mph. The DMV further claims that Cruise was not forthcoming about its car dragging a person — a person, again, that its vehicle was unable to detect — and failed to mention this or initially turn over the relevant video. The DMV alleges it only subsequently learned of this chilling detail “via discussion with another government agency.”
Cruise denies that it withheld information from investigators, claiming that DMV personnel were shown “the complete video multiple times” on Oct. 3 — and that the video it sent the state on Oct. 13 was material investigators had already viewed.
Regardless of how this “Rashomon” situation works out — and it should not stay a he-said, she-said for long — the situation is extremely serious, even dire, for Cruise.
“Cruise has a real problem here,” said Phil Koopman, a Carnegie Mellon University professor who’s been working on self-driving car safety issues for more than 25 years. Cruise’s vehicle “did not know it was on top of a pedestrian. That is a really big public safety issue. The vehicle did not know it was dragging a pedestrian down the street.”
But this was known in certain San Francisco circles, from the day of the incident. Knowledgeable sources tell Mission Local that a gruesome trail of blood was clearly visible on the pavement between where the woman was initially pinned by the Cruise vehicle and where the vehicle ended up.
Cruise, again, denies withholding information. But what facts we know do not look good. As far back as Oct. 6, Forbes ran a piece quoting Board President Aaron Peskin, who stated what first-responders had told him: that the woman who was struck on Oct. 2 had been dragged 20 feet by the Cruise vehicle, and that this was the major source of her grievous injuries (the unidentified woman, who is purportedly in her 30s, remains in serious condition at San Francisco General Hospital).
The company refused to comment on this allegation for the Oct. 6 Forbes article. But the DMV revoked Cruise’s license, in large part, because its vehicle dragged the pedestrian. And, on Tuesday, Cruise admitted to it: “The AV detected a collision, bringing the vehicle to a stop,” Cruise wrote on its blog. The driverless car “then attempted to pull over to avoid causing further road safety issues, pulling the individual forward approximately 20 feet.”
And, while Cruise flatly denied that it misled investigators, the Chronicle wrote Tuesday that video, shown to its reporters by Cruise immediately after the crash, cut off after the self-driving car struck the pedestrian. Her subsequent dragging was not depicted, and no mention of this was made to the pair of reporters — or to other media outlets.
“They have known this since moments after it happened,” Peskin tells Mission Local. “They are really bad actors, and I wonder what criminal liability they have.”
The DMV and National Highway Traffic Safety Administration, which are conducting ongoing investigations, are not in the business of prosecuting criminal cases (though San Francisco’s DA is). It will fall upon the state and federal agencies to piece together what happened in the moments before and after the woman was struck and pinned by the Cruise vehicle.
But, autonomous vehicle experts tell us, there is no ostensibly good outcome for Cruise here. Only varying degrees of bad.
Since the summer, Cruise has been losing about $263 million every month. In the face of this massive cash burn, the company has aggressively pushed to expand, expand and expand — and suddenly being mandated to take a big step backward and put a human behind every car’s wheel was certainly neither predicted nor welcomed by an outfit that was already deep in the red.
“I think Cruise is in for a really rough year,” summed up Missy Cummings, an engineering professor at George Mason University and the director of its Mason Autonomy and Robotics Center.
One of the most salient details the forthcoming investigations may reveal, Cummings says, is just whose decision it was to attempt to pull over after the collision: the autonomous vehicle’s, or that of a human at a remote operations center. This is important to know. But, Cummings says, there is no “good” answer for the company.
This situation “is really going to highlight the problem with remote operations,” she says. “Why was a human not notified? And, if one was, why did the car perform in this way?”
For autonomous vehicles to operate safely, “there have to be humans in the loop somewhere,” Cummings continues. “There will always be problems, sometimes big, sometimes little, but humans have to weigh in. We need to start embracing this: I’m after the state of California and other states to start licensing these remote operators and drug-testing them. Maybe the car did try to call a human, and someone was away at the coffee machine; we won’t know until they do the investigation. Without better support infrastructure, you’re going to see this problem again and again.”
Cruise, again, is burning through money at a pace that would humble “Brewster’s Millions” — and better support infrastructure is, evidently, not where it wishes to spend its dollars.
The DMV, in a terse statement, indicated Tuesday that it has given Cruise a checklist of what needs to be done in order for the company to get out of the penalty box. Neither the DMV nor Cruise was forthcoming in describing what that checklist calls for.
But, if it’s a lengthy and onerous process, Cruise’s parent company, General Motors, may have some difficult math to do.
“Certainly, there are scenarios in which this is a real problem for Cruise, and GM has to make real decisions on how long to support them,” Koopman says. “They’ve been very aggressive. They’ve been prioritizing scaling up over cautious deployment. That caught up with them. We’ll have to see what the next step is.”
“If Cruise has a problem, it does not mean Waymo does,” Koopman says. “People see ‘robotaxi’ and they think they’re all the same. But it’s not so. You don’t get safety by having a lot of sensors; you get safety by having good engineering.”
Cruise, both professors said, could come out ahead in the long run by using this very public rebuke to undertake the kind of introspection that the equally public documentation of its vehicles interfering with emergency scenes failed to inspire.
“The best thing that could happen is, this wakes Cruise up,” sums up Koopman. “They could wake up and get their safety house in order without actually killing someone. But I read the company’s blog post, and it does not seem like they’re taking this seriously as a safety issue. They blame the other driver who hit the pedestrian, but don’t address the key safety issue: How come your car didn’t know it was on top of a person?”
Cummings says that this incident “will definitely be a Harvard Business School case study at some point. Coming out of the chute and blaming everyone else and pointing fingers at everyone else is not the way to do it. It turns out the bulk of the woman’s injuries were caused by the AV. When will Cruise stop pointing the finger at everyone else?”
This is a question to which San Franciscans eagerly await an answer.