Bill Ackman’s SPAC Gets Sued

Yesterday, the federal government’s top auto-safety agency announced the broadest investigation yet into Tesla’s assisted-driving technology. Neal Boudette, who covers the auto industry for The Times, explains why fully self-driving cars, which not that long ago seemed to be just around the corner, now appear further away.

In 2016, Ford promised that it would be producing a car with no pedals and no steering wheel by 2021. Waymo, the autonomous car company owned by Google’s parent, Alphabet, has been testing a driverless ride service in the Phoenix area since 2017. And just two years ago, Elon Musk said that a million Tesla robo-taxis would soon roam the streets of American cities. Tesla even sells an upgrade right now called Full Self-Driving.

But none of these projects has gone as expected. Ford has shifted its strategy. Waymo is still testing but remains years away from a large-scale commercial service. Tesla hasn’t produced a single autonomous car and has quietly acknowledged to California regulators that its Full Self-Driving upgrade isn’t capable of full self-driving.

So what happened? Developing a car that can drive with no help from a human is far, far harder than the auto industry’s top experts once thought.

It’s one thing to use cameras, radar and computer chips to make a car that can stay in its lane and keep a safe distance on a highway. But it’s a vastly greater challenge to teach a computerized system to safely deal with intersections, cross traffic and construction zones, as well as drivers, pedestrians and cyclists who only sometimes follow the rules of the road.

Autonomous systems still struggle to recognize unforeseen dangers, such as a car suddenly changing lanes or parked somewhere unexpected, and then to choose a safe response. Driver error causes the majority of the nearly 40,000 roadway deaths in the U.S. each year, but humans still cope better with surprises.

The risks of allowing cars to drive themselves can be seen in the fatal accidents that have come to light recently. In one 2019 crash, a Tesla with the company’s Autopilot system engaged came to an intersection and slammed into a parked car, killing a woman standing nearby. That accident occurred not because Autopilot was attempting some advanced feat of autonomous driving, but because it failed at the most basic function of any safety system: recognizing an object in its path and stopping before hitting it.
