Pilots have to understand their autonomous planes. We should understand our autonomous cars.
Recently, I asked my colleagues if they had ever been startled by a robotic driving feature. One described the unpleasant sensation of automatic braking in an Infiniti Q50 Hybrid, which would suddenly slow her down in front of a tight turn, bringing her close to getting rear-ended by the New Jersey drivers behind her. Another had been jolted by the aggressive beeping of his Hyundai rental as it warned him not to stray into a nearby lane. The father of a third routinely started his Toyota Camry with the key fob inside the house, drove to work, and found he couldn’t start the car when it was time to drive home. Some 40 percent of American drivers have had a similar experience with what the industry calls “advanced driver assistance systems,” or ADAS, according to a survey by Daniel McGehee, director of the National Advanced Driving Simulator at the University of Iowa.
Airplane pilots, who have been working closely with self-driving software for nearly four decades, have a phrase for such incidents: autonomous surprises.
The crash of Air France Flight 447 began when the autopilot disengaged after the plane’s airspeed sensors iced over. The final accident report concluded that the crash resulted from the pilots’ lack of practical training in flying at high altitude and in responding to unreliable airspeed indications, compounded by the startle effect.