Or is it just a good way to market their product?
A lot of big promises have been made about self-driving cars. If you listen to their boosters, the technology is poised to eliminate traffic, end car ownership, get rid of parking garages, address the carbon footprint of transportation, give us more time to work or use our phones — and those are just the claims that come immediately to mind.
But the biggest promise of all has to be that the technology will save the roughly 40,000 lives lost every year in vehicle crashes across the United States. The claim appears constantly in the marketing of companies working on autonomous vehicles, but can the assurances of executives at Uber, Waymo, Tesla, GM, Ford, and others that their technologies will achieve this goal really be believed?
Putting aside the fact that self-driving cars are much further off than these companies led us to believe six to ten months ago — they’re all slowly admitting that the technology will take a decade, or several, to perfect — their actions raise the question of whether their commitment to saving lives is genuine, or simply a good way to promote the product they’re trying to sell to the masses (and make exorbitant profits along the way).
Waymo’s Risky Early Days
How might the development of self-driving cars proceed if safety and the protection of human life were truly the top priority?
Waymo, the Google sister company, has been held up as the leader in the space. It’s operating a very limited public service in Arizona — one that only launched after the company put safety drivers back in every vehicle, and after the CEO admitted that a self-driving vehicle capable of driving in any condition would never exist. The test vehicles have recorded a number of crashes, but only recently did details emerge from the early days of the program that seriously call into question the company’s commitment to safety.
Anthony Levandowski, who led Google’s early efforts with self-driving cars and later moved to Uber, told the New Yorker that “[i]f it is your job to advance technology, safety cannot be your №1 concern” — a value he demonstrated at Google, under the protection of co-founder Larry Page. While heading the self-driving team, Levandowski would alter the vehicles’ software so he could take them on forbidden routes; in one incident, the Google vehicle boxed in a Toyota Camry, eventually sending the Camry off the road and into a median, and causing Levandowski’s passenger “to injure his spine so severely that he eventually required multiple surgeries.” Levandowski and his passenger left the scene without ever checking whether the Camry’s driver was okay — and that’s just one of what may have been many similar incidents.