When Autonomous Cars Kill Somebody

As drones, bipedal robots, and algorithm technologies continue to improve, the world of autonomous everything is looming. Perhaps looming isn’t the right word, but I feel compelled to set an ominous tone in order to provide an interesting conclusion. Beyond the iPad, synchronized quad-copters, and even 3D printers, one of the world’s most powerful forms of emerging technology is the ability to make more machines and devices autonomous.

New technology, more often than not, is met with some trepidation, regardless of its usefulness or its ability to disrupt current methods (a broad brush stroke, admittedly, but accurate for most new tech at its root). See my article about an emerging wind turbine technology. Making machines autonomous has been a gradual process, and each new plateau seems to run into scrutiny, sometimes unwarranted.

For instance, when automated manufacturing assembly lines first emerged as a viable replacement for some low-skill manual labor, there was backlash. Understandably so: machines were threatening to take jobs, and many didn't trust their soulless replacements to do qualified work. That has changed to some extent, as automation has taken over much of the assembly work. Jobs that paid $20 an hour to put a blue bolt in the blue hole have vanished.

One of the more recent autonomous concepts being pushed to the masses is the advent of cars that drive themselves. Even though I am immersed in the world of innovation and engineering, it's still somewhat mind-boggling to imagine cars that can operate driver-free. But it's already a reality, and the day we no longer handle the wheel could be closer than we think.

A recent video from Reuters featured a car that the researchers at Carnegie Mellon said lays the groundwork for computers to replace humans as drivers. One reader’s comments ignited my concerns about autonomous vehicles.

A reader, by the screen name of Critic, said, “It's not so much a matter of developing the technology, and it's not so much a matter of getting drivers used to driver-less cars, and it's not so much a matter of developing infrastructure. The real issue is liability! Who will be responsible when there is a collision? The manufacturer? Drivers will blame their cars when collisions happen.” 

This instantly brings to mind the Google Driverless Car, which is responsible for many of the pictures taken for Google’s Street View feature. Where legal, this vehicle is completely autonomous, though there is a driver in place for supervision. Ironically, the only reported crash by the Google Car occurred when the human took over.

The human took over, making it easy to place blame and therefore liability. But what happens if and when the car's sensors fail and it rear-ends somebody at a red light? Who's responsible? The driver, who was pleasantly reading the paper because he or she assumed the car was safe? I would hope not, as that defeats the purpose of having an autonomous car (in my opinion). If I have a car that can drive itself, I shouldn't have to babysit it, though I may be asking too much at that point. Perhaps insurance rates would rise slightly for those with self-driving cars. But as of right now, autonomous cars appear to be safer than human-driven ones, so higher rates would only slow progress and give an unfair advantage to those who dig in their heels at change.

This question of liability may very well be the only thing holding back swift adoption of autonomous cars. Beyond property damage on the road, the concept easily extends to other autonomous vehicles, like drones. A big debate looms over the government's use of drones in both foreign and domestic territories (as there should be), but what if a drone spies on, injures, or even kills an innocent person? What if an autonomous car crash injures or kills somebody? Who is the responsible party?

These questions may be exactly what is keeping our cars from driving themselves, and keeping me from getting some serious reading done during my commute. An intriguing conundrum, to say the least. 

Who do you think is responsible when an autonomous vehicle or device crashes? Comment below or email [email protected].