Morality & Autonomous Vehicles
The industrial revolution has led to an increasing number of autonomous robots in our factories that perform various tedious tasks with precision and speed. This has helped tremendously in bringing down the cost of producing many of the things we use every day.
For instance, the cost of car manufacturing has gone down over the years because factories have been able to automate a lot of processes with robots. Painting a car used to be a painstaking, labor-intensive process, but now car factories have robots that do a fantastic job in a fraction of the time.
It was only a matter of time before we wanted to introduce automation to more complex tasks like driving a car. Until recently, the idea was confined to science fiction. Some car manufacturers, like Toyota, started venturing into this area with slightly simpler problems like automated parallel parking in the early 2000s.
Google, in its bid to create more detailed maps with street views, was the first to focus seriously on fully autonomous cars, and the other manufacturers followed suit.
I have always been excited about this progression and can’t wait for the day when self-driving cars are all around us. I think they would benefit us in many ways, including:
- Fewer car accidents
- Fewer traffic jams as autonomous cars start talking to each other
- Fewer cars on the road, since cars wouldn’t sit in parking spaces all day while their owners work and could act as taxis instead
- Reduced fuel consumption and emissions
I believe we will be able to solve the technical challenges in the years to come, but there are some non-technical issues we need to address before autonomous cars can become a widespread phenomenon.
Every time you hit the brakes in your car to avoid a pedestrian or a dog on the street, you are essentially making a moral choice. Thankfully, most situations aren’t morally complex, but some scenarios don’t have black-and-white answers.
For instance, consider a situation where your car is traveling at 100 km/h and all of a sudden a drunk pedestrian decides to cross the highway because it’s a shortcut to their house. Your car is boxed in by other cars, and the only way to save the pedestrian is to ram the car into the safety barrier of the highway, resulting in serious injuries to the passengers. What should the autonomous car do? Run over the pedestrian and save the passengers? Ram into the safety barrier, injure the passengers, and save the pedestrian? This situation has no black-and-white answer.
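To see why this is a programming problem and not just a philosophical one, here is a deliberately naive Python sketch (every name, number, and weight in it is hypothetical, made up purely for illustration): whichever way the car behaves, someone has to encode a moral weighting in the software, and even leaving the weights equal is itself a stance.

```python
from dataclasses import dataclass

@dataclass
class Outcome:
    description: str
    pedestrian_harm: float  # 0.0 (unharmed) to 1.0 (fatal); made-up scale
    passenger_harm: float   # 0.0 (unharmed) to 1.0 (fatal); made-up scale

def choose_maneuver(outcomes: list[Outcome],
                    pedestrian_weight: float = 1.0,
                    passenger_weight: float = 1.0) -> Outcome:
    """Pick the outcome with the lowest weighted total harm.

    The weights ARE the moral choice: raising pedestrian_weight favors
    the pedestrian, raising passenger_weight favors the passengers.
    There is no neutral default; even 1.0 vs 1.0 is a stance.
    """
    return min(outcomes,
               key=lambda o: pedestrian_weight * o.pedestrian_harm
                             + passenger_weight * o.passenger_harm)

# The highway scenario from above, with invented harm estimates:
options = [
    Outcome("brake, stay in lane", pedestrian_harm=0.9, passenger_harm=0.1),
    Outcome("swerve into barrier", pedestrian_harm=0.0, passenger_harm=0.6),
]
print(choose_maneuver(options).description)
```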
Another complication arises from a simple question: in the case of a malfunction that results in injuries, who is held responsible? The owner of the vehicle? The car manufacturer? The self-driving software vendor? The sensor manufacturers?
These are the conversations we have to have as a society before we can accept autonomous cars into our daily lives. These are moral issues that can’t be ignored and will directly impact the lives of our kids, so I think it’s a good idea to start having these tricky conversations with them.
The folks at MIT have put together a questionnaire that highlights some of the moral dilemmas that come with self-driving cars and can help with the conversation. Check it out.