
The Moral and Legal Ethics of Self-Driving Cars


News about self-driving cars is seemingly everywhere. As the technology evolves, automakers are not the only ones pursuing autonomy; ride-sharing services like Uber and Lyft are also exploring how these cars can make life easier (or more profitable).

But for every report of a technological breakthrough in self-driving cars, there is news of an accident involving these vehicles. The technology is clearly not ready for mass use just yet, and these cars raise not only technological issues but moral and legal questions as well.

Should Autonomous Cars Cause Car Accidents?

One such question is whether we want a self-driving car to purposely cause an accident, or to purposely hit a pedestrian. This may sound like an odd question, but just as humans must make split-second moral and ethical decisions in emergencies behind the wheel, so must the computers that power autonomous cars.

Take, for example, a self-driving car traveling down the road when children at play jump into its path. There is not enough time to safely apply the brakes. Just as a person would, the computer must make an instant decision: hit the children, or veer off the road and potentially injure the car's occupants or people on the adjacent sidewalk (or worse, at an outdoor café or other crowded commercial area).

The Right Decisions Are Not Always Clear

The answer may sound simple—the car should do whatever a person would do if the car were a normal one. Most people would say that the car should make the decision that protects or saves the greatest number of people. However, when people are asked whether they would actually buy a self-driving car programmed to purposely crash, and possibly injure its occupants, in the event of an emergency, most say no.
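The "save the greatest number" rule is simple enough to state, even in code. A deliberately toy sketch (every name and number below is invented for illustration; real autonomous-vehicle planners are far more complex) shows why buyers balk at it: under this rule, the car may choose the maneuver that sacrifices its own occupant.

```python
# Hypothetical sketch of a "minimize total harm" rule.
# All maneuver names and casualty estimates are invented for illustration;
# this is not how any real self-driving system is programmed.

def choose_maneuver(options):
    """Pick the maneuver with the fewest estimated people harmed.

    `options` maps a maneuver name to an estimated casualty count.
    """
    return min(options, key=options.get)

# The dilemma from the article: braking is too late, so every option harms someone.
scenario = {
    "continue_straight": 3,   # children in the road
    "veer_to_sidewalk": 2,    # bystanders on the sidewalk
    "veer_off_road": 1,       # the car's own occupant
}

print(choose_maneuver(scenario))  # prints "veer_off_road"
```

The rule picks "veer_off_road" because it harms the fewest people—which is exactly the outcome prospective buyers say they will not accept when they are the occupant.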

A car also cannot make judgments about avoidability. Legally, if a pedestrian could have avoided an accident but did not, the pedestrian is at least partially at fault for being hit. An autonomous vehicle, however, cannot judge whether a pedestrian could or should have avoided a collision; it may simply determine that its occupants will be safer, and the pedestrian spared, if it veers off the road.

If you are in a car accident, get legal help as soon as possible. The Tampa injury attorneys at The Pawlowski//Mastrilli Law Group can answer your questions and help you obtain damages for injuries sustained in a car accident.

Resources:

popularmechanics.com/cars/a21492/the-self-driving-dilemma/

npr.org/2018/10/26/660775910/should-self-driving-cars-have-ethics
