Technological advances are reshaping nearly every aspect of our lives, and they arrive at breakneck speed. In some cases, the benefits are immediately apparent. In others, the prospects are less clear. Still, cultural forces being what they are in the United States – political, economic and otherwise – we often press ahead without knowing exactly what we might encounter around the bend.
One of the most notable current trends in this regard is the push toward autonomous cars and trucks. This spring, Georgia joined the states that have cleared the way for unmanned vehicles to hit the road sooner rather than later. What remains unanswered at this point is who can be held responsible when an autonomous vehicle injures someone in a crash.
Question: How to program morality?
The new Georgia law does attempt to address liability concerns. Operators of unmanned vehicles, whether cars or trucks, will have to register them with the state, and the minimum insurance requirements for them appear to be higher. However, some experts say developers of self-driving vehicles have another hurdle to clear: programming them to have a conscience.
Specifically, observers ask how these vehicles should be programmed to respond when confronted with a sudden life-or-death situation. For example, if a driverless car carrying passengers is heading down a street and a child chases a ball into the road, whose life should take precedence? The same question arises when a driverless semitrailer is traveling down the freeway surrounded by cars and a deer dashes across the road.
Many advocates of autonomous vehicles claim they will be safer, reducing motorist injuries and deaths. Some acknowledge, however, that for now self-driving vehicles are only potentially safer, and that significant issues remain to be resolved.