Business and society

Many people believe that autonomous cars will save a great many lives—tens of thousands every year in the U.S. alone. Does that prospect reduce the need to think about rare ethical dilemmas in which a few innocent people may be injured or killed? In other words, does the greater good, or overall utility, excuse any bad outcome? Can you think of scenarios—involving robot cars or anything else—in which the greater good does not justify a harmful action, such as a few wrongful deaths?