Sunday, January 24, 2016

Week 3 Reading Responses

Self-Driving Cars


In Mark Frauenfelder’s short article, Matt Windsor talks with bioethicist Ameen Barghi about a new variation on the trolley problem: should your self-driving car be programmed to kill you in order to save others? Although I ultimately agree with the article’s conclusion that it should not, this is an interesting dilemma. The example given, a choice between swerving into a concrete wall or into the path of an oncoming school bus, is particularly chilling.

My brother is a professional truck driver. When they first start driving, all drivers have to go through a series of training courses that involve classroom instruction and time on the road with an experienced driver. During one of these classes, the instructor posed this question: “You’re on a small one-lane country road when a minivan suddenly pulls out in front of you from a side road. If you swerve, the truck will roll and there is a 90% chance it will kill you. If you hit the van, you will kill everyone inside. Decide now what you will do when this happens.” Most people said they would swerve to avoid the van, accepting the risk to themselves in order to save the larger number of people. I think this shows that a large number of people in the US do lean towards utilitarianism.

The problem with having the car programmed to make the decision for you is that, at least at present, technology lacks the ability to make decisions in the same way that humans do. A self-driving car would decide what to do based on a universal, pre-programmed answer to the question, with no room for interpretation. This is highly problematic because we live in a world full of shades of gray, and a rigid, Kantian-style rule carried out by an emotionless computer is actually quite terrifying. There is also the problem of technological glitches, viruses, and the like.
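To make the worry concrete, here is a minimal sketch of what a hard-coded “universal answer” might look like. The function name, inputs, and rule are all hypothetical, invented only for illustration, and are not based on any real vehicle’s software; the point is simply that the rule is fixed in advance and treats every situation as the same arithmetic.

```python
# Hypothetical sketch only: a hard-coded "universal answer" rule,
# not any actual self-driving-car software.

def choose_maneuver(occupants_at_risk: int, bystanders_at_risk: int) -> str:
    """Always pick the option that sacrifices the smaller group."""
    # The rule is fixed at programming time and leaves no room
    # for context, judgment, or interpretation.
    if occupants_at_risk < bystanders_at_risk:
        return "swerve into the wall"   # sacrifice the car's occupants
    return "stay the course"            # sacrifice the bystanders

# A school bus, a family, or a single pedestrian are all reduced to counts,
# and the same numbers always produce the same answer.
print(choose_maneuver(occupants_at_risk=1, bystanders_at_risk=30))
```

A human driver might weigh the same situation differently depending on details no counter can capture, which is exactly the “shades of gray” the paragraph above is worried about.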

“9 Moral Dilemmas That Will Break Your Brain”


I also found this survey interesting in light of our discussion of the three stages of moral development. Looking at people’s answers, some questions showed a majority responding at what looks like the post-conventional level. For example, in the Robin Hood question, 79% of people would not turn in the thief because they want the money to go to a good cause, even though this goes against the societal norms of not stealing and obeying the law. However, there could be other factors at play here, such as the fact that this is an internet survey and people may be biasing their answers towards what they think they should say, rather than what they would actually do in real life.