"Can Moral Disputes Be Resolved?" Response
The beginning of the article by Alex Rosenberg tainted the rest of it for me because the author came off as naïve. In the second paragraph he states that Republicans and Democrats may differ on policy but share goals such as a healthy American population. I disagree. While some politicians probably govern out of a desire to help society, especially at the local level, I believe a large portion, maybe even a majority, do it for more selfish reasons: power, recognition, vindication, and so on. So much of politics can appear to be about what's in it for politicians and the people they know rather than what is best for the general American population.
By the time I got to the end of the article, I thought, "So what?" For the people who believe that honor killing is justified, I don't think you can find "objective ground," so whether the objection is an "emotional response" or a "cultural prejudice" doesn't matter. There needs to be some other approach to preventing or decreasing honor killings, if that is the goal.
Self-Driving Cars Programmed to Kill Response
It is interesting that economic researchers led the study of the general public's reaction to a self-driving car programmed to sacrifice its owner. I had to wonder whether the study actually gauged the perceptions of the "general public." The article said the researchers recruited workers through Amazon's Mechanical Turk. I wish they had asked a broader slice of the population. Asking people through a work platform might skew the answers: a person outside a workplace setting might be more willing to admit they would not go along with a scenario that meant sacrificing themselves to save others.
Men and Women Different Scales Response
The dilemma presented in this story was one of the few in this week's reading where I had a definite answer: I would not kill the young Adolf Hitler who hasn't "done anything wrong yet." I would instead use that time to find another way to prevent his actions.
Self-Driving Car Articles Connection to Outside Material
A self-driving car dilemma based on real life: Would You Trust a Self-Driving Car in a Snowstorm? "Snowstorm" could easily be replaced with high humidity, intense rain, a tornado, or a dust storm, depending on what part of the country or world you live in. I know the articles for this week are about the moral dilemmas of how these cars should be programmed, or whether they should learn artificially, but it seems we still have some practical, mechanical things to sort out, even as self-driving cars are on their way and we are not ready for them, as suggested by this article in The New York Times.
Self-Driving Car Programmed to Kill You to Save Others Article Connection to Men and Women Different Scales Article
When I read the self-driving car article, I thought I fit into the utilitarian group rather than the deontological group. When I read the article on men and women using different scales in moral dilemmas, I realized that, at least in this scenario, I was a deontologist. I was not motivated by the consequences of saving the greatest number of people, but by the act of killing being wrong, even if the person killed would be Adolf Hitler.