“Can moral disputes be resolved?”
I like how this article built on last week's readings by discussing the role of evolutionary biology and the significance of meta-ethics. It's strange to think about Darwinism as it relates to morals (because "survival of the fittest" feels like an irrational concept, a phenomenon not based in thought at all), and I actually wouldn't mind looking into it more.
Because morals and ethics are such difficult topics, on which we seem unable to reach any widely accepted consensus, I find it exhausting to think about them as much as we already have in this class (even though it's only been two weeks!). But one of the final ideas introduced in this article made it more worthwhile: the idea that if we can't find a way to ground our moral objections objectively, then our stances on issues may amount to nothing more than cultural prejudice.
“Should your self-driving car be programmed to kill if it means saving more strangers?” & “Why self-driving cars must be programmed to kill”
I felt a little indifferent as I read these articles because I can't imagine ever wanting or needing a self-driving car. I like the idea of cars that can park themselves (because I'm bad at parking), but I like to feel in control of my vehicle. I don't like to think of a world in which people accept self-driving cars because they're too lazy to drive themselves. I understand how they might be good for people with disabilities, but that's about it. I'd still rather see widespread, affordable public transportation become available in America, as it is in other countries.
“The robot’s dilemma: working out how to build ethical robots is one of the thorniest challenges in artificial intelligence”
It’s amazing that science fiction writers have been able to almost predict what would happen in the future with robotic technology. One thing I thought about as I read this article is how cool it would be if a robot could be created that could discover an infallible algorithm for determining the best choices in ethical or moral dilemmas. Artificial intelligence is a scary endeavor, and I'm glad most people agree we have to approach it with caution.
“9 Moral dilemmas that will break your heart”
As I went through this quiz, I recorded my answers and brief explanations where I felt it necessary to explain my rationale. I've included my responses below.
Best Friend’s Wedding: Tell the friend the fiancé is cheating (not a hard decision for me)
Robbin’ Hood: Say nothing (although I can see where allowing crimes to happen could be a slippery slope)
Company Policy: Tell the friend, but don’t share the private email
A Sinking Sensation: Stay (though I can think of other alternatives between the two choices)
The Accidental Samaritan: Confess (I can’t imagine I’d be able to get away with it)
A Day at the Beach: Save my niece first (though I can’t imagine not being able to swim with both)
The Spouse and the Lover: Save the lover (their life can be saved, and more blame would fall on the spouse)
A Difficult Decision: Refuse to pull the chair (life in a concentration camp is probably as bad as death, and I don’t feel I would really be at fault for what psychopaths choose to do)
A Doctor’s Dilemma: Save the savable patient (I’d refuse to "play God" like that)
“Men and women use different scales to weigh moral dilemmas”
I don’t agree with this study. I can’t speak for men, but speaking as a woman, I do not identify with the results. I don't know what to make of this: whether I'm an outlier, or whether the participants of the study were less educated or from a different culture.
I personally don’t think I would kill Adolf Hitler, not because of an emotional, gut-level reaction (as the article suggests) but because the world wouldn’t be the same in 2016 if there hadn’t been a WWII. What if it had been worse? Fear of the unknown, I guess, is my rationale. If I were living in the 1920s and was asked this question, I probably would kill him (provided I wouldn't get caught; people would probably think I was a lunatic if nothing had happened yet).
I also don't agree with how the article says I would respond to the child pornography scenario: I would not be responsible for allowing any child to be exploited in order to save others. I'd rather suffer nobly.
This article actually reminded me of an article I read on CNN this weekend about Syria's lost generation of girls being forced to marry at young ages to survive (http://edition.cnn.com/2016/01/19/world/cnnphotos-syrian-refugees-child-marriages/index.html). It's absolutely disgusting and intolerable, in my opinion. The article's author expresses no outrage, and I doubt the men involved will be punished in any way by the international community for being perverts. It's so frustrating. (I realize I've demonstrated a strong emotional response here.)
“Mining, money, and environmental morals”
This article made sense to me. It's a lot easier for people with less at stake in a matter to feel the need to follow their moral convictions. I'm sure that if my livelihood and the livelihoods of everyone around me truly depended on mining, I wouldn't feel too bad if it was hurting the environment. I bet that's how some poachers in Africa think too. It's a sad situation in which people with little choice in the matter have to make difficult decisions to survive.