This collection left me mad as heck with readings 1, 3, 4, 5 and 6. But no worries: it was pretty much all wiped off the table with the second article. I'll cover the depressing stuff first.
Starting with The Atlantic's Facebook/numbers article: I've always known in the back of my mind that Facebook was using numbers and buttons to influence my behavior. I'm not too proud--or smart enough--to pretend that the likes don't matter to me. I enjoy the validation. I want to see who cares about my posts. And now with the emoticon "reaction" buttons, well, bring it on, Facebook. But it's still shocking to read the article, to see all the facts about Facebook's outright intrusion into my life right before my eyes. We're being played. I feel like the numbers and likes are just one great big psychological experiment. And one day somebody (Zuckerberg, maybe?) is going to say, "Just kidding!"
The WSJ article about social bias in data was upsetting. I hadn't heard about the photo-tagging disaster on Flickr. I'm back to the argument I've made all along: why are we trusting technology and software with tasks they really don't need to be managing? What in the world is the importance of the tags, really? It's one thing for you, the user, to want to add them for your own way-finding and memory-keeping purposes. It's a whole other ball game to have software do it and make this kind of appalling error. It's embarrassing and altogether unnecessary. But here's the thing that's so hard to swallow: maybe it's our own bias that's really to blame?!? There goes my argument for letting humans manage their own tagging. It's an ugly circle.
It was the same thought I had when reading about test prep companies charging based on geography, resulting in race-based price gouging. The people are fallible here, clearly. And it appears to be because they trust the software implicitly.
The "Hypocrisy of the Internet Journalist" was one of the most dystopian pieces of...collegiate reading...ever. I felt defeated and demoralized and scandalized. But what am I going to do about it? What's the author's goal here? "You're screwed, but I don't have any answers for you (evil laugh). Have a good life though!"
In "Metrics have a powerful influence on journalists' morale," I thought: wow, conflict of interest. Am I now meant to worry about whether journalists get enough traffic on their stories to feel good about themselves? When and where will we stop feeding the need to validate our existence at every turn, with praise and encouraging words?
Reading "The Promise of Big Data," I felt hopeful all over again. On the one hand, obviously, I'm worrying about All The Data and how it's being used to predict my next trip to the bathroom. Next thing I know, I'm thinking, oh, this is great! All The Data might save us all. It's incredibly confusing. Good vs. evil, right vs. wrong, I tell you.