Hhmmm... I hope I labeled every post...
"How numbers on Facebook change behavior" reminded me of the Bergen Facebook Addiction Scale I discovered in 2012. Here's a link to a 2015 article on the same topic:
http://www.medicalnewstoday.com/articles/245251.php
Personally, I have a loathe-hate relationship with Facebook. I understand FB's value as a tool, but I also view it as an unnecessary distraction. Typically, I don't log in until my Yahoo! inbox fills up with FB notifications, and then I'm on just long enough to click the alerts off and log out. Given my pattern of use, I was shocked that people took metrics to heart and disgusted (different article) that metrics would be posted in plain view. I suppose nothing should shock me anymore, but self-worth based on a number of likes? Really?
I don't have much to say on this one, but it seems to fit within patterns of any other addiction.
"The Promise of Big Data" presents a nice change of pace from the usual evils of big data collection. Wow. As a medical professional, and one who's been in tuberculosis-prone regions, I enjoyed this reading. Approaching problems from new angles often yields unexpected results, as described in the TB and genetic analyses. Bravo to the scientists willing to break convention, but they're a minority. I hope the next generation of scientists really does embrace this method of data crunching, as the article suggests. Unfortunately, current research practices will require major overhauls for that to happen. Crowdsourcing analysis isn't a new idea. In the early days of public internet access (I think I was in high school), groups employed this approach from day one. I remember installing SETI's (Search for Extraterrestrial Intelligence) screen saver on my family's computer. Computer techs at the community college installed the program in all the campus labs. The screen saver was actually a program that analyzed small pieces of radio telescope data, searching for ET.
SETI's screen saver has advanced a bit in the past 20 years, but it hasn't found any aliens yet. It's a big galaxy, though, and maybe it's just a matter of time. My LG phone has many times the computing power of Apollo 11's systems, after all. The point is, while Big Data helped advance some medical studies, we shouldn't get too excited just yet. There are still the problems of programming algorithms, enlisting participants, and choosing which data sets to tackle. Despite those issues, there's reason for optimism, if not excitement.
"Pitfalls of Big Data" was disturbing in its outline of discrimination. But was anyone really surprised? We're all targets for money-grubbing companies, and have been for so long that it's really a cliché. I'm sure the Princeton Review will win any lawsuits brought against it, or settle out of court. Of course automated pricing schemes are biased. Humans inherently harbor bias, even unconsciously, and that bias seeps into their projects. Who writes the programs? People. Who's biased? People. Who's under pressure to produce programs quickly? People.
I'm sure that if someone had been able to slow down the process, the price algorithm would have come out a tad less discriminatory. But combining inherent bias with a speed-driven product only promotes these types of events.
Full disclosure about the WSJ article concerning social bias on the internet: I laughed. A lot. I agree that tagging African Americans as primates is NOT funny. It's the ridiculous results of poor programming that amuse me. Granted, my sense of humor leans dark.
Remember "Tay"? Microsoft built a chatbot for Twitter, then pulled it offline a day later because of how users twisted the program.
http://www.theverge.com/2016/3/24/11297050/tay-microsoft-chatbot-racist
Automated social bias on the internet is indicative of a much larger problem. When the rush to release a new digital application combines with inherent human bias and poor design input, offensive things happen. There's a systemic cultural ignorance at the heart of the bias, and we need to overcome it. In this case it only caused emotional turmoil, not economic loss like the Princeton Review's SAT prep pricing or some other disaster. The lowest price was still over $6,000, and that was in Dallas? I used an SAT vocabulary prep book given out free of charge; the print quality makes me think it cost only a few cents and thousands of trees to produce. If Dallas paid that much, I hate to think about the financial burden on Asian communities.

There's still a silver lining. As socially unacceptable events like this go public, designers learn more about what criteria to examine. As that learning accumulates, social bias will hopefully decrease. The challenges presented by PR nightmares raise awareness and the determination to improve society.
On a personal note, I don't have an abundance of sympathy for veterans disqualified from jobs because software didn't understand their skills. I am empathetic, because I share in this issue, but the software isn't the problem. Think of military-ese as a foreign language, and imagine the job prospects of an applicant submitting resumes in languages other than English in the U.S. It is a problem, but I wouldn't apply for a job in Mexico without a resume written in Spanish. The Army has been trying to address this problem, at least since 2011 (when I transferred from active duty to the reserve). One of the out-processing steps requires soldiers to attend resume and job-seeking classes, where instructors show how to translate military job titles into civilian language. There are also a lot of internet resources to help, like this one:
http://www.military.com/veteran-jobs/skills-translator/
By the way, federal employment resumes are an entirely different story. There's a unique language required when applying for federal positions; I was given a book on the subject. Basically, the feds use computers to scan resumes before hiring managers even see potential employees.
I guess what I'm saying is that there's an element of personal responsibility in the equation too. Some things, like offensive photo tagging, are out of the individual's control. But tailoring a resume to a specific job is most definitely within the scope of personal action.
"The Hypocrisy of the Internet Journalist" creeped me out. Less the article itself than the fact that, while I was reading it, my home networked printer suddenly turned on and performed some mysterious task. Nothing printed out (I would have included it in this post), but that's almost worse. Why would a powered-down printer suddenly come to life with no explanation?
As a Ghostery user, I'm aware of the data-mining attacks that result from simply opening a webpage. I applaud Norton for sticking to her principles as best she could, and I understand why principles sometimes take a back burner to cash desperation. Hell, I've quit jobs over similar issues, or never applied for them in the first place. I've had others where the desire for food money outweighed moral outrage.
I'm sure any journalist who took the time to list the corporate spyware embedded in articles would find unemployment quickly. I'm glad Norton wrote this piece. There are plenty of stories reporting methods of data collection, but personal accounts resonate more with audiences. Society needs more attention-grabbing stories like this one in order to act before something truly dreadful happens.
"Metrics have powerful influence on journalists' morale" finds me... not amused. Why the hell would anyone install a signboard displaying metrics in a newsroom? It's not as if reporters have nothing else to worry about: shrinking job opportunities, low pay for hard work, investor oversight, libel lawsuits, government prosecution. If people don't understand what the numbers mean, or the stats are poorly collected, what good does a signboard do? Of course it lowers morale. Maybe managerial staff should try interpreting the data first, or teach employees how to do so.