
Saturday, July 05, 2014

We shouldn't expect Facebook to behave ethically | Technology | The Observer

"In case you missed it, here's the gist of the story. The first thing users of Facebook see when they log in is their news feed, a list of status updates, messages and photographs posted by friends. The list that is displayed to each individual user is not comprehensive (it doesn't include all the possibly relevant information from all of that person's friends). But nor is it random: Facebook's proprietary algorithms choose which items to display in a process that is sometimes called "curation". Nobody knows the criteria used by the algorithms – that's as much of a trade secret as those used by Google's page-ranking algorithm. All we know is that an algorithm decides what Facebook users see in their news feeds.

So far so obvious. What triggered the controversy was the discovery, via the publication of a research paper in the prestigious Proceedings of the National Academy of Sciences, that, for one week in January 2012, Facebook researchers deliberately skewed what 689,003 Facebook users saw when they logged in. Some people saw content with a preponderance of positive and happy words, while others were shown content with more negative or sadder sentiments. The study showed that, when the experimental week was over, the unwitting guinea-pigs were more likely to post status updates and messages that were (respectively) positive or negative in tone.

Statistically, the effect on users was relatively small, but the implications were obvious: Facebook had shown that it could manipulate people's emotions! And at this point the ordure hit the fan. Shock! Horror! Words such as "spooky" and "terrifying" were bandied about. There were arguments about whether the experiment was unethical and/or illegal, in the sense of violating the terms and conditions that Facebook's hapless users have to accept. The answers, respectively, are yes and no: corporations don't do ethics, and Facebook's T&Cs require users to accept that their data may be used for "data analysis, testing, research"."
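
To make the mechanism concrete: according to the published paper, posts were classified as emotionally positive or negative by counting words from sentiment lexicons (the researchers used LIWC), and posts of the targeted kind were then omitted from news feeds with some probability. Here is a toy Python sketch of that idea. The word lists, the probability and the function names are my own stand-ins for illustration, not Facebook's actual code, which remains secret.

import random

# Tiny word lists standing in for the LIWC lexicons the study used;
# the real lexicons contain thousands of entries.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "terrible", "hate", "lonely"}

def sentiment(post):
    # Crude word-count classifier: positive, negative or neutral.
    words = set(post.lower().split())
    pos = len(words & POSITIVE)
    neg = len(words & NEGATIVE)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def skew_feed(posts, suppress, p=0.5):
    # Omit each post of the suppressed sentiment with probability p,
    # mimicking the study's reduced exposure to one emotional category.
    return [post for post in posts
            if sentiment(post) != suppress or random.random() > p]

feed = [
    "I love this wonderful day",
    "Feeling sad and lonely today",
    "Meeting moved to 3pm",
    "Great news, so excited",
]
print(skew_feed(feed, suppress="positive"))

Run it a few times and roughly half of the "positive" posts drop out of the feed, which is all the experiment needed in order to shift the emotional tone of what users saw.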



