Facebook is a creepy power fiend
Facebook's "creepy" psychological manipulation of almost 700 000 users' news feeds has sparked outrage, as it suggests underhanded dealings and shows just how much power the popular social network has over what people are feeling.
A can of worms was opened this weekend when it was revealed that Facebook users had been the subject of research that manipulated the content of their news feeds for a week-long period in January 2012. The backlash was immediate, with angry users taking to Twitter to protest what many see as an ethical breach.
Liron Segev, Swift Consulting CEO, explains Facebook's "psychological manipulation" is possible because of the sheer volume of data it controls, as well as its ability to mine information. People must understand they have no control over what Facebook does with the information posted on virtual walls, he says.
During the one-week period, 689 003 people were shown feeds skewed either happy or sad, so that researchers could determine whether emotional states can be transferred to others without the recipients being aware. The results are "some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network", according to the research paper published in the Proceedings of the National Academy of Sciences.
Segev says Facebook is so powerful it can influence our moods and predict what we are going to do next. However, he notes, this level of control is dangerous, and Facebook can influence more than whether users are happy or sad, a power that could extend to subverting the outcome of elections or manipulating share prices. "If you're already an emotional wreck and just need a little thing to push you over the edge, that would do it."
Facebook users are pawns on a chessboard for what is the equivalent of the world's third-largest country; mere bits of data that it can do with as it pleases, says Segev. He notes the outrage arose because people were unaware they were inadvertent subjects of covert manipulation, unlike the expected influence from other media sources, which people can opt out of.
Despite the outrage, Segev does not expect people to leave Facebook in droves, because people will put up with the site's growing power. Facebook has an army of lawyers, and complicated user policies that are constantly being updated, making any legal fallout unlikely.
Users also agree to the use of their information for "data analysis, testing, [and] research".
We did it for you
Facebook shrugged off the criticism in a response sent to several media outlets, saying research is needed to improve its services, and to "make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."
Adam Kramer, who co-authored the paper, writes on his Facebook wall that it was "important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out".
The researchers were also worried that negative posts "might lead people to avoid visiting Facebook", says Kramer, who argues the study only affected around 0.14% of Facebook's more than a billion users. He adds that posts were not hidden; they simply did not show up on some news feed loads.
What was so creepy about Facebook's research was the level of sophistication and reach involved, says Jacobson, noting that this sort of manipulation, which is overtly emotional, could well be a growing trend.