
Facebook is creepy power fiend

By Nicola Mawson, Contributor.
Johannesburg, 30 Jun 2014
Facebook's manipulation of hundreds of thousands of news feeds was a bid to improve its service, it says.

Facebook's "creepy" psychological manipulation of almost 700 000 users' news feeds has sparked outrage as it suggests underhanded dealings, and shows just how much power the popular social media network has over what people are feeling.

A can of worms was opened this weekend when it was revealed that Facebook users were the subjects of research that manipulated the news feeds people received over a week-long period in January 2012. The backlash was immediate, with angry users taking to Twitter to protest what many see as an ethical breach.

Liron Segev, Swift Consulting CEO, explains that Facebook's "psychological manipulation" is possible because of the sheer volume of data it controls, as well as its ability to mine that information. People must understand they have no control over what Facebook does with the information posted on their virtual walls, he says.

During the one-week period, 689 003 people were subjected to feeds in which either positive or negative posts were filtered out, so that researchers could determine whether emotional states can be transferred to others without the recipients being aware. The results are apparently "some of the first experimental evidence to support the controversial claims that emotions can spread throughout a network", according to the research paper published in the Proceedings of the National Academy of Sciences.
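For readers curious about the mechanics, the design amounts to an A/B test on feed filtering: each user lands in an experimental arm where posts of one emotional tone are sometimes left out of a given feed load, and the tone of that user's own subsequent posts is then measured. The Python sketch below is purely illustrative and is not Facebook's code; the word lists, function names and 50% omission probability are assumptions made for the example (the actual study scored posts with word-counting software).

```python
import random

# Toy word lists standing in for the word-counting classifier the study used.
POSITIVE = {"great", "happy", "love", "awesome"}
NEGATIVE = {"sad", "awful", "angry", "terrible"}

def sentiment(post: str) -> int:
    """Return > 0 for positive-leaning posts, < 0 for negative-leaning ones."""
    words = post.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def assign_arm(rng: random.Random) -> str:
    """Randomly place a user in one experimental arm."""
    return rng.choice(["reduce_positive", "reduce_negative", "control"])

def load_feed(posts, arm, rng, omit_prob=0.5):
    """Build one feed load, probabilistically omitting posts whose tone
    matches the arm's target. Nothing is deleted: an omitted post can
    still appear on a later load, matching Kramer's description."""
    kept = []
    for post in posts:
        score = sentiment(post)
        targeted = (arm == "reduce_positive" and score > 0) or \
                   (arm == "reduce_negative" and score < 0)
        if targeted and rng.random() < omit_prob:
            continue
        kept.append(post)
    return kept

# Example: one user, one feed load.
rng = random.Random(42)
arm = assign_arm(rng)
feed = load_feed(["What a great day", "Feeling sad today", "Lunch time"], arm, rng)
```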

No control

Segev says Facebook is so powerful it can influence our moods and predict what we are going to do next. However, he notes, this level of control is dangerous: Facebook can influence far more than whether users are happy or sad, a power that could extend to subverting the outcome of elections, or manipulating share prices. "If you're already an emotional wreck and just need a little thing to push you over the edge, that would do it."

Facebook users are pawns on a chessboard for what is the equivalent of the world's third most populous country; mere bits of data it can do with as it pleases, says Segev. He notes the outrage arose because people were unwitting subjects of covert manipulation, unlike the expected influence of other media sources, which people can opt out of.

Despite the outrage, Segev does not expect people to leave Facebook in droves; users will simply put up with the site's growing power. Facebook also has an army of lawyers and complicated user policies that are constantly being updated, making any legal fallout unlikely.

Twitter backlash

Facebook users took to Twitter to complain about its experiment. Here is some of what was posted:
@MarkCTN: "Facebook emotion experiment sparks criticism http://bbc.in/1nWz66l ...the real big brother... selling our info."
@edyong209: "Can't wait to read Facebook's PNAS paper about its cunning social contagion experiment to fill Twitter with angry messages about Facebook."
@kobusehlers: "I can't imagine Facebook's 'Mood Experiment' would get past the ethics board at any decent university. #scarystuff."
@LOLGOP: "Very upset to learn Facebook has been conducting an experiment to see how many Candy Crush Saga requests I can tolerate from my aunt."
@dangillmor: "The researchers in the Facebook emotion-manipulation experiment should be ashamed of themselves."
@sarahcuda: "Facebook's science experiment on users shows the company is even more powerful and unethical than we thought."
@katecrawford: "Perhaps what bothers me most about the Facebook experiment: it's just one glimpse into an industry-wide game. We are A/B testing the world."

Social media lawyer Paul Jacobson adds that Facebook's terms of use are broad enough to cover its use of data in this fashion. Facebook itself says: "We... put together data from the information we already have about you, your friends, and others, so we can offer and suggest a variety of services and features. For example, we may make friend suggestions, pick stories for your News Feed, or suggest people to tag in photos."

Users also consent to the use of their information for "data analysis, testing, [and] research".

We did it for you

Facebook shrugged off the criticism in a response sent to several media outlets, saying research is needed to improve its services, and to "make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends, or information from pages they follow."

Adam Kramer, who co-authored the paper, writes on his Facebook wall that it was "important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out".

The researchers were also worried that negative posts "might lead people to avoid visiting Facebook", says Kramer, arguing the study only affected around 0.14% of Facebook's more than a billion users. He adds that posts were not hidden; they simply did not show up on some loads of the news feed.

What was so creepy about Facebook's research was the level of sophistication and reach involved, says Jacobson, who notes this sort of overtly emotional manipulation could well be a growing trend.
