
Facebook is facing a surge of protest after admitting that it modified users’ accounts to manipulate their emotions in a disturbing experiment.

Researchers from Cornell University and the University of California altered hundreds of thousands of users’ accounts, filtering ‘positive’ or ‘negative’ emotional content out of their feeds to see whether it could make them happier or sadder – without asking permission or telling them afterwards.

They filtered the information going into the ‘news feeds’ – the constant flow of links, videos, pictures and comments from friends – of 689,000 users. When ‘positive emotional content’ from friends was reduced, users posted more negative content themselves; the opposite happened when ‘negative emotional content’ was reduced.

This effect has been labelled ‘emotional contagion’.

The study, published earlier this month in the journal Proceedings of the National Academy of Sciences of the USA, concluded: ‘Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.’

Facebook confirmed it had commissioned the study, but played down its significance. The chilling insight into the power of social media – which already has access to a wealth of users’ private information – has worried politicians and internet activists alike.

Labour MP Jim Sheridan, a member of the Commons media select committee, told the Guardian, “This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people.”

“They are manipulating material from people’s personal lives and I am worried about the ability of Facebook and others to manipulate people’s thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it,” he added.

In response, a Facebook spokesman said the research was conducted over a single week and that none of the data was associated with a specific person’s account. The site, he said, simply wanted to make its content more ‘relevant and engaging’.

“A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow,” the spokesman added.

Kathryn A. Orbigozo
ABCom Intern
