Facebook’s secret experiment should surprise no one


Facebook is facing widespread criticism for conducting a psychology experiment on 700,000 of its users without their knowledge, but the reaction is unwarranted.

The US-based social network manipulated what users saw in their newsfeeds, controlling the emotional expressions they were exposed to on the site, in order to determine whether “exposure to emotions led people to change their own posting behaviours”.

The company worked with Cornell University and the University of California at San Francisco on the study.

There have been calls for the social network to apologise, and claims that the study broke ethical guidelines, but what does not seem to be discussed in the numerous pieces of outrage is that manipulating the newsfeed to get people to use the site more is exactly what Facebook has always done.

Facebook uses a variety of signals to determine what you see in your newsfeed – including which pages you’ve “liked”, which posts you have interacted with (clicked) in the past, and which posts you have shared – to try to show you posts that are most likely to get you to interact and share again. The more people share posts, the more people use Facebook – and that is exactly what Facebook wants.
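To make the idea concrete, the kind of signal-based ranking described above can be sketched in a few lines of Python. This is a deliberately simplified toy – the weights, field names, and scoring function are all hypothetical and bear no relation to Facebook’s actual (and far more complex) algorithm – but it shows how likes, past clicks, and past shares could be combined to decide what surfaces first in a feed:

```python
# Toy newsfeed ranking: score each post by engagement signals.
# All weights and field names are hypothetical, for illustration only.

def score(post, user):
    s = 0.0
    if post["page"] in user["liked_pages"]:
        s += 2.0                                      # user has "liked" this page
    s += 1.5 * user["clicks"].get(post["page"], 0)    # past interactions (clicks)
    s += 3.0 * user["shares"].get(post["page"], 0)    # past shares weigh heaviest
    return s

def rank_feed(posts, user):
    # Highest-scoring posts appear first in the feed
    return sorted(posts, key=lambda p: score(p, user), reverse=True)

user = {
    "liked_pages": {"news"},
    "clicks": {"news": 2, "pets": 1},
    "shares": {"pets": 3},
}
posts = [
    {"id": 1, "page": "news"},
    {"id": 2, "page": "pets"},
    {"id": 3, "page": "sport"},
]
print([p["id"] for p in rank_feed(posts, user)])  # → [2, 1, 3]: shares outrank likes
```

Tweaking the weights in `score` is all it takes to change what a user sees most of – which is precisely the lever the experiment pulled.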

So the company decided to try manipulating people’s emotions to see if that would have an effect on how they use Facebook – so what? Viral content sites like Upworthy have been developed with the single focus of publishing posts which manipulate the user into action and sharing, whether that is through positivity or outrage, and newspaper headlines have been doing this for decades – so why are we so outraged that Facebook has done the same?

Generally, if a researcher is looking for people to take part in an experiment, they seek “informed consent” – the participant’s agreement to be part of the study before it starts. This is done to protect participants and to keep scientists from pushing boundaries too far. However, what the comments of outrage seem to miss is that by signing up for Facebook, we have already agreed to be part of this type of manipulation through our newsfeed. By using the service we have agreed to let Facebook’s algorithm determine what we see in our newsfeeds, however the company decides to manipulate the feed – and thinking it won’t do this to manipulate our responses is naive.

