It’s all over the news, and it’s not looking good for Facebook or the researchers behind a newly published study. In case you haven’t heard, Facebook teamed up with a group of researchers and manipulated the emotions of almost 700,000 of you during a week in January 2012 (http://www.pnas.org/content/111/24/8788.full). The paper describes ‘emotional contagion’: people exposed to more positive posts in their news feeds go on to post more positive things themselves, and vice versa for sad posts. However, this shouldn’t come as a surprise. I’m not saying the results are unsurprising; I’m saying that you shouldn’t be surprised that Facebook manipulated your news feed to get an emotional response.
People are outraged that Facebook has played with our emotions and made us feel, and therefore act, in certain ways; some have even equated it to brainwashing. But Facebook does this ALL THE TIME. Your news feed is constantly tailored to the things you find most interesting, the people you interact with most and, more importantly, the ads and pages you’re most likely to click on, based on a plethora of data they have collected about you. You are always being manipulated by Facebook; usually it’s so they can make money, but this time it was for scientific research, and this time everyone is outraged! I’m not saying any of what Facebook is doing is right, but it isn’t new, and if I’m going to be exposed to tailored posts either way, I’d rather it be for scientific research than so they can make money out of me.
Psychological research is a little outside my comfort zone, as I only took one module in psychology during my undergraduate degree. I have, however, read the original paper, something that is essential to do when stories like these hit the public media. Quite simply, we’ve always known that when people we know are sad it can bring everyone around them down a bit, and that when our best friend is happy, we’re happy. What this research has achieved is showing that we don’t have to be in the presence of someone who is sad or happy to feel sad or happy ourselves. We don’t even need to hear them say anything about their emotional wellbeing; we just need to read it. Written words alone, without in-person interaction or nonverbal cues, are enough for emotional contagion to occur.
“These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. This work also suggests that, in contrast to prevailing assumptions, in-person interaction and nonverbal cues are not strictly necessary for emotional contagion, and that the observation of others’ positive experiences constitutes a positive experience for people.”
The people most scared by this research have gone as far as to call it ‘thought control’. Needless to say, those who have publicly voiced this concern are politicians who fear the idea could be used in the build-up to elections to sway voters’ opinions. I can’t help but be cynical and say that, given the opportunity, they would team up with Facebook to their own advantage anyway. One point I must stress is that the difference between the control and experimental groups’ behaviour was tiny. It was statistically significant, but when positivity was reduced, the percentage of positive words used by users decreased by only 0.1%, and the percentage of negative words used increased by 0.04%. Is that really enough to change someone’s political opinion?!
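To get a feel for how tiny those shifts are, here is a quick back-of-envelope calculation. The 0.1 and 0.04 figures come from the study as quoted above; the 1,000-word weekly baseline is purely a hypothetical number for illustration, treating the reported changes as percentage-point shifts:

```python
def word_change(total_words: int, pct_point_change: float) -> float:
    """Absolute change in word count for a given percentage-point shift."""
    return total_words * pct_point_change / 100

# Suppose, hypothetically, a user writes ~1,000 words of status updates in a week.
fewer_positive = word_change(1000, 0.1)   # 0.1 percentage-point drop in positive words
more_negative = word_change(1000, 0.04)   # 0.04 percentage-point rise in negative words

print(fewer_positive)  # about one fewer positive word
print(more_negative)   # less than half of one extra negative word
```

In other words, under that assumption, the experiment amounted to roughly one positive word per thousand, which puts the ‘thought control’ fears in perspective.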
One lesson to be learnt from all this is that Facebook hasn’t broken any rules: agreement to take part in such research is in the data use policy that we all accepted (but who really read that?!). Does this count as informed consent? Facebook says yes; I’d probably disagree, along with several other academics, but the situation is hazy. Ethically, the study has a lot to answer for. Surely they should have realised there would be a public outcry, and surely that’s reason enough to re-think. All they would have needed to do was send an annoying pop-up notification asking users for their consent and get us to tick a new box. Based on the number of Candy Crush notifications I get weekly, this really wouldn’t have been an inconvenience on their part!
What this research has done is not watertight, but it’s not unexpected or illegal. If you are on an anti-Facebook rampage, delete your account. But while you’re at it, please also delete your accounts with, or stop your interactions with, Twitter, Amazon, Google, Netflix, YouTube and many other big players…