You’ve probably heard about the latest scandal from the Facebook world: Facebook has been experimenting on users and playing with their emotions. Yes, really.
Facebook offers a free service to users, which, as the saying goes online, means the users are the product. This latest development makes that all the more clear.
“Facebook data constitutes the largest field study in the history of the world.” — Adam Kramer, Facebook data scientist.
What Was The Experiment?
The full details of the experiment are noted in “Experimental evidence of massive-scale emotional contagion through social networks” on the PNAS site. It doesn’t take long to get the picture of what they did.
Basically, the data scientists at Facebook tweaked the feeds of certain users to reduce the number of positive or negative posts they saw, to see if that affected their mood. All they did was meddle with the feed algorithm, using automated word counting to classify posts as positive or negative. No actual posts were read by the scientists themselves, so the participants did not suffer any privacy breaches.
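To make the mechanism concrete, here's a rough sketch of how that kind of word-count classification and feed filtering could work. The word lists below are tiny hypothetical stand-ins (the researchers used established emotion-word dictionaries, not lists this small), and the function names are my own:

```python
# Hypothetical stand-in word lists -- real systems use dictionaries
# with thousands of emotion words.
POSITIVE_WORDS = {"happy", "great", "love", "awesome", "wonderful"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible", "angry"}

def classify_post(text):
    """Label a post 'positive', 'negative', or 'neutral' purely by
    counting emotion words -- no human ever reads the post itself."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress):
    """Drop posts of the suppressed valence from a user's feed."""
    return [p for p in posts if classify_post(p) != suppress]

feed = ["I love this wonderful day!", "What a terrible, awful commute."]
print(filter_feed(feed, suppress="negative"))
# → ['I love this wonderful day!']
```

The key point, as the quote above from the paper implies, is that the whole pipeline is automated: a classifier scores each post, and the feed algorithm simply shows or hides posts based on that score.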
That’s not to say that the experiment was ethically sound. For instance, if a depressed person saw a week’s worth of negative posts in their feed, it could worsen their mental state and possibly send them over the edge. We may never even know if this happened.
Was I Experimented On By Facebook?
Were you a part of Facebook’s experiment? Would you even know? Probably not.
It seems that participants weren’t consulted about the experiment beforehand, and they weren’t notified after the fact either. So there’s no way you would know if you had been a part of it, besides a nagging feeling that you had a particularly good or bad week of posts in your Facebook feed at some point in 2012.
It’s worth noting that Facebook would probably have limited the experiment to users of English, possibly just the English (US) setting, as this would have made the data analysis of the posts simpler. If you weren’t using English on Facebook during that time, then you probably weren’t in the experiment.
You Liked This
Whether you actually want to be experimented on is not the point. Facebook already has your agreement in the matter, simply by your acceptance of its terms and conditions. The only way you can avoid being a part of Facebook’s next experiment completely is to avoid using Facebook.
What This Could Mean For The Future
Imagine opening up Facebook and deciding you only wanted to hear good news today: “Today’s menu: no more war, no more politics, and no disasters. Just kittens, wedding announcements, Upworthy and Buzzfeed, thanks Facebook.”
That could be fantastic when you’ve had a hard day at work and just want a rest from all that hard news. It could also be an ideal way to filter the feeds of younger users – if you think young people need a bit of a buffer from bad news, that is.
“We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days, addressing the question about how emotional expression affects social engagement online.” — Facebook Data Scientists.
It’s quite likely that Facebook would use the results of this experiment to drive users to post more (since engagement is what they’re all about). So, if you haven’t been posting much, Facebook may decide they want to tinker with your feed to make it more positive or negative in order to provoke you into posting something.
But this tactic also could lead to whitewashing our news feeds entirely. Considering the demands governments have placed on social networks in the past, it’s not too far-fetched to suggest that during times of civil unrest a government may ask Facebook to just “keep it happy” for all users. This way, people may find themselves happily browsing Facebook for hours instead of getting outraged and heading to the streets. I certainly think it’s a possibility.
It’s also within the realm of possibility that Facebook could automatically detect posts about politics. They could block political discourse from hitting your feed, or worse, they could potentially manipulate it to make you vote a certain way. Not that most people need any help in this department, as we generally create our own information bubbles.
Filter Bubbles Are Damaging Society
The “filter bubble” term was coined by Eli Pariser, and it’s worth checking out what he said about them in his TED talk.
“The primary purpose of an editor is to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.” — Eli Pariser
Avoiding Facebook’s Psych Department
While you can’t completely opt out of Facebook’s psych experiments short of deleting your Facebook account and leaving the service entirely, there are still things you can do to minimise any adverse effects of Facebook’s meddling – and generally to stop Facebook knowing so much about you. For starters, you can limit the pages you “like” and the life events you add into the system, as this lets you control the sorts of Facebook advertising you are presented with.
Then you can make sure Facebook doesn’t track you online by using browser extensions, avoiding Facebook Connect and blocking cookies in your browser. If you’re in the US, you can also opt out of data collection via third parties.
It also may be possible to avoid experiments involving data analysis on words just by avoiding the most likely language the scientists will use: English (US). Just change your language settings to English (UK) or English (Pirate) and you’ll be removed from their primary target language. That’s not to say they won’t extend their reach one day, but it might help in the short term.
Lastly, to avoid experiments like the most recent one, you can stop letting Facebook filter your news feed for you. Use friends lists to view your feed, or view your feed by “Most Recent” instead of “Top Posts” (bookmark the Most Recent link). There are also dozens of alternative Facebook clients and useful browser extensions that will help you get the feed you want.
Did You Like This Experiment?
Facebook’s experiment is outrageous, terrifying, dangerous and irresponsible. However, it’s also quite fascinating. What was your reaction to it? How would you feel if you knew you’d been experimented on?