
You’ve probably heard about the latest scandal from the Facebook world: Facebook has been experimenting on users and playing with their emotions. Yes, really.

Facebook offers a free service to users, and as the Internet-savvy like to say, when the service is free, the users are the product. This latest development makes that all the more clear.

“Facebook data constitutes the largest field study in the history of the world.” — Adam Kramer, Facebook data scientist.

What Was The Experiment?

The full details of the experiment are noted in “Experimental evidence of massive-scale emotional contagion through social networks” on the PNAS site. It doesn’t take long to get the picture of what they did.

Basically, the data scientists at Facebook tweaked the feeds of certain users to show only positive or only negative posts, to see if that affected their mood. All they did was meddle with the algorithm, using automated text analysis to classify posts as positive or negative. No actual posts were read by the scientists themselves, so the participants did not suffer any direct privacy breach.
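To get a feel for what "automated text analysis" means here: the published study relied on word-count methods (the LIWC lexicon), not on anyone reading posts. Below is a toy sketch of that idea in Python. The word lists are tiny, made-up examples for illustration, not the real LIWC lexicon, and the real system would be far more sophisticated.

```python
# Toy word-count sentiment classifier, loosely in the spirit of the
# LIWC-style analysis the study describes. The word lists here are
# hypothetical stand-ins, not the actual lexicon Facebook used.

POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "angry", "awful", "terrible", "lonely"}

def classify_post(text):
    """Label a post 'positive', 'negative' or 'neutral' by word counts."""
    words = [w.strip(".,!?") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

print(classify_post("What a great, wonderful day!"))   # positive
print(classify_post("Feeling sad and lonely today."))  # negative
```

A classifier like this never exposes a post to a human reader, which is why Facebook could argue no one's privacy was breached, even as the feed itself was being manipulated.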

That’s not to say that the experiment was ethically sound. For instance, if a depressed person saw a week’s worth of negative posts in their feed, it could worsen their mental state and possibly send them over the edge. We may never even know if this happened.

Was I Experimented On By Facebook?

Were you a part of Facebook’s experiment? Would you even know? Probably not.


It seems that participants weren’t consulted about taking part in the experiment, and they weren’t notified after the fact either. So there’s no way you would know if you had been a part of it, besides a nagging feeling that you had a particularly good or bad run of posts in your Facebook feed sometime in 2012.

It’s worth noting that Facebook would probably have limited the experiment to users of English, possibly just the English (US) setting, as this would have made the data analysis of the posts simpler. If you weren’t using English on Facebook during that time, then you probably weren’t in the experiment.

You Liked This


Whether you actually want to be experimented on is not the point. Facebook already has your agreement in the matter, simply by your acceptance of its terms and conditions. The only way you can avoid being a part of Facebook’s next experiment completely is to avoid using Facebook.

What This Could Mean For The Future

Imagine opening up Facebook and deciding you only wanted to hear good news today: “Today’s menu: no more war, no more politics, and no disasters. Just kittens, wedding announcements, Upworthy and BuzzFeed. Thanks, Facebook.”

That could be fantastic when you’ve had a hard day at work and just want a rest from all that hard news. It could also be an ideal way to filter the feeds of younger users – if you think young people need a bit of a buffer from bad news, that is.

“We also observed a withdrawal effect: People who were exposed to fewer emotional posts (of either valence) in their News Feed were less expressive overall on the following days, addressing the question about how emotional expression affects social engagement online.” — Facebook Data Scientists

It’s quite likely that Facebook would use the results of this experiment to drive users to post more (since engagement is what they’re all about). So, if you haven’t been posting much, Facebook may decide to tinker with your feed, making it more positive or negative in order to provoke you into posting something.

But this tactic also could lead to whitewashing our news feeds entirely. Considering the demands governments have placed on social networks in the past, it’s not too far-fetched to suggest that during times of civil unrest a government may ask Facebook to just “keep it happy” for all users. This way, people may find themselves happily browsing Facebook for hours instead of getting outraged and heading to the streets. I certainly think it’s a possibility.

It’s also within the realms of possibility that Facebook can automatically detect posts about politics. They could block political discourse from hitting your feed, or worse, they could potentially manipulate it to make you vote a certain way. Not that most people need any help in this department, as we generally create our own information bubbles.

Filter Bubbles Are Damaging Society

If you know a bit about filter bubbles, you can quickly see how having Facebook deliberately manipulate the filter bubble for users could be a devastating thing for society.

The “filter bubble” term was coined by Eli Pariser, and it’s worth checking out what he said about them in his TED talk.

“The primary purpose of an editor is to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.” — Eli Pariser

Avoiding Facebook’s Psych Department

While you can’t completely opt out of Facebook’s psych experiments without deleting your Facebook account and avoiding Facebook, there are still things you can do to minimise any adverse effects caused by Facebook’s meddling – and generally to stop Facebook knowing so much about you. For starters, you can limit the pages you “like” and the life events you add into the system, as this lets you control the sorts of Facebook advertising you are presented with.

Then you can make sure Facebook doesn’t track you online, by using extensions, avoiding Facebook Connect and making your browser block cookies. If you’re in the US, you can also opt out of third-party data collection.

It may also be possible to avoid experiments involving word-based data analysis simply by avoiding the language the scientists are most likely to use: English (US). Just change your language setting to English (UK) or English (Pirate) and you’ll no longer be in their primary target language. That’s not to say they won’t extend their reach one day, but it might help in the short term.


Lastly, to avoid experiments like the most recent one, you can stop letting Facebook filter your news feed for you. Use friends lists to view your feed, or view your feed by “Most Recent” instead of “Top Posts” (bookmark the Most Recent link). There are also dozens of alternative Facebook clients and useful browser extensions that will help you get the feed you want.

Did You Like This Experiment?

Facebook’s experiment is outrageous, terrifying, dangerous and irresponsible. However, it’s also quite fascinating. What was your reaction to it? How would you feel if you knew you’d been experimented on?

  1. kayla
    July 5, 2014 at 2:16 am

    The best resource for creating blogs or videos for direct sales is Modulates. Any other opinions?

  2. Adolf
    July 3, 2014 at 3:36 pm

    I am offended to be part of a research experiment. Now I have been mentally affected and am ffek kespf eelg eg8j...

    • Angela A
      July 4, 2014 at 12:50 pm

      Hehe. :)

  3. Bob Myers
    July 3, 2014 at 1:00 pm

    One more reason I don't use "Social Networking" sites.

    • Angela A
      July 4, 2014 at 12:50 pm

      And fair enough, too!

    • Bob Myers
      July 4, 2014 at 4:24 pm

      Dear Ms. Angela,

      Thank you for the complement, which I believe and hope your reply was.

      According to an article in today's MUO, there is one lady who agrees with this colonial.

  4. Maryon Jeane
    July 3, 2014 at 9:24 am

    Excellent article and 'heads up' - it's for this reason that I read makeuseof (and plough through all the stuff about games to get to the meat...!). More articles like this, please.

  5. Antriksh Y
    July 3, 2014 at 9:23 am

    I have attended several tech talks by Facebook employees at my university. They do these tests all the time and are very open about it. This test got media attention somehow and got blown up.

    The engineers say that they have several thousand tiny variations of Facebook with slightly different behavior and functionality that they test on all their users. They make changes and see how people react to various features every week. That's how they develop their software. Sort-of similar methods of testing are also employed by Google.

    This is not new.

    • Angela A
      July 4, 2014 at 12:46 pm

      This is what I would have expected. I guess this one was a little more lab-rat and less to do with functionality. But either way, people hate them for it!

  6. Chris
    July 3, 2014 at 8:25 am

    I'm sorry, but I am offended. First of all they provide a service which is "Social Networking" not "Psychological Experiments on Users". Signing up for Facebook, whatever their T&C says, does not give them permission to experiment on people, that's the ethical part. That we are people and we have the right to know. Also, anyone or a group can still sue in this coumtry, doesn't matter what FB says is legal. I am outraged at this because it is a trust and decency issue and is like someone using your Webcam to spy on you which would be an invasion of privacy, to see how you live. Hey its an experiment. I am almost wanting to delete my FB account over this and hope there is enough media that FB may change their ways. Time will tell and if somethings don't change I will be off their system for good.

    • Angela A
      July 4, 2014 at 12:45 pm

      It's certainly beyond what most people expected, so they should have considered there might be people opposed to it.

      As for deleting the account, it might be wise. There is no doubt more to come.

  7. MrX
    July 3, 2014 at 8:22 am

    Maybe people should read the terms of use. It is easy to point fingers at Facebook, and it's understandable but we all need to take a good look at our self's. This is our fault as well. We all accepted the terms of use, probably without reading it.

    • Angela A
      July 4, 2014 at 12:43 pm

      True. Entirely true. I wonder what else most of us agreed to in Facebook's T&Cs?

  8. Koshy G
    July 3, 2014 at 7:10 am

    I don’t think facebook did anything wrong. I think the people who got offended by the test are stupid. Nobody is forcing anyone to use facebook. Stop complaining, if you don't like it just stop using it. And I found the test in itself was not too sinister but that is besides the point, they could have done something wildly inappropriate but they are within their right to do, as you have consented to it. As a private profit seeking company they should what ever that is in their interest, if it is legal. If anybody has a problem with that they can choose not use it and switch to another social network that doesn't do that. The experiment was clearly intune the their terms and conditions, Facebook said they were going to do it, Its your fault for not paying attention. You said it was ok for facebook to do it, But when they did actually did it you are outraged. If you don’t like their policy stop using it and make another social network with policies you like.

    People had an opportunity to stop using it before the the experiment. They could have never signed up in the first place. Saying nobody reads the T&C is no excuse for anything. You should have read it. Not reading it is your fault. If you killed somebody and say I don't read the law because nobody reads all the laws is stupid and does not does not justify the crime, It is your job to read it, if you don't read you have to accept the consequences. When you press the I have read T&C check box, you are supposed to read T&C and not skip it. The excuse nobody reads the T&C is stupid. Nobody is forcing you to sign up to series with long and boring T&C. If you then sign up for services by skipping the T&C you should expect the worst and accept that you could have sold them your soul.
    Facebook said they were going to experiment with you, agreed to it. Now when they did what you agreed to, you are getting angry. They couldn't be more clearer, you said you were ok with it. If you didn't like the policy you shouldn't have signed up. You can complain about the terms of the T&C and not signup but you cannot agree to it and then get angry when they did what they said they would do.

    It is our responsibility to read the T&C, self harm caused by not doing what you were supposed to do is your fault.

    I not here defending facebook or anything, all I am saying is that you have to own up and accept that what happened to you was your fault, you were the one who didn't read the T&C. If you felt the terms where unreasonable or too vague you shouldn't have accepted it. If you feel that what they did was against the terms and conditions you should sue. Terms are conditions are there for a reason. The Law and the courts are there for a reason

    • Angela A
      July 4, 2014 at 12:42 pm

      I think the main problem is that no-one reads terms and conditions properly and that not many people would infer from those terms and conditions that Facebook would potentially do this. We now know better.

      Nice rant, BTW. :)

  9. sneakily1
    July 3, 2014 at 2:34 am

    Facebook = Big Brother
    If you're still on there, it's all your fault... remember, you agreed to their terms when you signed up.

    • T
      July 3, 2014 at 3:29 am

      I was going to write a post but instead, ^Ditto what he said.

    • Angela A
      July 4, 2014 at 12:40 pm

      Isn't Facebook just one of the big brothers? Can't hide from them all.

  10. Jamief
    July 3, 2014 at 2:08 am

    Doesn't matter to me. It was just a Facebook news feed. It's not like they stoke my lunch money or punched one of my kids in the face. Google manipulates search results and news websites strategically select articles to present to you.

    • Angela A
      July 4, 2014 at 12:40 pm

      Quite true. I'd prefer Facebook manipulating which baby photos I see rather than Google deciding which "chinese restaurants" I'd like to find when I search.

  11. Christine S
    July 3, 2014 at 1:59 am

    As a researcher in this area I would like to be outraged, but more than anything I can't get past the irony that so many of us take to Facebook to complain about Facebook.

    Here is my interview with ABC News on The Ethics of Facebook's experiment.
    http://www.youtube.com/watch?v=mP73lZbkAug&list=UU9RI4x-z39AcEWiNKnyECpA&feature=share

    • Guy M
      July 3, 2014 at 11:45 am

      I would upvote your comment, but it doesn't seem to be working for me right now.

    • Angela A
      July 4, 2014 at 12:39 pm

      Too right. If we all complain about it on Facebook then the rest of the world won't hear. :)

  12. Guy M
    July 3, 2014 at 1:03 am

    How do I feel about the experiment? Exactly as my Silicon Valley and Washington D.C. overlords would have me feel.

    • Angela A
      July 4, 2014 at 12:38 pm

      Haha. Don't we all.

  13. Charlene F
    July 3, 2014 at 12:42 am

    Really helpful article. I was very upset that facebook did this experiment. Some great advice here. I will be using it! :)

    • Angela A
      July 4, 2014 at 12:38 pm

      Thanks - glad to help!

  14. Dave P
    July 2, 2014 at 10:46 pm

    I think this experiment was an outrageous stunt to pull without first getting permission from those involved. Their T&Cs might have ensured this was legal, but it certainly doesn't make it ethical.

    I don't think I even have enough friends on Facebook to have allowed this experiment to work on me. But it's very worrying that Facebook was able and willing to do this, as it suggests they could push things a lot further in the future.

    • Dann A
      July 3, 2014 at 6:03 am

      I totally agree, Dave. Even if we did agree to being part of research done by Facebook, I'm confident that most people, if they would have read that, would not at all have been expecting this sort of research. And while you can argue that that matters or doesn't matter, it seems like such a violation of our expectations and what was implied is indeed unethical.

      Also, most research projects done in university settings need to have an ethics application approved by an institutional review board, which serves to protect the interests of research participants. I seriously doubt Facebook took the time to fill out an ethics form and get it approved. Whether or not they did something highly unethical here, it has to be a worry that they will in the future.

    • Angela A
      July 4, 2014 at 12:37 pm

      The stupid thing is, if they'd just asked their target audience in the Facebook feed if they'd like to participate in "some sort of experiment" they could have gotten willing participants who still would have had no idea what the experiment was. They were crazy to pull this.
