Social Media

Facebook’s Fascinating (and Disturbing) History of Secret Experiments

Anya Zhukova 27-04-2017

To make money, Facebook doesn’t just need users. It needs users who are active and engaged. It needs to know not just which link you’re likely to click, but also what makes you more or less likely to click it.


How does Facebook gather that kind of information?

By looking at your daily Facebook activity, for one. Analyzing the posts and pages that you like. And by running psychological experiments.

Wait, what?

Where the Wild Things Are

Image Credit: Shaiith via Shutterstock

Yes, Facebook has been conducting social experiments on its users. And yes, chances are you’ve involuntarily taken part at some point.


Is there a way to know for sure? Not really. But we’ve put together a list of Facebook experiments on users that are now known to the public. Have a look through these and see if anything rings a bell.

1. Massive-Scale Emotional Contagion

When: 2012

Number of people involved: 689,003

What happened: Facebook data scientists manipulated the news feeds of almost 690,000 users, showing some of them more positive updates and others more negative ones. All to see how it affected the users’ moods.


If there was a week in January 2012 when you were seeing only dead kittens or cute puppies in your feed, you might have been part of the study. The real mood swing, however, happened when the experiment became public.

The study was widely described as “disturbing,” mainly because of its ethics. After all, it involved hundreds of thousands of users unknowingly participating in research that may have made them either happier or more depressed than usual.

What Facebook found out: Our emotions can indeed be affected by what we’re exposed to on Facebook.

Did Facebook violate your privacy? Many people say that it did. And we’re inclined to agree. Even if this type of manipulation can’t be classified as a privacy violation, it definitely seems unethical.


2. Social Influence in Social Advertising

Image Credit: Chinnapong via Shutterstock

When: 2011

Number of people involved: 29 million

What happened: In this study, Facebook wanted to find out whether ads work better on you when your friends endorse them. It showed users two different types of ads — with and without endorsements like “Peter Parker liked this” — and then measured how many clicks each type got.


What Facebook found out: The stronger your bond with a friend is, the more likely you are to click the link.

Did Facebook violate your privacy? No. This is the kind of study you’d expect Facebook to conduct to improve its marketing strategies.

3. Exploring Requests for Help on Facebook

When: Summer 2012

Number of people involved: 20,000

What happened: Facebook researchers singled out status updates containing requests, like “Can someone recommend a movie for tonight?” or “I need a ride to work tomorrow.” They were interested in who regularly asks for help rather than whether those users actually received it.

What Facebook found out: Users who have a lot of friends on Facebook but visit the network less often are more likely to ask for help.

Did Facebook violate your privacy? No. The updates the researchers analyzed were public, so there was no real invasion of privacy.

4. The Spread of Emotion via Facebook

When: Some time before 2012 (when it went public)

Number of people involved: 151 million

What happened: Facebook was trying to find out if your emotional state affects your friends. They looked at one million users’ status updates, both positive and negative, and then looked at the positivity or negativity of the posts of those users’ 150 million friends.

What Facebook found out: Over the three days of the study, the researchers found that friends of users who posted positive updates suppressed their own negative posts, and vice versa. According to the study, if you post something positive on Facebook, one out of every 100 of your friends (who wouldn’t have otherwise) will do the same within three days.

Did Facebook violate your privacy? Could go either way. This study is believed to have led to the big emotion manipulation experiment mentioned earlier in the article.

5. Self-Censorship on Facebook

Image Credit: Igorstevanovic via Shutterstock

When: July 2012

Number of people involved: Almost 4 million

What happened: Facebook tracked every entry of more than five characters that users typed but didn’t post within 10 minutes.

What Facebook found out: 71 percent of the users “self-censored,” drafting comments that they never posted. Many others edited their posts before sending them out to the social network.

Did Facebook violate your privacy? Probably. The fact that Facebook has a record of not just what you post, but also what you don’t post, is at the very least disturbing. For Chrome users, the Data Selfie extension can help reveal what other similar things Facebook might know about you.

6. Selection Effects in Online Sharing

When: Two months in 2012

Number of people involved: Over 1 million

What happened: The main purpose of this study was to find out whether broadcasting your intention to buy something will have an effect on your friends’ buying interests.

Facebook offered special deals, like free items, to certain users. If you accepted an offer, it was either auto-shared so all your friends could see it, or you were given a choice in the matter. The second group got a button they could click to decide whether they wanted their offers broadcast.

What Facebook found out: More offers get claimed when everyone in your friends list gets to see them.

Did Facebook violate your privacy? Yes. Auto-sharing is invasive and frankly creepy. The study’s results show that only 23 percent of the users who had the choice decided to share their offers.

7. The Role of Social Networks in Information Diffusion

When: Summer and Fall 2010

Number of people involved: 253 million (half of all Facebook users at the time)

What happened: In order to find out how information spreads on Facebook, researchers randomly assigned 75 million URLs a “share” or “no-share” status. The links included anything from news articles to job offers. Those with the “no-share” status wouldn’t appear in your friends’ news feeds at all. Facebook wanted to know if the censored information would still find a way to the surface.

What Facebook found out: Big surprise: users are more likely to spread the information that they see their friends sharing. Also, according to the study, your distant friends are more likely to expose you to new information than your close friends.

Did Facebook violate your privacy? Definitely. Just imagine how much information was deliberately censored by Facebook during this study. Hopefully it was nothing important. And the fact that Facebook closely tracked what you posted and how it affected your friends seems ethically dubious as well.

8. Social Influence and Political Mobilization

Image Credit: Icemanj via Shutterstock

When: U.S. midterm elections of 2010

Number of people involved: 61 million

What happened: In 2010, just before the midterm elections, Facebook researchers planted an “I voted” button at the top of the users’ news feeds, along with the information about their polling place. You could also see the names of your friends who had clicked the button. The researchers then checked public voting records to see which of the subjects actually voted.

What Facebook found out: Can Facebook encourage people to vote? It appears so. Users were more likely to click the “I Voted” button if they saw their friends’ names next to it. Researchers found that people who got the “I Voted” message in their News Feed were 0.39 percent more likely to have actually voted. That seems like a small percentage, but given the number of people involved in the experiment, it translates to an estimated 340,000 votes that might not otherwise have been cast.

Did Facebook violate your privacy? Maybe not, but it seems highly unethical. This one could have caused an electoral swing if the button had been displayed only to select groups. And none of the users realized that they were part of this experiment or that Facebook would look up their names in voting records.

Important Conclusions

Image Credit: Den Rise via Shutterstock

Can Facebook get away with this? Yes. Facebook doesn’t need you to sign consent forms, as you’ve already agreed to the site’s data policy when you created your account.

A more important question is: what can you do to protect yourself and your privacy? This is something we’ve discussed before, and the options include paying attention to what you see in your feed, using alternative apps, and cutting down your Facebook use. However, this debate is still very much open.

Have you taken part in a Facebook experiment? How would you choose to protect your online presence? Share your thoughts with us in the comments below!

Related topics: Facebook, Online Privacy.




  1. Sarah
    August 24, 2019 at 12:12 am

    I have repeatedly had fb unfriend me from certain people, who have actually gotten mad at me for unfriending them when I did no such thing. A small poll of my friends has found that they have experienced some of the same things. This has been going on for years. Can someone look into this?

  2. Trutherator
    May 9, 2018 at 10:33 am

    Duh. Still debating what to do?

    I deleted myself entirely out of Facebook. It got to be a time sink. For the sake of mutual family, I'd have to rebut posts from my fanatically ultra-leftist-feminist Berniebot sister.

    You know, traumatized because a former male co-worker had punished her for rejecting his advances, she switched parties to Dem because the R's were mean to Judge Thomas accuser. But wait, in Orwellian fashion, Bill Clinton's rapes & Hillary's aggressive cover-up operation were nothing to consider.

  3. Philip Bates
    April 30, 2017 at 12:20 pm

    Fascinating article, Anya. And yes, pretty creepy. But that's what you get when you put so much of your life on a free social network.

    Re. the emotional manipulation tests: while those studies are interesting, the conclusions, I think, are fairly obvious, and I can't really understand why FB would risk their reputation (arguably, anyway) to carry them out. It's only natural that looking at depressing things a lot will make you depressed, right? I think the study in #4 especially isn't extensive enough to actually form proper conclusions anyway - you'd need to factor in what other aspects could've made people post positive updates, and let's face it, few, if any, really have the power to conduct that thorough an experiment.

    Your important conclusions - bang on the money. Makes you wonder what FB could do to put people off... Great piece.