To make money, Facebook doesn’t just need users. It needs users who are active and engaged. It needs to know not just which link you’re likely to click, but also what makes you more or less likely to click it.
How does Facebook gather that kind of information?
By looking at your daily Facebook activity, for one. Analyzing the posts and pages that you like. And by running psychological experiments.
Where the Wild Things Are
Yes, Facebook has been conducting social experiments on its users. And yes, chances are you’ve involuntarily taken part at some point.
Is there a way to know for sure? Not really. But we’ve put together a list of Facebook experiments on users that are now known to the public. Have a look through these and see if anything rings a bell.
When: January 2012
Number of people involved: 689,003
What happened: Facebook data scientists manipulated the news feeds of almost 690,000 users, showing some of them more positive updates and others more negative ones. All to see how it affected the users’ moods.
If there was a week in January 2012 when you were only seeing dead kittens or cute puppies in your feed, you might have been part of the study. The real mood swing, however, happened when the experiment became public.
The study was described by the public as “disturbing,” mainly because of the ethics of the experiment. After all, it involved hundreds of thousands of users unknowingly participating in a study that may have made them either happier or more depressed than usual.
What Facebook found out: Our emotions can indeed be affected by what we’re exposed to on Facebook.
Did Facebook violate your privacy? Many people say that it did. And we’re inclined to agree. Even if this type of manipulation can’t be classified as a privacy violation, it definitely seems unethical.
Number of people involved: 29 million
What happened: In this study, Facebook was trying to find out whether ads work better on you when your friends endorse them. They showed users two different types of ads — with and without endorsements like “Peter Parker liked this” — and then measured how many clicks each got.
What Facebook found out: The stronger your bond with a friend is, the more likely you are to click the link.
Did Facebook violate your privacy? No. This is the kind of study you’d expect Facebook to conduct to improve its marketing strategies.
When: Summer 2012
Number of people involved: 20,000
What happened: Facebook researchers singled out status updates with requests in them, like “Can someone recommend a movie for tonight?” or “I need a ride to work tomorrow.” They were interested in who regularly asked for help rather than in whether those users actually got it.
What Facebook found out: Users who have a lot of friends on Facebook but visit the network less often are more likely to ask for help.
Did Facebook violate your privacy? No. The updates the researchers analyzed were public ones, so there was no real invasion of privacy.
When: Some time before 2012 (when it went public)
Number of people involved: 151 million
What happened: Facebook was trying to find out if your emotional state affects your friends. They looked at one million users’ status updates, both positive and negative, and then looked at the positivity or negativity of the posts of those users’ 150 million friends.
What Facebook found out: Over the three days of the study, the researchers found that the friends of users with positive updates suppressed their negative posts, and vice versa. According to the study, if you post something positive on Facebook, one out of every 100 of your friends who wouldn’t otherwise have done so will post something positive within three days.
Did Facebook violate your privacy? Could go either way. This study is believed to have led to the big emotion manipulation experiment mentioned earlier in the article.
When: July 2012
Number of people involved: Almost 4 million
What happened: Facebook tracked every entry of more than five characters that didn’t get posted within 10 minutes.
What Facebook found out: 71 percent of the users “self-censored,” drafting comments that they never posted. Many others edited their posts before sending them out to the social network.
Did Facebook violate your privacy? Probably. The fact that Facebook has a record of not just what you post, but also what you don’t post, is at the very least disturbing. For Chrome users, Data Selfie can help find out what other similar things Facebook might know about you.
When: Two months in 2012
Number of people involved: Over 1 million
What happened: The main purpose of this study was to find out whether broadcasting your intention to buy something will have an effect on your friends’ buying interests.
Facebook offered special deals, like free items, to certain users. If you accepted an offer, it was either auto-shared so all your friends could see it, or you were given a choice in the matter. The second group got a button they could click to decide whether they wanted their offers broadcast.
What Facebook found out: More offers get claimed when everyone in your friends list gets to see them.
Did Facebook violate your privacy? Yes. Auto-sharing is invasive and frankly creepy. The study’s results show that only 23 percent of the users who had the choice decided to share their offers.
When: Summer and Fall 2010
Number of people involved: 253 million (half of all Facebook users at the time)
What happened: In order to find out how information spreads on Facebook, researchers randomly assigned 75 million URLs a “share” or “no-share” status. The links included anything from news articles to job offers. Those with the “no-share” status wouldn’t appear in your friends’ news feeds at all. Facebook wanted to know if the censored information would still find a way to the surface.
What Facebook found out: Big surprise: users are more likely to spread the information that they see their friends sharing. Also, according to the study, your distant friends are more likely to expose you to new information than your close friends.
Did Facebook violate your privacy? Definitely. Just imagine how much information Facebook deliberately censored during this study. Hopefully it was nothing important. And the fact that they very closely tracked and monitored what you posted and how it affected your friends seems dubiously ethical as well.
When: U.S. midterm elections of 2010
Number of people involved: 61 million
What happened: In 2010, just before the midterm elections, Facebook researchers planted an “I voted” button at the top of the users’ news feeds, along with the information about their polling place. You could also see the names of your friends who had clicked the button. The researchers then checked public voting records to see which of the subjects actually voted.
What Facebook found out: Can Facebook encourage people to vote? It appears so. Users were more likely to click the “I Voted” button if they saw their friends’ names next to it. Researchers found that people who got the “I Voted” message in their news feeds were 0.39 percent more likely to have actually voted. That may seem like a small percentage, but given the number of people involved in the experiment, it translates to 340,000 possible votes that may not have otherwise happened.
Did Facebook violate your privacy? Maybe not, but it seems highly unethical. This one could potentially have caused an electoral swing had the button been displayed only to select groups. And none of the users realized they were part of the experiment, or that Facebook would look up their names in voting records.
Can Facebook get away with this? Yes. Facebook doesn’t need you to sign consent forms, as you’ve already agreed to the site’s data policy when you created your account.
A more important question is: what can you do to protect yourself and your privacy? This is something we’ve discussed before, and the options include paying attention to what you see in your feed, using alternative apps, and cutting back on your Facebook use. However, this debate is still very much open.
Have you taken part in a Facebook experiment? How would you choose to protect your online presence? Share your thoughts with us in the comments below!