How Can Social Networks Get Rid of Misinformation and Harassment?

Since the 2016 US presidential election, mainstream media has looked closely at the role of social media in politics. Fake news stories were rampant on Facebook around election time. Members of minority groups faced serious harassment on Twitter throughout the year, and the problem is still getting worse. And while we generally consider the internet to be a bastion of free speech, the people running those social network services decided that it was time to take action.

In the wake of the election, both platforms have taken some controversial steps in an effort to better monitor and curate content. What have they done? Will they stick by their new standards? And will it help make the internet a better place?

I decided to ask some people who have done a lot of thinking about these sorts of issues to see what they think.

What Are Facebook and Twitter Doing?

During the election cycle, users circulated a lot of fake news stories on Facebook. Post-election, commentators have been discussing whether those stories may have influenced the results. Of course, there are people vehemently arguing on both sides. But the result of this battle is that Facebook has committed to identifying fake news posts and taking action on them.

We haven’t yet heard the details of what Facebook is planning to do. It’s probable that a new algorithm will take a number of factors into account, including user reports and information on fact-checking sites. We don’t yet know how it will deal with degrees of truth, bias, sensationalism, and other difficult questions. Some conservative news sites are nervous that Facebook will censor their views, at least in part because Facebook has a left-leaning history. It’s a complicated undertaking, and minimizing bias is extremely difficult.
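To make that idea a bit more concrete, here is a minimal, purely hypothetical sketch of how a platform might blend user reports with fact-checker verdicts into a single review score. Nothing below comes from Facebook; every name, weight, and threshold is an assumption made only for illustration.

```python
# Hypothetical illustration only -- not Facebook's actual system.
# One simple way a platform might combine user reports with fact-checker
# verdicts to decide whether a story deserves human review.

from dataclasses import dataclass


@dataclass
class StorySignals:
    user_reports: int        # how many users flagged the post as false
    total_shares: int        # how widely the post has spread
    fact_check_verdict: str  # "false", "mixed", "true", or "unchecked"


def fake_news_score(signals: StorySignals) -> float:
    """Return a 0-1 score; higher means more likely to be flagged for review."""
    # Normalize report volume against reach so popular posts aren't
    # penalized simply for being widely shared.
    report_rate = signals.user_reports / max(signals.total_shares, 1)

    verdict_weight = {
        "false": 1.0,
        "mixed": 0.5,
        "unchecked": 0.2,
        "true": 0.0,
    }.get(signals.fact_check_verdict, 0.2)

    # Weighted blend of the two signals; the weights here are arbitrary.
    return min(1.0, 0.6 * verdict_weight + 0.4 * min(report_rate * 10, 1.0))


# Example: a widely shared story that fact-checkers rated "false".
score = fake_news_score(StorySignals(user_reports=300, total_shares=20000,
                                     fact_check_verdict="false"))
print(f"review score: {score:.2f}")  # -> review score: 0.66
```

A real system would of course weigh many more signals (source reputation, account history, propagation patterns) and keep humans in the loop before anything is demoted or removed.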

Twitter, on the other hand, has recently been banning accounts that it says violate its terms of use. The platform’s user agreement prohibits harassment and hateful conduct, but Twitter doesn’t have a great reputation when it comes to actually preventing harassment. It has now gone on a banning spree, getting rid of accounts that belong to prominent members of the “alt-right” movement, including Richard Spencer and the organizations he runs, Pax Dickinson, Ricky Vaughn, and John Rivers.

After announcing that it would provide better counter-abuse tools and take complaints more seriously, Twitter seems to be stepping up its game. But the platform has focused its ire on accounts related to a specific political movement. Because of this, many people think it looks like a political move rather than a counter-abuse one. It’s tough to disentangle motivations here.

Is This New?

When I asked Aaron Smith, Associate Director of Research at the Pew Research Center, about these developments, he emphasized that these problems have a long history: “[T]he need to police ‘negative’ content of various kinds (whether that’s fake news, abuse, trolling, spam, or what have you) is something that online platforms and their owners/moderators have struggled with since the dawn of the internet.”

Image Credit: Photographee.eu via Shutterstock

While the media has been giving this issue a lot of attention lately, he told me, it’s been around in various forms forever. He also pointed me to a great article from 2011 called “If Your Website’s Full of Assholes, It’s Your Fault” that sums up a lot of the issues prevalent in the discussion.

Just because this has been going on forever doesn’t mean that we haven’t gotten better at dealing with it. Twitter, for example, was very hands-off for a long time. But it recently banned over 100,000 accounts related to ISIS. And the number of people that the service has banned for harassment does seem to have risen before this recent run. Some people, though, argue that the platform has used its powers to disproportionately get rid of right-leaning tweeters.

Facebook’s previous crackdowns have targeted copyright violations, fake-name accounts, marijuana dispensary pages, and “overly promotional” posts. Hate-speech groups have previously faced consequences, but from law enforcement, not from Facebook. And Zuckerberg’s platform seems content to maintain a more open atmosphere.

One of Twitter’s most notable bans, for example, was of alt-right commentator Milo Yiannopoulos. Twitter determined that his views on feminism, Islam, and other issues incited “the targeted abuse or harassment of others.” He still has a page on Facebook, as does Richard Spencer’s National Policy Institute. Despite many people viewing these two personalities as hateful, Facebook is happy to continue hosting them.

Why Now?

It seems likely that both platforms have chosen to take these actions as a response to the recent election. Why they didn’t feel obligated to do it before, however, is less clear. To be fair, Facebook has taken some action in the past. For example, after damning revelations emerged in 2016 that editors had artificially influenced political news to suppress right-wing viewpoints, Facebook moved to an algorithmic trending section. But that allowed fake news to propagate even further.

BuzzFeed’s Craig Silverman found that engagement with fake news stories actually surpassed engagement with real stories in the run-up to the US presidential election. Could those stories have influenced election results? It certainly seems possible. In the wake of an election that has much of the mainstream media contemplating its behavior and future, Facebook appears to be thinking carefully about its responsibilities. (And, if you’re cynical, probably its political motivations.)

Twitter has always had a tumultuous relationship with its own championing of free speech. But I think it finally got fed up with people saying that it’s a toxic environment. The platform sees a huge amount of harassment, and people have been talking about it for a long time.

Many users have left Twitter as a result of the harassment they’ve received. And reports of harassment have only increased since the election. That, combined with reported revenue problems, likely has Twitter worried. Taking action to clean up the network could help increase its user base. And, consequently, revenue.

Of course, there’s always the possibility that they just decided one day that they want to be upstanding citizens of the internet.

Not Just Fake News

As soon as I heard about these crackdowns, I started wondering. Will they work? Will these new policies make the internet a better place? Could they help people get better information online and reduce harassment? I thought I’d ask some people who have given the topic a lot of thought.

First, I got in touch with Rick Webb, the author of the fantastic article “I’m Sorry Mr. Zuckerberg, But You Are Wrong”. In this piece, Webb disagrees with Zuckerberg’s assertion that fake news spread on Facebook didn’t influence the outcome of the election. I asked Webb if he thought the crackdown on fake news would have an effect, and he did:

Those two platforms especially are very well suited to the propagation of fake news in a way other platforms aren’t, and it would be much harder for fake news to spread if they enthusiastically cracked down. As to the larger effect that had on our society, it would be hard to feel, but over time I think it would have a bit of an effect, yes. — Rick Webb

He was quick to point out, however, that it’s not just fake news that’s causing problems. It’s also that Facebook “exacerbates the trends in news to writing hyped articles to attract traffic.” Publishers are rewarded for getting clicks and shares — two things that clickbait is great at generating. “People shared fake news before Facebook and they’ll continue to after,” he says, “but Facebook spreads it far beyond its past quarters: to users who aren’t already in a conspiracy mindset.”

Facebook also legitimizes fake, skewed, and over-hyped news with its brand, Webb says. Associating the Facebook brand with a news story gives it added credibility. This applies to sensationalized, hyper-partisan, and false news as much as it does to quality journalism. Facebook would do well to keep these issues in mind in its efforts to stamp out misinformation.

A Narrow Focus

I also got in touch with Sophie Bjork-James, a post-doctoral fellow in the Department of Anthropology at Vanderbilt University. She’s studied white nationalist movements, race relations, and conservative evangelical political life. When I asked her about the recent policy changes at Facebook and Twitter, she brought up an interesting point: for content policing to work, the platforms have to have the right goals in mind.

Despite the fact that the racist right represents a larger presence on social media than ISIS, far more attention has been given to limiting social media use by ISIS. This is a problem given that the racist right is linked to more fatal attacks within the US than Islamic terrorism. — Sophie Bjork-James

If social platforms are going to make a positive difference by policing their content, they’re going to have to do it in a principled way. Addressing the villain du jour might not be enough. But then there are questions of interpretation. What constitutes hate speech? What should be protected by free speech rights? These are very difficult questions, and anyone’s answers may depend on their political leanings.

Still, Bjork-James does think that these efforts could have positive effects. She gave the hypothetical example of Twitter banning anti-Semitic accounts resulting in less harassment directed towards Jewish journalists, a trend that’s becoming increasingly prevalent. Even if the people who get banned head to other social networks, as many alt-right commentators have moved to Gab, Twitter would become a better social space for everyone.

The people that I talked to all seemed to agree that addressing hate speech was a good thing, and would likely have a positive effect. But they also thought that these policies aren’t enough to solve the problems that underlie hate speech. Racism, sexism, xenophobia, and other discriminatory mindsets are deep-seated, and require a great deal of cultural force to address.

But if these three experts are hopeful that Facebook and Twitter’s changes could make a difference, that’s cause to be optimistic.

Looking Forward

Facebook and Twitter haven’t started policing their content in earnest yet. Facebook is working on algorithms to identify fake news stories. Twitter has started banning some accounts and has deployed better tools for reporting abuse. We’ll see if their plans will be effective. It will likely come down to just how zealous they are in pursuing these goals.

I agree with Webb when he says that he’s “skeptical that their efforts will be [super] enthusiastic.” With the reputation that both sites have of being rather hands-off, it’s hard to imagine them suddenly changing gears and doing everything they can to get abuse and misleading information off of their sites. On the other hand, much of Silicon Valley is upset about the recent presidential election, and political dissatisfaction is great for galvanizing change.


But the particulars of any sort of campaign like this are always going to be difficult. Who decides which accounts to ban? How does the service control for bias? Who’s checking for abuse of the system? What constitutes misinformation? (This is a particularly difficult question.) How many people are Facebook and Twitter willing to employ and pay to police content? Running an effective anti-misinformation or anti-harassment program is time-consuming and expensive.

Personally, I’m cautiously optimistic. Cleaning up the internet is never an easy or especially transparent process, but I think it’s an endeavor worth undertaking.

What do you think? Should social networks make an effort to police the content being shared on their platforms? Or does free speech trump all? I’m conflicted myself, and I’d love to hear from you. Let’s hash it out in the comments.

Image Credits: Ollyy/Shutterstock


  1. Blue Anvil
    December 12, 2016 at 5:05 pm

    Fake News is flawed liberal logic. It's a ploy to cover-up much of what they're doing.
    Only 6% of the people trust mainstream media. MSM are liars. There's your fake news! LOL
    Sorry to burst bubbles, but pizzagate is NOT fake news. Makes you sick to your stomach what some elites thinks is acceptable just b/c "they're elites".
    Once you have a good idea on what the libs goals are, its really very easy to extract what's fake and what they WANT you to think is fake.
    PizzaGate=REAL...Not sure exactly what all is going on, but something IS going on. Benghazi was "fake news" when Hillary tried to say it was the result of a youtube video. She's the QUEEN of fake news.

    • Dann Albright
      December 14, 2016 at 8:12 pm

      Where'd you get the 6% figure? I've never heard that before.

  2. Christian Cawley
    December 12, 2016 at 4:55 pm

    As long as the legacy media continues to push lies as genuine news, social networks will be unable to filter out "fake news", IMHO.

    • Dann Albright
      December 14, 2016 at 8:12 pm

      That's a good point. And I don't know that there's an answer to that. Legacy media makes a lot of money sensationalizing and reporting things long before anyone has any idea of what actually happened. Bad reporting and fake news are two different things, but there's definitely a lot of overlap there, and disentangling them (especially algorithmically) is going to be especially difficult. If not impossible.

  3. Krishna
    December 12, 2016 at 3:39 pm

    Bad idea for social networks to get rid of "fake" news. Who gets to determine that? What about the "fake" polls that said Hillary was winning? What about the "fake" debates where Hillary was given at least two questions ahead of time. It is dangerous to setup a person or body to determine what is fake and what is real. This is how dictatorships start. Let the people use their own brains and figure it out.

    • Dann Albright
      December 14, 2016 at 8:10 pm

      "Who gets to determine that?" is exactly the issue people are dealing with right now. It's not an easy question, and there's no simple answer. As for Facebook, we'll just have to see. I believe they're looking into using AI to do it, but that comes with a bunch of its own issues.

      • Krishna
        December 20, 2016 at 8:50 pm

        The legacy media (fox, abc, cnn, nbc, and cbs) have reported plenty of fake news and even irresponsible news. If we start policing fake news, who gets to police them? In fact, with the internet, we get to democratize news instead of handing it over to a small group of news agencies that are controlled by billionaires. Yes, there are some irresponsible people posting ridiculous things on the internet, but there are also a lot of responsible people challenging and correcting them. There is a lot of self correction on the internet. With the legacy media, there really isn't any form of self correction. Historically, the average person really didn't have a way to challenge the legacy media. They didn't have the resources or platform to do so. But today, if the media tells us we need to go impose a no fly zone in Syria, we can actually hear the other side. We actually hear that the so called "rebels" are ALSO committing mass murder in Syria. And if we didn't have the internet, we would never hear the other side's point of view. And probably would be engaged in another ridiculous middle eastern war that we should never be part of.

        • Dann Albright
          December 28, 2016 at 7:22 pm

          That's a very good point; the internet does allow us access to many different kinds of media organizations. And even legacy media has to answer to its readers and viewers online. Its strength on TV and print is still important, but we're given the chance to communicate with them online. To answer your question about who will be policing them, I think they'll probably continue on as they always have. They'll continue reporting, sometimes get things wrong, and sometimes correct themselves. How efforts to curb fake and highly biased news will affect that will certainly be interesting to watch!

  4. Heimen Stoffels
    December 12, 2016 at 10:22 am

    I'm actually more interested in what they will take into account when analyzing a story. For example, here in the Netherlands we have a fake news website called De Speld. But they write their articles in such a way that everyone knows they're doing it just for fun. But technically it's still fake, so will they ban that as well from Facebook and Twitter?
    Also, what to do about controversial news? I mean: for example, Dutch news network RTL4 recently dived into the national archives and found some hidden money deal established a long time ago b/w our government and the king/queen. That deal exists because the papers in the archives are real. But the king and government deny everything. So who are Facebook and Twitter going to believe (in the future): a news network with proof or the very influential king+government?

    • Dann Albright
      December 14, 2016 at 8:10 pm

      That's an interesting situation you've described. My guess is that news stories that are controversial or have multiple sides won't be the ones persecuted here. If a story said "King and Queen unquestionably prove that hidden money deal is false!", though, that might get flagged. (If, of course, that's just not true.) That's why this is such a difficult issue; there are degrees of truth, and differences of opinion, and it's going to be really hard to take all of that into account.

  5. Howard A Pearce
    December 11, 2016 at 11:37 am

    "Facebook has committed to identifying fake news posts and taking action on them"

    I wish one of these articles I see online would simply define "fake" news for us and give us a summary of how to identify it and determine it, rather than pushing a concept on its readers by assuming the concept is valid - like MUD and other so-called "technical" sites have done

    Will this new method mean I will no longer have to make a decision myself? And that the geniuses at Facebook will determine for me what is fake or not so that I can forget about critical thinking?

    The simple fact is that what is fake or not is an individual decision for the most part, AND a matter of the sources the person him/herself CHOOSES to rely on - not a decision made by some authoritarian government entity or the propagandists at Facebook looking to push a view of what is Fake

    Hopefully competition for a free and open discussion will come about and help take down authoritarian sites like Facebook

    AND Facebook MUST remove their claims of supporting freedom of speech if they go this way.

    Something I bet the hypocrites there will NEVER do

    I bet the idiots of Facebook support the idea of a FakeNews Communication Commission (FCC)

    • Dann Albright
      December 14, 2016 at 8:08 pm

      Defining "fake" is a really difficult issue, which is why few people are trying to throw out a quick definition. That's why this is generating so much discussion; because defining "fake news" and "misinformation" is extremely difficult. As for what's fake being an individual decision . . . I have to disagree. The veracity of the article saying that the Pope had endorsed Donald Trump doesn't really depend on how you feel about the Pope or Trump. It's just wrong. And I think those are the sorts of things that people are going to try to police a little more closely.

  6. Mike Tanis
    December 10, 2016 at 1:04 am

    The cure is going to be much worse than the disease. It's human nature to go farther and faster than necessary. And the enforcers of these policies will be earnest and resolved to make it appear they are earning their big salaries. So they will DO SOMETHING just to make a good appearance. It will eventually spin out of control until the companies become laughingstocks and smart CEOs realize they went too far.

    • Dann Albright
      December 14, 2016 at 8:06 pm

      Do you think their users will place a check on that inclination?

  7. BlueCritter
    December 10, 2016 at 12:49 am

    I deleted my Facebook account. Since fake news is more prevalent than real information, I now think participation in Facebook is a net negative.

    • Dann Albright
      December 14, 2016 at 8:05 pm

      I'd say it largely depends on what you use Facebook for. I don't tend to pay much attention to news (other than mountain bike news) on Facebook. I use it mostly for communicating with people that I don't text or email. But I definitely understand what you're saying. It's not always a great environment.