Facebook policing QAnon, Antifa, militia organizations


  • Facebook is cracking down on pages, groups, and Instagram accounts tied to the far-right QAnon conspiracy theory, the social media giant announced on Wednesday.
  • The move is part of the platform’s broader action against “offline anarchist groups that support violent acts amidst protests, US-based militia organizations and QAnon.” That action ranges from limiting Facebook functionality to outright removal and loss of access to Facebook.
  • QAnon supporters have used social media to spread false theories that elites, Democratic Party leaders, and the so-called “Deep State” are all conspiring on a variety of nefarious acts, from pedophilia to mind control. 
  • The conspiracy theory has become particularly popular among President Trump’s supporters. Trump has repeatedly accused social media companies, including Facebook, of censoring him and his supporters.

Facebook is taking broad action on a variety of groups that it describes as posing “significant risks to public safety,” the company announced on Wednesday.

The list includes pages and groups tied to the far-right conspiracy theory QAnon, US-based militias, and “offline anarchist groups that support violent acts amidst protests” — the latter two groups, Facebook said, include “some who may identify as Antifa.” 

In total, just shy of 2,000 groups and over 500 pages were removed, the company said, with more action to come.

“Under this policy expansion, we will impose restrictions to limit the spread of content from Facebook Pages, Groups and Instagram accounts,” the blog post said. “We will also remove Pages, Groups and Instagram accounts where we identify discussions of potential violence, including when they use veiled language and symbols particular to the movement to do so.”

With well over 2 billion users, Facebook is by far the largest social network in existence. But as the service continues to grow, the company that runs it has struggled, or outright refused, to moderate content.

A Wall Street Journal report from May revealed that executives, including CEO Mark Zuckerberg, declined to moderate the service even when presented with evidence that its algorithms “exploit the human brain’s attraction to divisiveness.”

In one recent example, a study found that Facebook users who interacted with Holocaust denial content were subsequently suggested additional Holocaust denial content. The report found similar Holocaust denial content available through Reddit, Twitter, and YouTube — but Facebook was the only service that actively surfaced additional related content. 

As part of the action announced on Wednesday, Facebook will remove “Pages, Groups and Instagram accounts associated with these movements” from that suggestion algorithm altogether. The company also pledged to monitor future movements, and to take further action when necessary.

Got a tip? Contact Business Insider senior correspondent Ben Gilbert via email (bgilbert@businessinsider.com), or Twitter DM (@realbengilbert). We can keep sources anonymous. Use a non-work device to reach out. PR pitches by email only, please.
