Facebook Prepared to ‘Restrict’ Content Circulation if US Election Descends Into Online Chaos
Published: Sep 23, 2020 03:19 PM

Nick Clegg. File photo: AFP

Facebook will take an array of measures to "restrict the circulation of content" if the November 3 US presidential election devolves into chaos and unrest, according to a Tuesday report by the Financial Times.

Former UK member of parliament Nick Clegg, Facebook's vice president for global affairs and communications, told the Financial Times that risk management is one of the company's utmost concerns related to the upcoming presidential election.

Without getting into explicit details, Clegg noted that Facebook, which also owns the social media platform Instagram, has formulated a variety of plans based on possible unrest or other "political dilemmas," such as the counting of in-person ballots versus mail-in ballots.

Facebook has yet to expound on Clegg's comments.

US President Donald Trump has increased his use of other online platforms - like YouTube - after Twitter moved to flag several of his tweets, including one expressing support for shooting American citizens in response to looting in Minneapolis, Minnesota, earlier this year.

Facebook was criticized for its slowness to act by major outlets such as CNN Business, which ran the headline "Mark Zuckerberg silent as Trump uses Facebook and Instagram to threaten 'looting' will lead to 'shooting.'"

Regarding the upcoming November presidential election, Facebook CEO Mark Zuckerberg said in a September 3 Facebook post that "this election is not going to be business as usual."

"These changes reflect what we've learned from our elections work over the past four years and the conversations we've had with voting rights experts and our civil rights auditors," he clarified.

One of these changes is a policy that prohibits "new political ads in the week before the election."

Citing former Facebook executive Alex Stamos, the New York Times reported in August that Facebook, Twitter and YouTube faced a common risk-management situation in which they would "have to potentially treat the president as a bad actor" if he casts doubt on the legitimacy of the election.

"We don't have experience with that in the United States," added Stamos, who presently serves as director of Stanford University's Internet Observatory. Stamos was Facebook's security chief between 2015 and 2017, during the Cambridge Analytica scandal, in which some 87 million users had their personal information collected and used, most without their knowledge or consent, by companies seeking to help candidates like Trump shape elections in their favor.