Amid an armed invasion of the Capitol building as Congress was meeting to affirm President-elect Joe Biden's election, social-media platforms including Facebook, Twitter and YouTube were compelled to act against posts from outgoing President Donald Trump.
The posts in question appeared to encourage Trump-aligned protesters and repeated the claim that the presidential election had been "stolen." Facebook banned Trump from posting for 24 hours, while Twitter locked his account for 12 hours, provided he removed certain tweets.
Facebook declared an "emergency situation" and removed one of Trump's videos, as did YouTube. Thursday morning, Facebook banned Trump from the platform indefinitely.
Over the last few years, these platforms have had to keep pace with an explosion of misinformation from many sources, including Trump and his supporters, and have responded in varying ways while trying to preserve the perceived neutrality of their services. Twitter has added disclaimers to some tweets, and in the run-up to an acrimonious presidential election, far-right Trump supporters have seen their content taken down or their accounts banned. Throughout Trump's term, however, Twitter chose not to suspend or revoke his personal account, even though it regularly violated the company's policies.
In regard to the ongoing situation in Washington, D.C., we are working proactively to protect the health of the public conversation occurring on the service and will take action on any content that violates the Twitter Rules.— Twitter Safety (@TwitterSafety) January 6, 2021
These moves mark some of the most stringent responses from the platforms, which have previously been accused of allowing the president too much leeway.
"We are appalled by the violence at the Capitol today ... our Elections Operations Center has already been active in anticipation of the Georgia elections and the vote by Congress to certify the election, and we are monitoring activity on our platform in real time," Guy Rosen, VP of integrity, and Monika Bickert, VP of global policy management for Facebook said in a blog post. "As a part of this, we removed from Facebook and Instagram the recent video of President Trump speaking about the protests and his subsequent post about the election results. We made the decision that on balance these posts contribute to, rather than diminish, the risk of ongoing violence."
"We will start removing any piece of content uploaded today (or anytime after) that misleads people by alleging that widespread fraud or errors changed the outcome of the 2020 US presidential election," YouTube also announced.
Facebook's strong statement didn't stop prominent voices, including Alex Stamos, its former security chief, from laying into the platform for being complicit in allowing the radical flames to spread in the first place.
There have been good arguments for private companies to not silence elected officials, but all those arguments are predicated on the protection of constitutional governance.— Alex Stamos (@alexstamos) January 6, 2021
Twitter and Facebook have to cut him off. There are no legitimate equities left and labeling won't do it.
This story originally appeared on Campaign Asia.