Facebook has removed a video of Donald Trump discussing the riots in Washington, DC, in what it said was an “emergency measure”.
The video was posted hours after the riots began and showed Mr Trump, seemingly standing outside the White House, urging those involved to go home.
But he also continued to promote false information about the election, told rioters that he loved them, and called them “very special”.
Facebook’s vice president for integrity, Guy Rosen, said that it had removed the video because it was likely to lead to more violence.
“This is an emergency situation and we are taking appropriate emergency measures, including removing President Trump’s video,” he posted on Twitter. “We removed it because on balance we believe it contributes to rather than diminishes the risk of ongoing violence.”
While the decision to remove the video was met with encouragement, the predominant response to Mr Rosen’s tweet was to ask why the account had not been removed entirely. Mr Trump’s account otherwise appears as normal, and includes a number of the posts that have been accused of helping to incite the violence that unfolded over Wednesday.
Users also noted that accounts pushing the “Stop the Steal” conspiracy theory – which rioters appeared to be acting on – were also easy to find across Facebook.
Facebook’s move came as Twitter faced sustained criticism for allowing the video – and other inflammatory posts from Mr Trump’s account over the course of the day – to stay live on the site.
Twitter did say that it was “exploring other escalated enforcement actions”, but its actions had at the time of publication only included labels and limiting the reach of posts.
While it took the unprecedented step of limiting all interactions with the tweet, users were still able to see it. And despite claims that the post could not be retweeted, Mr Trump’s official POTUS account was able to re-share the video, and users found that it could be re-posted in a quote tweet.
“This claim of election fraud is disputed, and this Tweet can’t be replied to, Retweeted, or liked due to a risk of violence,” Twitter’s message under the post read.
A tweet judged to carry a “risk of violence” would usually be removed from the site and would ordinarily bring punishment for the person who posted it, such as a temporary ban.
“You may not threaten violence against an individual or a group of people,” Twitter’s terms state. “We also prohibit the glorification of violence.”
Twitter wrote in a series of posts from its “Safety” account that other threats of violence would be dealt with as usual.
“In regard to the ongoing situation in Washington, D.C., we are working proactively to protect the health of the public conversation occurring on the service and will take action on any content that violates the Twitter Rules,” it wrote.
“Threats of and calls to violence are against the Twitter Rules, and we are enforcing our policies accordingly.
“In addition, we have been significantly restricting engagement with Tweets labeled under our Civic Integrity Policy due to the risk of violence. This means these labeled Tweets will not be able to be replied to, Retweeted, or liked.
“We are also exploring other escalated enforcement actions and will keep the public updated with any significant developments.”