Facebook’s Oversight Board decided on Wednesday to uphold Facebook’s Jan. 7 ban of former President Donald Trump’s account, but ruled that Facebook must revisit the penalty within six months and reach a final decision.
The Oversight Board concluded that Trump’s posts “created an environment where a serious risk of violence was possible,” but found that Facebook was wrong to “impose the indeterminate and standardless penalty of indefinite suspension,” a sanction that departs from its normal penalties.
The board also directed Facebook, in its upcoming review of the matter, “to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform.”
Facebook CEO Mark Zuckerberg had announced Trump’s “indefinite” suspension from both Facebook and Instagram on Jan. 7, one day after violence broke out during the storming of the U.S. Capitol.
Zuckerberg alleged that Trump was using the platform “to condone rather than condemn the actions of his supporters at the Capitol.” However, the posts and videos Trump published, which Facebook flagged and removed, included calls for everyone at the Capitol to “go home” and remain peaceful.
“We have to have peace, so go home. We love you, you’re very special,” Trump said in one removed video. “You’ve seen what happens, you see the way others are treated that are so bad and so evil. I know how you feel, but go home and go home in peace.”
The Oversight Board cited that quote, claiming it and similar language Trump used in other posts “violated Facebook’s rules prohibiting praise or support of people engaged in violence.”
The full statement from Facebook’s Oversight Board is below:
The Board has upheld Facebook’s decision on January 7, 2021, to restrict then-President Donald Trump’s access to posting content on his Facebook page and Instagram account.
However, it was not appropriate for Facebook to impose the indeterminate and standardless penalty of indefinite suspension. Facebook’s normal penalties include removing the violating content, imposing a time-bound period of suspension, or permanently disabling the page and account.
The Board insists that Facebook review this matter to determine and justify a proportionate response that is consistent with the rules that are applied to other users of its platform. Facebook must complete its review of this matter within six months of the date of this decision. The Board also made policy recommendations for Facebook to implement in developing clear, necessary, and proportionate policies that promote public safety and respect freedom of expression.
About the case
Elections are a crucial part of democracy. On January 6, 2021, during the counting of the 2020 electoral votes, a mob forcibly entered the Capitol Building in Washington, D.C. This violence threatened the constitutional process. Five people died and many more were injured during the violence. During these events, then-President Donald Trump posted two pieces of content.
At 4:21 pm Eastern Standard Time, as the riot continued, Mr. Trump posted a video on Facebook and Instagram:
I know your pain. I know you’re hurt. We had an election that was stolen from us. It was a landslide election, and everyone knows it, especially the other side, but you have to go home now. We have to have peace. We have to have law and order. We have to respect our great people in law and order. We don’t want anybody hurt. It’s a very tough period of time. There’s never been a time like this where such a thing happened, where they could take it away from all of us, from me, from you, from our country. This was a fraudulent election, but we can’t play into the hands of these people. We have to have peace. So go home. We love you. You’re very special. You’ve seen what happens. You see the way others are treated that are so bad and so evil. I know how you feel. But go home and go home in peace.
At 5:41 pm Eastern Standard Time, Facebook removed this post for violating its Community Standard on Dangerous Individuals and Organizations.
At 6:07 pm Eastern Standard Time, as police were securing the Capitol, Mr. Trump posted a written statement on Facebook:
These are the things and events that happen when a sacred landslide election victory is so unceremoniously viciously stripped away from great patriots who have been badly unfairly treated for so long. Go home with love in peace. Remember this day forever!
At 6:15 pm Eastern Standard Time, Facebook removed this post for violating its Community Standard on Dangerous Individuals and Organizations. It also blocked Mr. Trump from posting on Facebook or Instagram for 24 hours.
On January 7, after further reviewing Mr. Trump’s posts, his recent communications off Facebook, and additional information about the severity of the violence at the Capitol, Facebook extended the block “indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”
On January 20, with the inauguration of President Joe Biden, Mr. Trump ceased to be the president of the United States.
On January 21, Facebook announced it had referred this case to the Board. Facebook asked whether it correctly decided on January 7 to prohibit Mr. Trump’s access to posting content on Facebook and Instagram for an indefinite amount of time. The company also requested recommendations about suspensions when the user is a political leader.
In addition to the two posts on January 6, Facebook previously found five violations of its Community Standards in organic content posted on the Donald J. Trump Facebook page, three of which were within the last year. While the five violating posts were removed, no account-level sanctions were applied.
Key findings
The Board found that the two posts by Mr. Trump on January 6 severely violated Facebook’s Community Standards and Instagram’s Community Guidelines. “We love you. You’re very special” in the first post and “great patriots” and “remember this day forever” in the second post violated Facebook’s rules prohibiting praise or support of people engaged in violence.
The Board found that, in maintaining an unfounded narrative of electoral fraud and persistent calls to action, Mr. Trump created an environment where a serious risk of violence was possible. At the time of Mr. Trump’s posts, there was a clear, immediate risk of harm and his words of support for those involved in the riots legitimized their violent actions. As president, Mr. Trump had a high level of influence. The reach of his posts was large, with 35 million followers on Facebook and 24 million on Instagram.
Given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in suspending Mr. Trump’s accounts on January 6 and extending that suspension on January 7.
However, it was not appropriate for Facebook to impose an ‘indefinite’ suspension.
It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.
In applying this penalty, Facebook did not follow a clear, published procedure. ‘Indefinite’ suspensions are not described in the company’s content policies. Facebook’s normal penalties include removing the violating content, imposing a time-bound period of suspension, or permanently disabling the page and account.
It is Facebook’s role to create necessary and proportionate penalties that respond to severe violations of its content policies. The Board’s role is to ensure that Facebook’s rules and processes are consistent with its content policies, its values and its human rights commitments.
In applying a vague, standardless penalty and then referring this case to the Board to resolve, Facebook seeks to avoid its responsibilities. The Board declines Facebook’s request and insists that Facebook apply and justify a defined penalty.
The Oversight Board’s decision
The Oversight Board has upheld Facebook’s decision to suspend Mr. Trump’s access to post content on Facebook and Instagram on January 7, 2021. However, as Facebook suspended Mr. Trump’s accounts ‘indefinitely,’ the company must reassess this penalty.
Within six months of this decision, Facebook must reexamine the arbitrary penalty it imposed on January 7 and decide the appropriate penalty. This penalty must be based on the gravity of the violation and the prospect of future harm. It must also be consistent with Facebook’s rules for severe violations, which must, in turn, be clear, necessary and proportionate.
If Facebook decides to restore Mr. Trump’s accounts, the company should apply its rules to that decision, including any changes made in response to the Board’s policy recommendations below. In this scenario, Facebook must address any further violations promptly and in accordance with its established content policies.
A minority of the Board emphasized that Facebook should take steps to prevent the repetition of adverse human rights impacts and ensure that users who seek reinstatement after suspension recognize their wrongdoing and commit to observing the rules in the future.
When it referred this case to the Board, Facebook specifically requested “observations or recommendations from the Board about suspensions when the user is a political leader.”
In a policy advisory statement, the Board made a number of recommendations to guide Facebook’s policies in regard to serious risks of harm posed by political leaders and other influential figures.
The Board stated that it is not always useful to draw a firm distinction between political leaders and other influential users, recognizing that other users with large audiences can also contribute to serious risks of harm.
While the same rules should apply to all users, context matters when assessing the probability and imminence of harm. When posts by influential users pose a high probability of imminent harm, Facebook should act quickly to enforce its rules. Although Facebook explained that it did not apply its ‘newsworthiness’ allowance in this case, the Board called on Facebook to address widespread confusion about how decisions relating to influential users are made. The Board stressed that considerations of newsworthiness should not take priority when urgent action is needed to prevent significant harm.
Facebook should publicly explain the rules that it uses when it imposes account-level sanctions against influential users. These rules should ensure that when Facebook imposes a time-limited suspension on the account of an influential user to reduce the risk of significant harm, it will assess whether the risk has receded before the suspension ends. If Facebook identifies that the user poses a serious risk of inciting imminent violence, discrimination or other lawless action at that time, another time-bound suspension should be imposed when such measures are necessary to protect public safety and proportionate to the risk.
The Board noted that heads of state and other high officials of government can have a greater power to cause harm than other people. If a head of state or high government official has repeatedly posted messages that pose a risk of harm under international human rights norms, Facebook should suspend the account for a period sufficient to protect against imminent harm. Suspension periods should be long enough to deter misconduct and may, in appropriate cases, include account or page deletion.
In other recommendations, the Board proposed that Facebook:
Rapidly escalate content containing political speech from highly influential users to specialized staff who are familiar with the linguistic and political context. These staff should be insulated from political and economic interference, as well as undue influence.
Dedicate adequate resourcing and expertise to assess risks of harm from influential accounts globally.
Produce more information to help users understand and evaluate the process and criteria for applying the newsworthiness allowance, including how it applies to influential accounts. The company should also clearly explain the rationale, standards and processes of the cross check review, and report on the relative error rates of determinations made through cross check compared with ordinary enforcement procedures.
Undertake a comprehensive review of Facebook’s potential contribution to the narrative of electoral fraud and the exacerbated tensions that culminated in the violence in the United States on January 6. This should be an open reflection on the design and policy choices that Facebook has made that may allow its platform to be abused.
Make clear in its corporate human rights policy how it collects, preserves and, where appropriate, shares information to assist in investigation and potential prosecution of grave violations of international criminal, human rights and humanitarian law.
Explain its strikes and penalties process for restricting profiles, pages, groups and accounts in Facebook’s Community Standards and Instagram’s Community Guidelines.
Include the number of profile, page, and account restrictions in its transparency reporting, with information broken down by region and country.
Provide users with accessible information on how many violations, strikes and penalties have been assessed against them, and the consequences that will follow future violations.
Develop and publish a policy that governs Facebook’s response to crises or novel situations where its regular processes would not prevent or avoid imminent harm. This guidance should set appropriate parameters for such actions, including a requirement to review its decision within a fixed time.