Facebook has published new documents describing the company's plans for a content oversight board, which will serve as a kind of Supreme Court for the platform. In a conversation with reporters on Tuesday, Facebook said it hopes to have the board fully staffed by the end of this year, and provided more details on how the board will operate and be governed.
In November of last year, Mark Zuckerberg published a blog post outlining a plan to create what amounts to a Supreme Court for Facebook. Once fully staffed, the body will be responsible for hearing appeals from users whose content has been removed from Facebook's platforms. It will also rule on cases that the company itself refers to it.
Facebook has promised that the oversight body will be operational by November of next year. Today, the company detailed how members will be selected and how they will influence moderation on its platforms.
According to the charter Facebook released Tuesday, the oversight board will begin with at least 11 members and grow to "probably 40 members" when fully staffed. Members will serve three-year terms, capped at nine years in total. The board will be supported by full-time staff who review submissions and conduct research, and members' names and the board's moderation decisions will be published in a public online database.
However, the board may also be divided into smaller "panels," so that individual members can remain anonymous if their safety is a concern in a specific case.
Facebook says it hopes to fill the oversight board with people from a variety of backgrounds. "There will be a set of people who serve on this board that makes different people in that group uncomfortable," Facebook's director of governance and global affairs Brent Harris told reporters. "We believe that when we build and constitute the board so that it really represents a diversity of backgrounds, this institution will actually function."
The charter specifies a number of requirements for oversight board members, similar to those for corporate boards. "Members must not have actual or perceived conflicts of interest that may compromise their independent judgment and decision-making," the document reads. They must also have demonstrated "familiarity with matters of digital content and governance, including free expression, civic discourse, safety, privacy and technology."
Once the board is in place, content cases will be submitted by both Facebook users and the company itself, with the board having the final say on which cases it hears. "In selecting cases, the board will seek to assess issues that have the greatest potential for guiding future decisions and policies," the charter states. To submit an appeal, a user must first have exhausted all appeals within Facebook's existing moderation system; users will then be able to submit written statements arguing their specific moderation cases. Further decisions, such as whether users can testify in person, are left to the future board.
"The board's decisions will be binding, even if I or anyone at Facebook disagrees," Mark Zuckerberg said in a blog post Tuesday. "The board will use our values to inform its decisions and explain its reasoning openly and in a way that protects people's privacy."
Although the board's decisions will be made publicly available, the details of that disclosure are still unclear. Facebook said it has not yet decided how to balance the privacy of users whose content appeals the board hears against the transparency of the board's rulings.