Facebook has been working on the draft plan for its external oversight board for the past six months, seeking feedback from more than 650 people at workshops in 88 countries. According to the company, the proposed board would function as an independent court of appeals for content decisions on the platform.
Decisions about what content is acceptable on Facebook’s suite of social networks should not be the responsibility of the company alone, Chief Executive Mark Zuckerberg has said. The company currently claims about 2.4 billion users worldwide across all of its platforms.
Facebook said in a statement that the charter of the external board would be finalized in August.
One broad agreement among the workshop attendees was that no employee of the company should sit on the external board, according to a report Facebook published recently. A large section of attendees also felt that the company should not be able to remove a member of the board without showing valid cause, and that Facebook should first clearly define what would constitute such cause.
Other proposals common to many attendees included that the company should empower the external board to choose its own cases to examine, that all of the board’s decisions should be treated as precedents for similar issues in the future, and that the company should give the board enough power to influence Facebook’s content policies.
Many attendees expressed concern that such a board might lack independence, whether because of pressure from state actors or from the company itself.
Facebook has faced severe criticism for quite some time for doing too little to prevent hate speech, incitement to violence, bullying and a number of other forms of content that violate the “community standards” the company has set for its own platform.
Over the last year, however, the company has tried to enforce those content standards more strictly and has hired more than 30,000 people to monitor content regularly. These measures were also aimed at improving “safety and security” on its platforms.
(Adapted from FirstPost.com)