On Wednesday, May 6, Facebook introduced what could be a defining initiative for two of the world’s largest social media platforms. After repeated criticism over “fake news” and political misinformation, Facebook is shifting responsibility for addressing controversial content to a group of external experts.
The first 20 members of the board that CNN dubbed “Facebook’s supreme court” will initially be tasked with overseeing appeals over content that was flagged or taken down on Facebook and Instagram. The move faces criticism both as a potential threat to free speech online and as a superficial attempt by Facebook to deflect blame.
“I wish I could say that the Facebook review board was cosmetic, but I’m not even sure that it’s that deep,” said Siva Vaidhyanathan, a professor of media studies at the University of Virginia and author of a book on Facebook.
Threat to free speech
The platform’s oversight board will start its activities with a limited mandate, but free speech advocates fear the board’s powers could grow with time.
While many consider a focus on xenophobia, misinformation, and “fake news” to be warranted, Facebook’s proposed board could have far-reaching consequences for small content creators, journalists, and news outlets.
“People having the power to express themselves at scale is a new kind of force in the world — a Fifth Estate alongside the other power structures of society,” Zuckerberg said in an October 2019 speech emphasizing the importance of freedom of expression.
“People no longer have to rely on traditional gatekeepers in politics or media to make their voices heard, and that has important consequences,” he added.
Many fear that the task set before the board is momentous: Facebook regularly encounters millions of problematic cases, and efforts to police “accurate information” could lead to an increased reliance on large corporate news outlets.
Facebook has previously moved to de-prioritize news from smaller outlets, a shift that led to a rise in content takedowns.
“Our goal is to reduce misinformation and reach people who’ve seen it with facts. Whether or not showing people misinformation they’ve previously seen is an effective method of achieving this is subject to debate within the research community,” a Facebook spokeswoman told STAT News. Her statement underscored the difficulty of addressing misinformation even with an established intent to regulate content for the public good.
Deflecting responsibility
While the oversight board’s future powers trouble some, others believe the board does not go far enough to stop problematic misinformation on Facebook and Instagram. By handing responsibility for content to the oversight board, many believe Facebook is in effect distancing itself from the contentious job of separating “truth” from “lies.”
Facebook’s role in the 2016 US presidential election highlighted the difficulty of discerning fact from fiction. Russian meddling through Facebook advertisements likewise revealed the ease with which foreign actors could exploit the platform’s commitment to free speech to spread misinformation.
Vaidhyanathan emphasized his conviction that the board is nothing more than a facade to deflect responsibility, calling the move to create the oversight body “greenwashing.”
“If Facebook really wanted to take outside criticism seriously at any point in the past decade, it could have taken human rights activists seriously about problems in Myanmar; it could have taken journalists seriously about problems in the Philippines,” he argued.
Threat of personal bias
Facebook selected 20 established figures to form its initial board, but some fear its members’ personal ideologies could dangerously shape public narratives.
While members of the board are likely well-intentioned, the personal biases of such a small group could skew permissible discourse on platforms with billions of users worldwide.
One example of this dilemma is board member Tawakkol Karman, a world-renowned feminist activist and Nobel Peace Prize laureate who has garnered global admiration but also works for a Turkish news outlet controlled by Turkish President Recep Tayyip Erdogan.
While her voice on feminist issues and media freedom in the Middle East is well respected, many fear that her views on the Muslim Brotherhood, which the US has considered designating as a terrorist organization, and her financial links to the Turkish government could influence the board’s decisions on these topics.
Facebook touts that the board’s members have lived in 27 countries and represent diverse opinions. Critics argue that appointing Karman as its “token” Arab member suggests the board wrongly assumes that some of her extremist-linked opinions represent those of Arab and Muslim populations as a whole.
Facebook would do better to “focus on keeping extremist and terrorist content off their platforms,” said David Ibsen, executive director of the Counter Extremism Project (CEP).
Vanita Gupta, an outspoken Facebook critic and CEO of the Leadership Conference on Civil and Human Rights, expressed satisfaction with Facebook’s efforts. At the same time, she stressed that she does not believe “that this kind of board is going to be a substitute for some kind of public law regime or regulatory regime to deal with some of the big-impact issues.”
Setting a precedent for objective moderation
The appointment of Alan Rusbridger, former editor of the left-leaning UK newspaper the Guardian, similarly prompted speculation that the board could be hostile to right-wing news. The Telegraph, a prominent right-leaning British newspaper, complained on Wednesday about the board’s potential left-wing bias.
John Samples of the Koch-backed Cato Institute think tank also sits on the new board. While he personally believes that “conservatives who believe social media platforms are biased against them lack sufficient proof,” he expressed hope that the public will allow the board room for trial and error as it moves toward establishing true political legitimacy.
Samples stressed the importance of working through “inevitable mistakes” to set a valuable precedent for other social media platforms looking to justly moderate content.
“It is our ambition and goal that Facebook not decide elections, that it not be a force for one point of view over another, that the same rules will apply to people of left, right and center,” said Michael McConnell, a Stanford law professor and one of the board’s four co-chairs, who helped select its diverse members.
“It’s hard to develop a sense of legitimacy if decisions are not transparent and if people don’t trust that decisions are not being motivated by financial interest or political interest or reputational interest,” said Jamal Greene, another of the board’s co-chairs and professor of constitutional law at Columbia University.
“The board’s novelty is that it can mitigate some of those concerns by being set up to be independent,” Greene added. Board members will have no direct contact with Facebook executives and will even have the power to overrule Zuckerberg himself.