Academics at the universities of Oxford and Stanford think Facebook should give users greater transparency and control over the content they see on its platform.
They also believe the social networking giant should radically reform its governance structures and processes to throw more light on content decisions, including by looping in more external experts to steer policy.
Such changes are needed to address widespread concerns about Facebook’s impact on democracy and on free speech, they argue in a report published today, which includes a series of recommendations for reforming Facebook (entitled: Glasnost! Nine Ways Facebook Can Make Itself a Better Forum for Free Speech and Democracy).
“There is a great deal that a platform like Facebook can do right now to address widespread public concerns, and to do more to honour its public interest responsibilities as well as international human rights norms,” writes lead author Timothy Garton Ash.
“Executive decisions made by Facebook have major political, social, and cultural consequences around the world. A single small change to the News Feed algorithm, or to content policy, can have an impact that is both faster and wider than that of any single piece of national (or even EU-wide) legislation.”
Here’s a rundown of the report’s nine recommendations:
Tighten Community Standards wording on hate speech: the academics argue that Facebook’s current wording on key areas is “overbroad, leading to erratic, inconsistent and often context-insensitive takedowns” and also produces “a high proportion of contested cases.” Clearer, tighter wording would make consistent implementation easier, they believe.
Hire more, and more contextually expert, content reviewers: “the issue is quality as well as quantity,” the report points out, pressing Facebook to hire more human content reviewers plus a layer of senior reviewers with “relevant cultural and political expertise,” and also to engage more with trusted external sources such as NGOs. “It remains clear that AI will not resolve the issues with the deeply context-dependent judgements that need to be made in determining when, for example, hate speech becomes dangerous speech,” they write.
Increase “decisional transparency”: Facebook still does not offer adequate transparency around its content moderation policies and practices, they suggest, arguing it needs to publish more detail on its procedures, and specifically calling for the company to “post and widely publicize case studies” to give users more guidance and to provide potential grounds for appeals.
Expand and improve the appeals process: also on appeals, the report recommends Facebook give reviewers far more context around disputed pieces of content, and also provide appeals statistics to analysts and users. “Under the current regime, the initial internal reviewer has very limited information about the individual who posted a piece of content, despite the importance of context for adjudicating appeals,” they write. “A Holocaust image has a very different significance when posted by a Holocaust survivor or by a Neo-Nazi.” They also suggest Facebook should work on developing “a more functional and usable for the average user” appeals due process, in dialogue with users, such as with the help of a content policy advisory group.
Provide meaningful News Feed controls for users: the report suggests Facebook users should have more meaningful controls over what they see in the News Feed, with the authors dubbing current controls “altogether inadequate” and advocating for far more, such as the ability to switch off the algorithmic feed entirely (without the chronological view defaulting back to the algorithmic one when the user reloads, as is the case now for anyone who switches away from the AI-controlled view). The report also suggests adding a News Feed analytics feature to give users a breakdown of the sources they are seeing and how that compares with control groups of other users. Facebook could also offer a button letting users adopt a different perspective by exposing them to content they don’t usually see, they suggest.
Expand context and fact-checking facilities: the report pushes for “significant” resources to be ploughed into identifying “the best, most authoritative, and trusted sources” of contextual information for each country, region and culture, to help feed Facebook’s existing (but still inadequate and not universally distributed) fact-checking efforts.
Establish regular auditing mechanisms: there have been some civil rights audits of Facebook’s processes (such as this one, which urged Facebook to formalize a human rights strategy), but the report presses the company to open itself up to more of these, suggesting the model of meaningful audits should be replicated and extended to other areas of public concern, including privacy, algorithmic fairness and bias, diversity and more.
Create an external content policy advisory group: key content stakeholders from civil society, academia and journalism should be enlisted by Facebook for an expert policy advisory group that provides ongoing feedback on its content standards and their implementation, as well as reviewing its appeals record. “Creating a body that has credibility with the extraordinarily wide geographical, cultural, and political range of Facebook users would be a major challenge, but a carefully chosen, formalized, expert advisory group would be a first step,” they write, noting that Facebook has begun moving in this direction but adding: “These efforts should be formalized and expanded in a transparent manner.”
Establish an external appeals body: the report also urges “independent, external” ultimate control of Facebook’s content policy, via an appeals body that sits outside the mothership and includes representation from civil society and digital rights advocacy groups. The authors note Facebook is already flirting with this idea, citing comments made by Mark Zuckerberg last November, but also warn this needs to be done properly if power is to be “meaningfully” devolved. “Facebook should strive to make this appeals body as transparent as possible… and allow it to influence broad areas of content policy… not just rule on specific content takedowns,” they warn.
In conclusion, the report notes that the content issues it focuses on are not attached to Facebook’s business alone but apply broadly across various internet platforms, hence rising interest in some form of “industry-wide self-regulatory body,” though it suggests that achieving that kind of overarching regulation will be “a long and complex task.”
In the meantime, the academics remain convinced there is “a great deal that a platform like Facebook can do right now to address widespread public concerns, and to do more to honour its public interest responsibilities, as well as international human rights norms,” with the company front and center of the frame given its vast size (2.2 billion+ active users).
“We recognize that Facebook employees are making difficult, complex, contextual judgements every day, balancing competing interests, and not all those decisions will benefit from full transparency. But all would be better for more regular, active interchange with the worlds of academic research, investigative journalism, and civil society advocacy,” they add.
We’ve reached out to Facebook for comment on the report’s recommendations.
The report was prepared by the Free Speech Debate project of the Dahrendorf Programme for the Study of Freedom, St. Antony’s College, Oxford, in partnership with the Reuters Institute for the Study of Journalism, University of Oxford; the Project on Democracy and the Internet, Stanford University; and the Hoover Institution, Stanford University.
Last year we offered a few ideas of our own for fixing Facebook, including suggesting the company hire orders of magnitude more expert content reviewers, as well as provide greater transparency into key decisions and processes.