
It has been a tough few years for Facebook. From Cambridge Analytica and Russian interference in the 2016 election to ‘Definers-gate’, Myanmar, and a host of other crises, it is clear that, as even Mark Zuckerberg has now conceded, ‘regulation is coming.’

Competition authorities, privacy regulators, and electoral commissions are all now grappling with the influence of big tech, but in the meantime, Facebook has begun implementing a series of much-needed policy changes and self-regulatory tweaks. In particular, transparency has emerged as a key means through which Facebook has attempted to regain the trust of the public, politicians, and regulatory authorities.

These efforts are clearly no substitute for effective regulation, but they have had an immediate impact that is worth evaluating. With these changes, the company has entered a new era, one that we, in a new report cross-published by the Universities of Oxford and Stanford, quip resembles a cautious “glasnost”: bringing academics, civil society, journalists, and others under the hood to try and understand various aspects of its operations, and how it formulates and implements its policies. We sought to make the most of that access to assess some of the changes that the company has made in response to public pressure since the 2016 US election.

Moderating alone

In our earliest interactions with the company in January 2018, we pushed policy higher-ups on the idea — advanced by UN Rapporteur David Kaye and others — that Facebook could not keep making highly political speech judgements in the long-term without engaging external expertise and perhaps global civil society. Let’s just say that it wasn’t received as a particularly popular or likely solution.

And yet, in an indication of how much changed in 2018, by November Mark Zuckerberg was announcing the creation of something like that: an “oversight body” that would engage stakeholders outside the company to create a level of external appeal on top of the company’s new general appeal process. (Users who have their content removed can now appeal that decision, and in our report, we discuss how Facebook could ensure that the appeals process is as useful as possible, following other recommendations made by civil society and researchers.)

Much remains to be seen on how this process works out (the company is holding global consultation sessions on this “oversight body” in the coming months), but bringing in other stakeholders could also help formalize how the company draws on subject-matter expertise. While the company does talk to academics and others when making its policies, it is currently done in a behind-the-scenes and opaque manner. Something like a formalized “advisory” group that would help the company work towards best practices would be a helpful step.

Give the people back some power!

Facebook would similarly do well to engage with some of its most important stakeholders — its users — when making changes to its NewsFeed algorithm. Users can do little in the face of tweaks, such as the company’s 2018 announcement that it was going to “refocus” it around friends and family content, pushing down the amount of news that the average person would see. Such design decisions have a large potential impact on issues like polarization and the spread of political information, yet the Facebook interface remains a “black-box” that does little to empower users who wish to engage with a broader, more diverse, and/or more trustworthy news diet.

While perhaps more complex from an engineering perspective, one of our most ambitious suggestions involves providing Facebook users with NewsFeed data and controls. For instance, such a function could show the percentage of their feed that comes from news outlets, and which ones; the percentage of their feed that comes from advertisers; the breakdown of political perspectives in their NewsFeed; and other features. It would also allow users to adjust, via sliders or some other interface, the amount and type of content they are exposed to. Want to bust out of your political bubble and diversify your media diet? Want to get more news, rather than less? Such a feature would help make passive consumers into more active and informed curators.

Continuous Platform Governance Innovation

In her 2012 book Consent of the Networked, Rebecca MacKinnon foresaw many of the political challenges we face today, arguing for “governance innovation” to help deal with issues around tech companies and freedom of expression, accountability, and transparency. We are starting to see long-overdue experiments in that direction, both internally within Facebook (e.g. the new moderation “Oversight Board”) and externally (see, for instance, the “Social Media Council” proposed by the London-based NGO Article 19).

As companies continue to take small steps in the right direction, we must continue to advocate for better practices across the industry as a whole. With much public and regulatory attention in the past two years focused on Facebook, other major platforms, such as Instagram, YouTube, and Twitter, would do well to formalize appeals, release detailed community guidelines, publish content policy enforcement reports, and initiate meaningful partnerships with civil society and academia.

If we want functional, comprehensive platform governance, we will need to do better, and advocate for the right balance of smart government regulation and industry-wide, platform-specific, and product-specific self-regulation. Our future depends on it.

This post is based on an article previously published in WIRED UK, and is based upon ‘GLASNOST! Nine ways Facebook can make itself a better forum for free speech and democracy,’ a report published by the Reuters Institute for the Study of Journalism, in partnership with the Free Speech Debate project at St. Antony’s College, the Project on Democracy and the Internet at Stanford University, and the Hoover Institution, Stanford University.
