Platform Governance
Robert Gorwa∗
Department of Politics and International Relations,
University of Oxford
Abstract
Following a host of high-profile scandals, the political influence
of platform companies (the global corporations that operate on-
line ‘platforms’ such as Facebook, WhatsApp, YouTube, and many
other online services) is slowly being re-evaluated. Amidst growing
calls to regulate these companies and make them more democratically
accountable, and a host of policy interventions that are actively be-
ing pursued in Europe and beyond, a better understanding of how
platform practices, policies, and affordances (in effect, how platforms
govern) interact with the external political forces trying to shape those
practices and policies is needed. Building on digital media and com-
munication scholarship as well as governance literature from political
science and international relations, the aim of this article is to map an
interdisciplinary research agenda for platform governance, a concept
intended to capture the layers of governance relationships structuring
interactions between key parties in today’s platform society, includ-
ing platform companies, users, advertisers, governments, and other
political actors.
1 Introduction
Platform companies, the global corporations that operate Facebook, What-
sApp, YouTube, and many other online services, have become enmeshed in
∗R. Gorwa, 2019. What is Platform Governance? Information, Communication & Society, doi:10.1080/1369118X.2019.1573914. This post-print last updated Feb 11, 2019. Please consult the final version of record for page numbers and references.
virtually all aspects of contemporary life, from politics (Gillespie 2018b) and
labour relations (Srnicek 2016; Van Doorn 2017) to cultural production and
consumption (Nieborg and Poell 2018). Although the products and services
provided by a handful of large, predominantly American technology firms
were not long ago widely portrayed as inherently beneficial and democra-
tizing “liberation technologies” (Tucker et al. 2017), the influence of these
corporations in public and political life is slowly being re-evaluated following
multiple high-profile public scandals (Vaidhyanathan 2018). Today, scholars,
policymakers, and the public are increasingly attempting to understand the
complex political effects of Instagram, Google Search, Airbnb, Uber, Ama-
zon, and other platforms, amidst calls to make the companies that operate
them more democratically accountable (Suzor 2019).
This endeavor is the latest chapter in the effort to better understand the
multitude of complex socio-technical developments playing a role in mod-
ern life. Research documenting the growing impact of automated decision-
making in areas such as policing, finance, and health-care (Ananny and Craw-
ford 2018) has been coupled with a constant flow of “algorithmic war stories”
illustrating how bias and discrimination can be exhibited by these systems
(Edwards and Veale 2017, 14). A growing and increasingly well-organized
interdisciplinary community of social and computer scientists is responding
by proposing actionable frameworks for accountable algorithms and fair ma-
chine learning (Barocas, Hardt, and Narayanan 2018; Wallach 2014), and ex-
ploring how various desirable principles, such as “explainability” or broader
notions of legal “recourse,” could be enacted for algorithmic systems (Ustun,
Spangher, and Liu 2019; Wachter, Mittelstadt, and Russell 2017).
A similarly focused line of research into the governance of platform com-
panies has yet to be pursued, despite the even wider prevalence of what might
be termed ‘platform war stories.’ From the Cambridge Analytica scandal and
Russian interference in the 2016 US election to Facebook’s troubling role in
the recent Myanmar conflict, policymakers around the world are grappling
with what Pasquale (2016, p. 314) has called “the darker narrative of plat-
form capitalism.” Popular discourse has become increasingly suffused with
various regulatory proposals for curbing the power of the so-called ‘digital
giants,’ which range from calls to break up Facebook from its acquisitions In-
stagram and WhatsApp (Reich 2018) to arguments that platforms should be
legally responsible for content posted by users via their services (Syal 2017).
How should platforms be governed? There is no simple answer, because,
as Helberger and colleagues (2018) outline, the policy arena is fragmented,
with responsibility for the social and political role of platforms divided be-
tween the platform companies (as architects of online environments), users
(as individuals making decisions about their specific behaviour in an online
environment), and governments (as the entities setting the overall ground
rules for those interactions). Although key questions about the appropriate
balance of responsibilities between these actors remain unanswered, policy-
makers — at least in Europe — are keen to act, and Germany’s 2017 “Net-
work Enforcement” law (NetzDG), which makes companies liable for illegal
speech propagated via their services, has provided important precedent (and
to some, a warning sign) for possible government responses (Schulz 2018).
At stake are major issues of freedom of expression, political participation,
and democratic governance, with tremendous implications for the future of
digital communication. Yet we lack a shared vocabulary or framework for
approaching the oncoming challenges.
The goal of this article is to bring governance scholarship in conversation
with the emerging platform studies literature, and in doing so, contribute a
more precise understanding of what is now being informally called “platform
governance.” It proposes a framework explicitly mindful of power dynam-
ics and the effects of external political forces on the platform ecosystem,
and seeks to provide a first mapping of the ongoing, interdisciplinary re-
search landscape in a politically vital area that has yet to receive a concrete
definition or theoretical overview. In the first section, I outline platform
governance as a concept that refers to the layers of governance relationships
structuring interactions between key parties in today’s platform society, cap-
turing the growing body of work addressing the political effects of digital
platforms (governance by platforms), as well as the complex challenges that
the governance of platform companies presents. In the second section, three
governance ‘lenses’ currently being used to inform thinking about platform
regulation and policy are outlined, with a discussion of the questions and
challenges raised by each. The final section addresses the normative ques-
tion of how platforms should be governed going forward, and briefly discusses
the emergent guiding principles shaping the future platform policy landscape.
2 Theorizing Platform Governance
2.1 What is a Platform?
‘Platform’ is an ambiguous term, used differently by various scholarly com-
munities (Schwarz, 2017): there are subtle variations in the ways that plat-
forms are understood by computer scientists, economists, digital media schol-
ars, and lawyers. A full history could go back to the emergence of computer
networking, ARPANET, and early bulletin boards (Hauben and Hauben,
1997; Naughton, 2000), or even further back to the early emergence of com-
puting (Hicks, 2017), but an abbreviated history of platforms might begin in
California in the 1990s, as software developers began to conceptualize their
offerings as more than just narrow programs, but rather as flexible platforms
that enable code to be developed and deployed (Bogost and Montfort, 2007).
The term was then strategically deployed by certain companies, allowing
them to brand themselves as platforms that facilitate access to user-generated
content, but do not create it, and therefore should not be held liable for it
(Gillespie, 2010). In recent years, the term has been adopted as shorthand
both for the services provided by many technology companies, as well as the
companies themselves (Srnicek, 2016). In this article, I use platform com-
pany more generally to refer to the corporations that deploy a service (e.g.
Facebook and Alphabet) and platform to refer to those online, data-driven
apps and services (e.g. Facebook Messenger, Google Search, YouTube); see
Helmond (2015) for a more thorough discussion of history and definitions.
a much broader understanding of governance. This more flexible conception
engaged with the central question of “how global life is organized, struc-
tured, and regulated” (Barnett and Duvall 2004, 7), and sought to move
beyond singular state-centrism to better understand the power relationships
and conflicts that emergent 20th century (often corporate, private, or non-
state) governance structures could create or enforce. As Stoker (1998, p. 17)
put it, governance entails “creating the conditions for ordered rule and collective
action.” Governance is therefore not merely a capacity, but a specific and
complex network of interactions spanning different actors and behaviours.
Some academics studying online life have implicitly adopted similar un-
derstandings of governance. As Grimmelmann (2015, p. 47) has suggested,
robust systems of community moderation and management are effectively
“governance mechanisms” designed to “facilitate cooperation and prevent
abuse.” Digital media scholars argue that content policies, terms of service,
algorithms, interfaces, and other socio-technical regimes form the governance
mechanisms of today’s online infrastructures (Plantin et al. 2018). Platform
studies scholars show that platform services can significantly affect and me-
diate individual behavior (Bucher and Helmond 2018); therefore, platforms
engage in governance at the individual, user level (Gillespie 2015). But these
governance mechanisms are themselves shaped by the policy and regulatory
constraints on the corporate entities which deploy these platforms. For a
company like Facebook, this includes a host of American regulatory frame-
works, international regulatory frameworks for overseas operations, voluntary
compliance mechanisms like the Global Network Initiative (GNI) principles,
industry-wide voluntary partnerships for terrorist content (initiated by the
European Commission), and countless others. Platform governance is an ap-
proach necessitating an understanding of technical systems (platforms) and
an appreciation for the inherently global arena within which these platform
companies function. It acknowledges that, as digital media scholars have
noted (Gillespie 2018a), platforms are fundamentally political actors that
make important political decisions while engineering what has become the
global infrastructure of free expression; but it also acknowledges the other
half of the equation: that these private ‘governors’ (Klonick 2017) are them-
selves subject to governance on all fronts, and that their conduct of gover-
nance is directly informed by local, national, and supranational mechanisms
of governance.
The key actors in platform governance therefore include not only users
and platform companies, what Poell, Van Dijck, and Nieborg (2018) have
called “complementors” (the host of data-brokers, advertisers, developers,
and other parties that participate in a platform’s ecosystem), but also, cru-
cially, political actors including various branches of government, as well as
other stakeholders and advocacy groups (non-governmental privacy and dig-
ital rights groups, academics and researchers, and investigative journalists,
who all play a growing accountability function by scrutinizing the practices
of platform companies). This is not to say that these political forces (such as
state preferences) reign supreme; rather, I suggest that, following the insights
of global governance scholarship, “a wide variety of forms of governance exist
next to each other and that a hierarchy among these various mechanisms is
hard, if not impossible, to discern” (Dingwerth and Pattberg 2006, 192).
Acknowledging media and communications scholars that have done such
vital work in advancing the sociological, anthropological, and political eco-
nomic dimensions of platforms, I posit that to truly understand the power
relationships and governance structures underpinning contemporary forms
of “platform capitalism” (Srnicek 2016), one must also engage with the host
of political forces and political (f)actors affecting the platform ecosystem
in a variety of ways. As platform politics are becoming increasingly diffi-
cult to separate from global politics, an exploration of this kind of platform
governance must build on valuable scholarship conducted in various areas,
including research from digital media and internet studies, platform stud-
ies, political communication, technology policy and law, as well as political
science and international relations. Here, I survey insights from this scholar-
ship, grouping them under three arguments: platforms govern, platforms are
governed, and platform companies are companies.
which platform companies navigated the complex, and oftentimes contrasting
interests of various stakeholders (including users, advertisers, and regulators),
and showing how the term ‘platform’ had itself become a discursive, political
imaginary, Gillespie set the stage for work investigating how platforms ‘in-
tervene’ in everyday life, shaping the online experience and algorithmically
determining what information to make (in)visible (Bucher 2018; Gillespie
2010, 2015). Similarly, Van Dijck (2013, p. 104) specifically discussed how
different governance mechanisms affected “online sociality” as experienced
by users of Twitter, Flickr, and other services. As researchers explored how
best to theorize platforms and situate them critically within past scholar-
ship on social networks, forms of cultural and epistemic power became the
focus of researchers such as Langlois (2013, p. 93), who explores platforms
as “participatory media assemblages, whereby Facebook and Google become
conduits of governance.”
But even before the rise of platforms, work on “search politics” and “epis-
temic power” grappled with what are now the foundational questions of plat-
form governance: how search engines (and the companies that deploy them)
may shape knowledge and meaning, and thereby politics, society, and culture
(Hargittai 2007; Introna and Nissenbaum 2000). A group of scholars that one
could call an “Amsterdam School” of critical platform studies provides key
insights into platformization, defined as the “penetration of economic, gov-
ernmental, and infrastructural extensions of digital platforms” into cultural
practices (Nieborg and Poell 2018, 2). This approach focuses on ecosystems,
and the potentially powerful gatekeeping roles that platforms play by me-
diating relationships between various parties in the “platform society” (Van
Dijck et al., 2018, p. 5). Related work by these scholars has mapped the as-
sortment of communicative affordances displayed by different platforms, and
how they enable and constrain forms of user behaviour (Bucher and Hel-
mond 2018; Weltevrede and Borra 2016). As Lessig (2006, p. 1) famously
observed, “code is law,” and the design decisions made by the creator of an
online service effectively amount to a form of regulation.
The political implications of the algorithmic systems deployed by plat-
form companies are also of interest to digital media researchers, who, through
‘critical algorithm studies’ scholarship (Gillespie and Seaver 2015), have ex-
amined the increasing role that automated decision-making plays in contem-
porary life (Beer 2017; Burrell 2016). This literature intersects significantly
with the current public discourses around platform companies: as Caplan
and boyd (2018, p. 2) note, “conversations around algorithmic account-
ability often center on Facebook and Google.” Although this initially may
seem surprising (given the multiple other areas of public life now assailed
by artificial intelligence or ‘AI’), it must be recognized that platform com-
panies deploy what are likely the largest, most global, and most widely used
algorithmic systems in existence. These systems, due to their scale and (gen-
erally) public-facing nature, provide some outcomes that may be relatively
visible: while one may be uncertain about when one has been discriminated
against by an automated hiring classifier used by a potential employer, prob-
lematic YouTube autocomplete results or racist image tagging systems can
provide public examples of bias and discrimination. Therefore, work striv-
ing to understand how to “govern algorithms” (Barocas, Hood, and Ziewitz
2013; Ziewitz 2016) fundamentally implicates platforms. After all: how can
we strive for accountable algorithms, if the corporate entities that build and
deploy them are not fair, accountable, transparent, or ethical, and if they
seem to be entrenching, rather than combatting, existing social prejudices?
While this scholarship has illustrated how platforms govern users through
their design, architectures, assemblages of algorithms, and other technical
structures, the mounting explorations of how platforms interact with social
structures more broadly are also vital. Intersectional arguments that online
services can encode gender dynamics, class structures, and racism (Noble,
2018) provide an important depiction of how platform companies can en-
gage in governance at a broader level. How do the decisions made largely by
a homogeneous group of white elites in Silicon Valley affect different users
around the world? Nakamura, Bivens, and others have demonstrated how
the choice of architectures and other design decisions (such as sign-in pages
that present new users with a binary gender option) can entrench normative
judgments around gender, race, class, and sexuality (Bivens 2017; Nakamura
2013). By adopting a sociotechnical perspective, this scholarship highlights
the importance of internal dynamics within platform companies, and of em-
ployees and organizational structures within platform governance (e.g. the
lack of diversity within engineering teams has a major impact on the design
decisions those teams make).
Work that scrutinizes the specific practices of platform companies pro-
vides another important contribution. Political communication researchers
have assessed the growing role of technology firms as “active agents in po-
litical processes” through their direct collaboration with campaigns (Kreiss
and Mcgregor 2018, 155), and their troubled interactions with publishers
and news organizations (Bell et al., 2017; Nielsen and Ganter, 2017). A
growing body of work seeks to explore the practices and implications of com-
mercial content moderation and content policy (Roberts 2018) as the pro-
cesses through which platform companies set and enforce the rules governing
user speech (Gillespie 2018a; Suzor 2019). Researchers have thoughtfully
documented both how this culminates in governance at the user level and
how users themselves perceive and interact with these structures (Duguay,
Burgess, and Suzor 2018; Myers West 2018). This work builds on research in
human-computer interaction and community management, where “platform
governance” refers to the systems of rules, norms, and civic labour govern-
ing an online community (Matias and Mou 2018). Digital media, internet
studies, and communication scholarship forms the base of a platform gover-
nance approach: it provides an appreciation for the functions, affordances,
and politics of contemporary platforms, and illustrates the contours of how
platform companies currently govern user behavior. However, it could be
better situated within the broader context of platform companies as
corporate actors, and the contested global governance arena within which
they operate.
significant economic and political role in the 19th century United States. But
there are further parallels to the post-WW2 period, where the emergence of
the international order and the slow advent of globalization allowed a group
of firms to grow into multinational corporations of unprecedented wealth and
size (Vernon 1977). As political economists such as Strange (1996) observed,
the corporations that drove economic growth and boosted national gross
domestic products also created many governance challenges, testing jurisdic-
tions and traditional forms of regulation. Keeping corporations accountable
became a pressing global governance problem as mounting evidence of the
repeated evasion of labor and environmental standards by large corporations
— especially extractive, natural-resource-based multinationals — emerged in
the 1970s and 80s (Keck and Sikkink, 2014; Ruggie, 2008).
The scholarship that arose to catalogue the relationships between global
corporations and other actors in domestic and international politics (D. A.
Fuchs 2007; Mikler 2018) is another important area which could therefore
contribute to future platform governance research. Despite their constant
invocation of the rhetoric of disruption and innovation, platform companies
function in many ways as traditional corporate actors. They are tremendous
lobbyists: in 2017, Google spent more on lobbying in Washington than any
other company (Shaban 2018), and Facebook has now on multiple occasions
hired lobbying firms to help discredit their competitors (Nicas and Rosenberg
2018). They minimize their tax burden with classic profit shifting techniques,
and they deploy contractors to keep their workforces small and relatively
inexpensive (Srnicek 2016). As corporations, platforms can therefore be governed along the lines of traditional multinational/global enterprises.
Decades ago, global advocacy groups organized against firms like Nestle,
‘naming and shaming’ them into more socially responsible business prac-
tices (Sikkink 1986). International organizations built soft forms of gover-
nance through codes of conduct and Corporate Social Responsibility net-
works (Ruggie, 2013). Activist shareholders sought to reform companies
from within, and employees within the firms made their voices heard to
push for change. There are clear parallels to today, where international
digital rights organizations (e.g. Privacy International), investigative jour-
nalists (e.g. ProPublica), and academics have become key actors creat-
ing public pressure for responsible platform governance. Furthermore, non-
governmental “social responsibility” mechanisms, first created for freedom of
expression issues (such as the GNI) are now potentially being revitalized for
the broader host of concerns that platform companies need to deal with to-
day (Kaye 2018). These processes will be contested, and provide no panacea,
as keeping global corporations accountable is no easy task (Ruggie, 2008).
However, future platform governance scholarship can learn from the past
as it seeks to devise new forms of corporate accountability suitable for the
business models of today’s data-driven platform corporations, and as it eval-
uates the emerging international structures, processes, or organizations that
can help fill loopholes in how platform companies are governed by national
legislation.
tures discussed in the previous section, platform companies are also poten-
tially governed internally. While technology corporations often have shareholder
structures that consolidate power within their executives (preventing
them from being fully accountable to their investors), it is important to re-
member that they are complex collections of individuals and interests as
opposed to unitary actors. Employees (either full-time employees, or the
‘gig’ workers that many platforms rely on) can unionize, organize for re-
sponsible product design, stage protests and walkouts, and try to steer a
company’s actions from the inside, either collectively or individually (Wood,
Lehdonvirta, and Graham 2018).
3.1 Self-Governance
The current dominant governance mode is often referred to as ‘self-governance’
or ‘self-regulation.’ This approach, enshrined through legislation like the US
Communications Decency Act and the EU E-Commerce Directive, limits
platform liability and results in a relatively laissez-faire relationship between
governing institutions and platform companies. Today, companies own and
operate what is often highly visible, highly trafficked “public” space, and
respond to third-party complaints about content (for reasons ranging from
intellectual property to national security). The companies are generally not
liable for what users do on a platform, as long as they take adequate steps to
act on third-party ‘notice.’ In this governing mode, transparency is gener-
ally voluntary, and most platform decisions are made with minimal external
oversight (Suzor 2019). Since 2016, platform companies have implemented
multiple changes in response to public concern. These initiatives, which range
from new advertising tools to changes to how they interact with political
campaigns, seem designed to head off possible avenues of regulation while
also effectively maintaining the highly-profitable status quo. In the past two
years, these self-regulatory improvements have prominently consisted of tech-
nical changes or tools, transparency efforts, or some combination of the two
(Gorwa and Garton Ash, forthcoming).
Platform companies insist that they can be accountable to their users
by slowly increasing transparency in numerous areas, such as content pol-
icy and advertising. After a group of U.S. Senators proposed the Honest
Ads Act, Facebook took many of the provisions of the act and pre-emptively
implemented them (Timmons, 2018). Facebook, Google, and Twitter now
require political advertisers to register or provide identification in certain
jurisdictions, and both Facebook and Google have built public-facing tools
where researchers or interested members of the public can see ads that are
being deployed, along with some information about who is paying for them,
and how much (Garton Ash, Gorwa, and Metaxa, 2019). Other major trans-
parency initiatives have been launched as part of this broader effort to regain
public trust. In April 2018, Facebook made the important step of releasing
public-facing internal guidelines for their “Community Standards,” the rules
that govern what the more than 2.2 billion monthly active users of Facebook
can post on the site (Bickert, 2018). They also launched an ongoing project
partnering with academics which hopes to create a reputable mechanism for
third party data access and independent research (King and Persily 2018).
As these changes illustrate, the self-governance model has numerous ad-
vantages. Through investigative journalism, academic engagement, and pub-
lic advocacy, companies can be nudged in the right direction without com-
plex regulatory interventions (an especially challenging prospect when, due
to corporate secrecy and the inherently ‘black box’ nature of contemporary
platform companies, the true scope of many of the problems in today’s plat-
form ecosystem is not fully known). Platform companies can quickly make
specific interventions (such as requiring advertisers to register their identities
or combating hate speech through specific tweaks of their content policies)
far before legislation goes into effect. Additionally, by keeping key decisions
about free expression largely in the hands of online intermediaries, impor-
tant concerns about government censorship and suppression (including in less
democratic countries that may be keen to exert control over online environ-
ments) are assuaged (Kaye 2018; Keller 2018). But there are also important
limitations: voluntary arrangements rely on goodwill and provide limited re-
course in the case of non-compliance. Many recent transparency initiatives
are predominantly public-facing, providing useful tools for journalists and
interested members of the public, but arguably much less useful information
for regulators and investigators. Furthermore, these minor changes do little
to provoke systemic change or modify platform business models, which may
be fundamentally problematic and based on extractive surveillance and data
collection (C. Fuchs 2012; Zuboff 2015).
move beyond current voluntary transparency measures by legally mandating
comprehensive transparency reporting on operations in Germany (Keller,
forthcoming).
Privacy legislation is another lever: under the European Union’s 2016
General Data Protection Regulation (GDPR; in effect as of May 2018), plat-
forms are given clear requirements as to how they process personal data, with
the threat of enormous penalties for non-compliance (Golla 2017). GDPR
has other major stipulations about data portability, data protection by de-
sign, and informed consent that, if enforced, could have a resounding impact
on the platform ecosystem (Hildebrandt 2018). Although it remains un-
likely in the short-term, some have argued that the United States should
adopt similarly comprehensive privacy legislation (Cook, 2018). Perhaps
the most drastic measure, however, would involve the use of anti-trust law
in the United States. Although platform companies have already faced fines
from European competition authorities, some legal scholars have argued that
these companies should be broken up or prevented from making future ac-
quisitions (Pasquale 2018; Wu 2018). From this perspective, the argument is
that, despite their provision of ostensibly ‘free’ services (and therefore, the
lack of clear price discrimination), users are harmed by multiple forms of
anti-competitive behavior exhibited by platform companies (Khan, 2016).
3.3 Co-Governance
Steps towards ‘co-governance’ seek a third way between the two previous
approaches. In the short term, such models seek to provide some measure of
democratic accountability without making extreme changes to the status quo.
Civil society organizations, for example, have advocated for some kind of or-
ganization that could perform multiple functions ranging from investigating
user complaints to creating ethical frameworks for platform companies, per-
haps modelled after international press councils which set codes of conduct
and standards for news organizations (ARTICLE 19 2018). Kaye (2018, p.
18) describes the possibility of various “ombudsman programmes or third-
party adjudication” systems to which users could address complaints and
seek redress, and outlines several historical organizations which could pro-
vide precedent. Much as the GNI brought an international group of civil
society organizations, academics, and other stakeholders together with plat-
form companies to establish best practices for the promotion of free expres-
sion (and a system of transparency reporting, third party audits, and other
mechanisms to help oversee those practices), a similar organization could be
formed to tackle recent concerns around disinformation, hate speech, privacy,
and more. A major development since the GNI’s founding is that it is no
longer just platforms’ external conduct vis-à-vis governments that must be
scrutinized via audits and other accountability mechanisms. Today, the internal behaviour
of the platforms themselves is of key public interest and demands increased
oversight and stakeholder engagement.
Other possible visions of co-governance involve more granular forms of
user participation in policy decisions: for instance, Gillespie (2018) describes
Facebook’s brief experiment with user voting on policy changes as a possible
model which need not have been cast aside so quickly. Will platforms seek to
bring external community support into their governance practices in or-
der to gain legitimacy and trust? In November 2018, Zuckerberg announced
that the French government would be permitted to embed regulators into
content policy processes in the country, and that Facebook would create a
“supreme court” that would allow external appeal for content policy decisions
(Zuckerberg, 2018). While the outcome of these changes is yet to be seen,
they signal an understanding on Zuckerberg’s behalf that self-governance is
no longer seen as a satisfactory long-term solution by various governance
stakeholders. Co-governance could also provide even more radical options in
the long term, as it lends itself to a philosophy that leads away from major,
corporatized platforms and towards various platform cooperatives, decentral-
ized systems, and other forms of community self-management. While these
remain admittedly unlikely in the near-term (due to scale issues, network
effects, and other challenges), they could provide a more equitable and just
digital economy for users in the long term (Scholz 2016; Van Doorn 2017).
ulation’ as opposed to ‘rules-based regulation.’” A platform governance ap-
proach should therefore seek not only to understand the complex governance
relationships and public policy challenges in today’s platform society (Nash
et al. 2017), but also ask how these relationships can become more beneficial
for the multitude rather than the few.
Recent research has begun advancing some possible guiding principles,
each with its own merits and challenges. These include rights-based
legal approaches, such as international human rights law (Kaye 2018; Su-
zor, Van Geelen, and Myers West 2018) or American civil rights law (Cit-
ron 2009), which could provide an avenue for platforms to better ground
policy decisions and build legitimacy. Other possible frameworks include
aspirational computer science principles for algorithmic systems, such as
fairness, accountability, transparency, ethics, and responsibility
(Diakopoulos et al. 2016), and the developing area of “data justice,” which
seeks to transcend the focus on individual harms wrought by various data-
driven systems and instead apply holistic principles of social justice at a
broader level (Dencik, Hintz, and Cable 2016; Taylor 2017). Others could
include political science and governance studies principles of meaningful
democratic accountability and transparency (Fung 2013; Hood and Heald
2006), or Corporate Social Responsibility and other past efforts to steer
business in the direction of human rights or other normative aims (Urban
2014). How can
these various values be enshrined through sensible legislation and account-
ability mechanisms? Principles, values, and imaginaries will be crucial for
grounding the future platform governance research agenda, and scholars will
need to be assertive with their ideas and their output. Change will not come
easily: platform companies have become influential political actors with an
obvious interest in preserving their dominant market positions. Creative
ideas will be needed to help ‘disrupt’ the ‘disruptors’ and introduce fairer,
more accountable, and more just forms of platform governance.
5 Acknowledgements
Many thanks to the participants in the AoIR2018 session on ‘Platform Gov-
ernance and Moderation’ for the feedback and critique that helped shape this
article from the outset, and to the special issue editors, Alison Harvey and
Mary Elizabeth Luka, for engaging so thoughtfully and carefully with not
only this article’s arguments but also its structure and style. I am further
indebted to Nicolas Suzor, Corinne Cath, Thomas Poell, Ralph Schroeder,
and the anonymous reviewers of ICS for taking the time to read and provide
excellent comments.
5.1 Funding
I would like to gratefully acknowledge the Social Science and Humanities
Research Council of Canada (SSHRC) and the Canadian Centennial Schol-
arship Fund for supporting my studies. Many thanks as well to Professor
Philip Howard and the European Research Council (grant number 648311)
for providing travel assistance that allowed me to attend AoIR in the first
place.
References
Ananny, M. and Crawford, K. (2018). Seeing without knowing: Limitations
of the transparency ideal and its application to algorithmic accountability.
New Media & Society, 20(3):973–989.
Barocas, S., Hardt, M., and Narayanan, A. (2018). Fairness and Machine
Learning.
Beer, D. (2017). The social power of algorithms. Information, Communica-
tion & Society, 20(1):1–13.
Bell, E. J., Owen, T., Brown, P. D., Hauka, C., and Rashidian, N. (2017).
The Platform Press: How Silicon Valley Reengineered Journalism. Tow
Center for Digital Journalism, Columbia University, New York.
Bivens, R. (2017). The gender binary will not be deprogrammed: Ten years
of coding gender on Facebook. New Media & Society, 19(6):880–898.
boyd, d. and Crawford, K. (2012). Critical Questions for Big Data. Infor-
mation, Communication & Society, 15(5):662–679.
Bucher, T. and Helmond, A. (2018). The Affordances of Social Media Plat-
forms. In Burgess, Jean, Marwick, A., and Poell, T., editors, The SAGE
Handbook of Social Media, pages 254–278. SAGE, London.
Chadwick, A. (2017). The Hybrid Media System: Politics and Power. Oxford
University Press, New York, 2nd edition.
Citron, D. K. and Wittes, B. (2017). The Internet Will Not Break: Denying
Bad Samaritans § 230 Immunity. Fordham Law Review, 86:401.
Dencik, L., Hintz, A., and Cable, J. (2016). Towards data justice? The
ambiguity of anti-surveillance resistance in political activism. Big Data &
Society.
Diakopoulos, N., Friedler, S., Arenas, M., Barocas, S., Hay, M., Howe, B.,
Jagadish, H., Unsworth, K., Sahuguet, A., Venkatasubramanian, S., Wil-
son, C., Yu, C., and Zevenbergen, B. (2016). Principles for Accountable
Algorithms. Technical report, FAT/ML.
Edelman, B. (2017). Uber Can’t Be Fixed — It’s Time for Regulators to
Shut It Down. Harvard Business Review.
Edwards, L. and Veale, M. (2017). Slave to the Algorithm: Why a Right to
an Explanation Is Probably Not the Remedy You Are Looking for. Duke
Law & Technology Review, 16:18.
Fisher, A. (2018). Sex, Beer, and Coding: Inside Facebook’s Wild Early
Days in Palo Alto. Wired.
Flew, T. (2015). Social Media Governance. Social Media + Society,
1(1):2056305115578136.
Flichy, P. (2007). The Internet Imaginaire. MIT Press, Cambridge, MA.
Fuchs, C. (2012). The political economy of privacy on Facebook. Television
& New Media, 13(2):139–159.
Fuchs, D. A. (2007). Business Power in Global Governance. Lynne Rienner,
Boulder, CO.
Fukuyama, F. (2013). What is governance? Governance, 26(3):347–368.
Fung, A. (2013). Infotopia: Unleashing the democratic power of trans-
parency. Politics & Society, 41(2):183–212.
Garton Ash, T., Gorwa, R., and Metaxa, D. (2019). Glasnost! Nine Ways
Facebook Can Make Itself a Better Forum for Free Speech and Democracy.
Reuters Institute for the Study of Journalism, Oxford, UK.
Gasser, U. and Schulz, W. (2015). Governance of Online Intermediaries:
Observations from a Series of National Case Studies. SSRN Scholarly
Paper ID 2566364, Social Science Research Network, Rochester, NY.
Gillespie, T. (2010). The Politics of ‘Platforms’. New Media & Society,
12(3):347–364.
Gillespie, T. (2015). Platforms Intervene. Social Media + Society,
1(1):2056305115580479.
Gillespie, T. (2018a). Custodians of the Internet: Platforms, Content Moder-
ation, and the Hidden Decisions That Shape Social Media. Yale University
Press, New Haven, CT.
Gillespie, T. (2018b). Regulation of and by Platforms. In Burgess, Jean,
Marwick, A., and Poell, T., editors, The SAGE Handbook of Social Media,
pages 254–278. SAGE, London.
Grimmelmann, J. (2008). The Google Dilemma. New York Law School Law
Review, 53:939.
Helberger, N., Pierson, J., and Poell, T. (2018). Governing online platforms:
From contested to cooperative responsibility. The Information Society,
34(1):1–14.
Hicks, M. (2017). Programmed Inequality: How Britain Discarded Women
Technologists and Lost Its Edge in Computing. MIT Press, Cambridge,
MA.
Hoffmann, A. L., Proferes, N., and Zimmer, M. (2018). “Making the world
more open and connected”: Mark Zuckerberg and the discursive construc-
tion of Facebook and its users. New Media & Society, 20(1):199–218.
Hood, C. and Heald, D., editors (2006). Transparency: The Key to Better
Governance? Oxford University Press, Oxford, UK.
Kahler, M., editor (2015). Networked Politics: Agency, Power, and Gover-
nance. Cornell University Press, Ithaca, NY.
Keller, D. (2018). Don’t Force Google to Export Other Countries’ Laws. The
New York Times.
Khan, L. and Vaheesan, S. (2017). Market Power and Inequality: The An-
titrust Counterrevolution and Its Discontents. Harvard Law & Policy Re-
view, 11:235.
Klonick, K. (2017). The new governors: The people, rules, and processes
governing online speech. Harvard Law Review, 131:1598.
Moore, M. and Tambini, D., editors (2018). Digital Dominance: The Power
of Google, Amazon, Facebook, and Apple. Oxford University Press, Oxford,
UK.
Myers West, S. (2018). Censored, suspended, shadowbanned: User interpre-
tations of content moderation on social media platforms. New Media &
Society.
Nakamura, L. (2013). Cybertypes: Race, Ethnicity, and Identity on the In-
ternet. Routledge, 2nd edition.
Napoli, P. and Caplan, R. (2017). Why media companies insist they’re not
media companies, why they’re wrong, and why it matters. First Monday,
22(5).
Nash, V., Bright, J., Margetts, H., and Lehdonvirta, V. (2017). Public Policy
in the Platform Society. Policy & Internet, 9(4):368–373.
Naughton, J. (2000). A Brief History of the Future: The Origins of the
Internet. Phoenix, London, UK.
Naughton, J. (2018). Platform Power and Responsibility in the Attention
Economy. In Moore, M. and Tambini, D., editors, Digital Dominance: The
Power of Google, Amazon, Facebook, and Apple, pages 371–396. Oxford
University Press, Oxford.
Nicas, J. and Rosenberg, M. (2018). A Look Inside the Tactics of Definers,
Facebook’s Attack Dog. The New York Times.
Nieborg, D. B. and Poell, T. (2018). The platformization of cultural pro-
duction: Theorizing the contingent cultural commodity. New Media &
Society.
Noble, S. U. (2018). Algorithms of Oppression: How Search Engines Rein-
force Racism. NYU Press, New York.
Nooren, P., van Gorp, N., van Eijk, N., and Ó Fathaigh, R. (2018). Should
We Regulate Digital Platforms? A New Framework for Evaluating Policy
Options. Policy & Internet, 10(3):264–301.
Obar, J. and Wildman, S. (2015). Social Media Definition and the Gover-
nance Challenge: An Introduction to the Special Issue. Telecommunica-
tions Policy, 39(9):745–750.
Omer, C. (2014). Intermediary Liability for Harmful Speech: Lessons from
Abroad. Harvard Journal of Law & Technology, 28:289–324.
O’Reilly, T. (2007). What Is Web 2.0: Design Patterns and Business Mod-
els for the Next Generation of Software. Communications & Strategies,
65(1):17–37.
Plantin, J.-C., Lagoze, C., Edwards, P. N., and Sandvig, C. (2018). Infras-
tructure studies meet platform studies in the age of Google and Facebook.
New Media & Society, 20(1):293–310.
Poell, T., Van Dijck, J., and Nieborg, D. (2018). Platform Power & Public
Value. In Selected Papers of Internet Research, Montreal, Canada.
Puppis, M. (2010). Media governance: A new concept for the analysis of me-
dia policy and regulation. Communication, Culture & Critique, 3(2):134–
149.
Reich, R. (2018). Break up Facebook (and while we’re at it, Google, Apple
and Amazon). The Guardian.
Ruggie, J. G. (2013). Just Business: Multinational Corporations and Human
Rights. W. W. Norton & Company, New York.
Shaban, H. (2018). Google for the first time outspent every other company
to influence Washington in 2017. Washington Post.
Suzor, N. (2019). Lawless: The Secret Rules That Govern Our Digital Lives.
Cambridge University Press, Cambridge, UK, draft edition.
Suzor, N., Van Geelen, T., and Myers West, S. (2018). Evaluating the legit-
imacy of platform governance: A review of research and a shared research
agenda. International Communication Gazette.
Syal, R. (2017). Make Facebook liable for content, says report on UK election
intimidation. The Guardian.
Taplin, J. (2017). Why is Google spending record sums on lobbying Wash-
ington? The Guardian.
Taylor, L. (2017). What is data justice? The case for connecting digital
rights and freedoms globally. Big Data & Society, 4(2):2053951717736335.
Tucker, J. A., Theocharis, Y., Roberts, M. E., and Barberá, P. (2017). From
Liberation to Turmoil: Social Media And Democracy. Journal of Democ-
racy, 28(4):46–59.
Tufekci, Z. (2015). Algorithmic Harms beyond Facebook and Google: Emer-
gent Challenges of Computational Agency. Colorado Technology Law Jour-
nal, 13(2):203–218.
Tufekci, Z. (2017). Twitter and Tear Gas: The Power and Fragility of Net-
worked Protest. Yale University Press, New Haven.
Urban, G., editor (2014). Corporations and Citizenship. University of Penn-
sylvania Press, Philadelphia, PA.
Ustun, B., Spangher, A., and Liu, Y. (2019). Actionable Recourse in Lin-
ear Classification. In ACM Conference on Fairness, Accountability, and
Transparency (FAT*’19), Atlanta, GA.
Vaidhyanathan, S. (2018). Antisocial Media: How Facebook Disconnects Us
and Undermines Democracy. Oxford University Press, Oxford, UK.
Van Dijck, J. and Nieborg, D. (2009). Wikinomics and its discontents: A
critical analysis of Web 2.0 business manifestos. New Media & Society,
11(5):855–874.
Van Dijck, J., Poell, T., and de Waal, M. (2018). The Platform Society:
Public Values in a Connective World. Oxford University Press, New York,
NY.
Van Doorn, N. (2017). Platform labor: On the gendered and racialized
exploitation of low-income service work in the ‘on-demand’ economy. In-
formation, Communication & Society, 20(6):898–914.
Van Doorn, N. (2018). A New Institution on the Block: Airbnb as Com-
munity Platform and Policy Entrepreneur. In Selected Papers of Internet
Research, Montreal, Canada.
Van Eeten, M. J. and Mueller, M. (2013). Where is the governance in Internet
governance? New Media & Society, 15(5):720–736.
Wallach, H. (2014). Big Data, Machine Learning, and the social sciences:
Fairness, Accountability, and Transparency. In Neural Information Pro-
cessing Systems Workshop on Fairness, Accountability, and Transparency
in Machine Learning, Montreal, Canada.
Wu, T. (2018). The Curse of Bigness: Antitrust in the New Gilded Age.
Columbia Global Reports, New York, NY.
Zittrain, J. (2008). The Future of the Internet and How to Stop It. Yale
University Press, New Haven, CT.