Digital Legitimacy Scorecard

This is a graphical version of the legitimacy toolkit described in Chapter 8 of Good Data.

The scorecard is intended to help evaluate the legitimacy of tech companies, and to think through the legitimacy trade-offs of different policy options.

  1. A technology company’s authority to provide services ultimately derives from the state via the corporate charter, and is conditional on its promotion of public good [1].
  2. Questions of digital legitimacy arise when a technology company’s power is such that it bears on many millions of individuals.
  3. As legitimacy is a scalar concept, different aspects of a technology company’s operations can be thought of as producing “debits” and “credits”.  Digital legitimacy debits may be offset by digital legitimacy credits, and vice versa.
  4. Digital legitimacy debits and credits can be assessed along the following dimensions.
    • Input legitimacy dimensions:
      • The transparency of the company’s business model
      • The company’s corporate structure and governance arrangements
    • Output legitimacy dimensions:
      • The distributional consequences of the company’s activities
      • The extent to which the company’s services empower its users
      • The controls the company applies to prevent its services being abused
  5. A digital legitimacy “scorecard” can therefore be created, using negative and positive valences to denote debits (-) and credits (+). With equal weighting given to each dimension, a double valence indicates a strong debit (–) or a strong credit (++), and a tilde (~) indicates neither a debit nor a credit. A company’s overall digital legitimacy is the sum of its debits and credits.
| Legitimacy Type | Dimension | Debits | Credits |
| --- | --- | --- | --- |
| Input Legitimacy | Transparency of Business Model | – / - / ~ | ~ / + / ++ |
| Input Legitimacy | Governance | – / - / ~ | ~ / + / ++ |
| Output Legitimacy | Distributional Consequences | – / - / ~ | ~ / + / ++ |
| Output Legitimacy | User Empowerment | – / - / ~ | ~ / + / ++ |
| Output Legitimacy | Controls on Abuses | – / - / ~ | ~ / + / ++ |
| **Overall Score** | | `<Sum>` | `<Sum>` |

Digital Legitimacy Scorecard – Good Data, Chapter 8
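
The scoring rule above can be sketched in a few lines of code. This is a minimal illustration, not part of the toolkit itself: the integer weights (strong debit = −2 through strong credit = +2) are an assumption, since the chapter specifies only that each dimension is weighted equally and that the overall score is the sum of debits and credits. Plain `--` stands in for the en-dash strong-debit mark.

```python
# Assumed numeric weights for the five valence marks (equal weighting
# per dimension, as the scorecard requires).
VALENCE = {"--": -2, "-": -1, "~": 0, "+": 1, "++": 2}


def overall_score(marks):
    """Sum the valence marks assigned to each dimension.

    `marks` maps dimension names to one of the five valence symbols.
    """
    return sum(VALENCE[m] for m in marks.values())


def verdict(score):
    """Map a summed score back onto the valence scale."""
    if score <= -2:
        return "--"
    if score >= 2:
        return "++"
    return {-1: "-", 0: "~", 1: "+"}[score]
```

For example, marks of ~, --, ++, + and - across the five dimensions sum to 0, so the overall verdict is ~: the company is neither a net debit nor a net credit.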
  6. Digital legitimacy debits and credits can be assigned to a technology company using the following criteria:
| Dimension | – / - | ~ | + / ++ |
| --- | --- | --- | --- |
| Transparency of Business Model | Opaque. The company deliberately conceals how it makes money. Data privacy is respected only to the minimum extent that the law requires (if at all). | Clear on inspection. The company does not conceal how it makes money, but nor is this immediately evident to its users. Data privacy is available, but must be activated by the user. | Clear. How the company makes money is self-evident. Data privacy is assured by default. |
| Governance | Governance arrangements concentrate control with executive management. Users’ access to services is contingent on the will of executive management. | Current norms of governance apply. Non-executive board directors and/or an independent ombudsman can hold executive management to account for harms to users. | Users enjoy services as an enforceable right and have the capacity to influence corporate governance. |
| Distributional Consequences | Regressive. The company’s activities result in a net transfer of resources from the less advantaged to the more advantaged. | Neutral. The company’s activities have little or no impact on the distribution of resources. | Progressive. The company’s activities result in a net transfer of resources from the more advantaged to the less advantaged. |
| User Empowerment | The company’s services are disabling to its users and/or do not serve their stated purpose. | The company’s services are enabling to its users and serve their stated purpose. | The company’s services significantly amplify the capacities of its users. |
| Controls on Abuses | Controls are minimal or non-existent. It is straightforward for any actor to use the company’s services to cause significant harm without risk to itself. Development of new services proceeds on the basis of “permissionless innovation”. | Controls are robust. A malevolent actor requires great skill and commitment to overcome them. There are checks on the roll-out of new services. | Controls are robust, and continually evolve in anticipation of future threats of abuse. Development of new services follows the precautionary principle. |

Digital Legitimacy Scoring Criteria – Good Data, Chapter 8
  7. A worked example of the Digital Legitimacy Scorecard for Facebook is as follows:
| Dimension | Score | Rationale |
| --- | --- | --- |
| Transparency of Business Model | ~ | Facebook’s users have choices over how their profile and behavioural data is used in advertising, although it cannot be assumed that all users have the capacity to make such choices freely. |
| Governance | | Facebook’s dual-class stock structure creates conditions of domination, as user benefits wholly depend on Mark Zuckerberg’s goodwill. |
| Distributional Consequences | ++ | The funding of universal free services through advertising most benefits the least advantaged – that is, poorer Facebook users in the Global South. |
| User Empowerment | + | Power is widely distributed and can be mobilized by any Facebook user, with consequences as dramatic as the overthrow of repressive governments. This is partly offset by the addictive qualities of Facebook’s applications. |
| Controls on Abuses | | There are inadequate controls on features – particularly Facebook Ads – which may therefore be instrumentalized for illegitimate political ends and/or to enable and inflict social cruelty. |
| **Overall Digital Legitimacy** | ~ | Facebook is not illegitimate, but has plenty of scope to increase its legitimacy, particularly through governance changes and stronger controls on abuses. |

Digital Legitimacy Scorecard: Facebook – Good Data, Chapter 8

[1] David Ciepley has established that historically “Corporations were chartered only if they promised a clear public benefit”, and argues for a revival of the view that they “bear heightened responsibilities toward the public”. The claim here is simply that technology companies, as chartered corporations, have these responsibilities. See Ciepley, D. (2013) “Beyond Public and Private: Toward a Political Theory of the Corporation”, American Political Science Review, 107(1): 139–158.
