This post will focus mainly on an article by Brooklyn Law School professor K. Sabeel Rahman that examines the power of digital platforms in an historical context. Though I try to summarize key points and include excerpts from the article, entitled The New Octopus, I’d strongly recommend reading it in full to anyone interested in the issues addressed in this series of posts.
In the article’s introductory section, Rahman points out that issues of corporate power in the digital age bear both similarities to, and differences from, those faced during the Progressive Era a century earlier. Citing Supreme Court Justice Brandeis’ famous reference to “the curse of bigness,” Rahman explains that:
As in the Progressive Era, technological revolutions have radically transformed our social, economic, and political life. Technology platforms, big data, AI—these are the modern infrastructures for today’s economy. And yet the question of what to do about technology is fraught, for these technological systems paradoxically evoke both bigness and diffusion: firms like Amazon and Alphabet and Apple are dominant, yet the internet and big data and AI are technologies that are by their very nature diffuse.
The problem, however, is not bigness per se. Even for Brandeisians, the central concern was power: the ability to arbitrarily influence the decisions and opportunities available to others.
New forms of concentrated power call for new remedies
The challenge then, says Rahman, is to develop strategies that can effectively counter excessive concentrations of power in the digital age.
The problem of scale, then, has always been a problem of power and contestability. In both our political and our economic life, arbitrary power is a threat to liberty. The remedy is the institutionalization of checks and balances. But where political checks and balances take a common set of forms—elections, the separation of powers—checks and balances for private corporate power have proven trickier to implement.
These various mechanisms—regulatory oversight, antitrust laws, corporate governance, and the countervailing power of organized labor— together helped create a relatively tame, and economically dynamic, twentieth-century economy. But today, as technology creates new kinds of power and new kinds of scale, new variations on these strategies may be needed.
Noting that “technological power today operates in distinctive ways that make it both more dangerous and potentially more difficult to contest,” Rahman cites three types of infrastructural power wielded by digital platforms: transmission, gatekeeping and scoring. Noting that the impacts of these forms of power “grow as more and more goods and services are built atop a particular platform,” he explains that its exercise is “more subtle than explicit control…enabl[ing] a firm to exercise tremendous influence over what might otherwise look like a decentralized and diffused system.”
Platforms wield transmission, gatekeeping & scoring power
As an example of transmission power, which Rahman describes as “the ability of a firm to control the flow of data or goods,” he cites Amazon. “As a shipping and logistics infrastructure,” he explains, Amazon “can be seen as directly analogous to the railroads of the nineteenth century.” This transmission power “places Amazon in a unique position to target prices and influence search results in ways that maximize its returns,…favor its preferred producers” and even allows it to “make or break businesses and whole sectors, just like the railroads of yesteryear.”
With regard to gatekeeping power, Rahman explains that the exercise of this power does not require a firm to control the entire infrastructure of transmission, only to “control the gateway to an otherwise decentralized and diffuse landscape.” Examples of this kind of power, he says, include the Facebook News Feed and Google Search.
[G]atekeeping power subordinates two kinds of users on either end of the “gate.” Content producers fear hidden or arbitrary changes to the algorithms for Google Search or the Facebook News Feed, whose mechanics can make the difference between the survival and destruction of media content producers. Meanwhile, end users unwittingly face an informational environment that is increasingly the product of these algorithms—which are optimized not to provide accuracy but to maximize user attention spent on the site. The result is a built-in incentive for platforms like Facebook or YouTube to feed users more content that confirms preexisting biases and provide more sensational versions of those biases, exacerbating the fragmentation of the public sphere into different “filter bubbles.”
Rahman goes on to explain that platforms’ gatekeeping decisions can have huge social and political consequences.
While the United States is only now grappling with concerns about online speech and the problems of polarization, radicalization, and misinformation, studies confirm that subtle changes—how Google ranks search results for candidates prior to an election, for instance, or the ways in which Facebook suggests to some users rather than others that they vote on Election Day—can produce significant changes in voting behavior, large enough to swing many elections.
The third type of power considered by Rahman is the scoring power “exercised by ratings systems, indices and ranking databases.” Citing socially destructive impacts of gamed credit ratings as an example, he notes that “scoring power is not a new phenomenon.” But he adds that “big data and the proliferation of AI enable…much wider use of similarly flawed scoring systems [and] as these systems become more widespread, their power—and risk—magnifies.”
In his book The Black Box Society: The Secret Algorithms That Control Money and Information, University of Maryland law professor Frank Pasquale examines the potentially destructive asymmetry of power reflected in the expansion of algorithmic scoring in our increasingly monitored world. As he explains in the book’s introduction:
The success of individuals, businesses, and their products depends heavily on the synthesis of data and perceptions into reputation. In ever more settings, reputation is determined by secret algorithms processing inaccessible data…Although internet giants say their algorithms are scientific and neutral tools, it is very difficult to verify those claims. And while they have become critical economic infrastructure, trade secrecy law permits managers to hide their methodologies, and business practices, deflecting scrutiny.
Antitrust needs an updated framework to address platform power
The last section of Rahman’s article considers a range of possible approaches to constraining these infrastructural powers of platforms, including the use of antitrust regulations. As a potential example of the latter, he cites prohibitions on Amazon “being both a platform and a producer of its own goods and content sold on its own platform, as a way of preventing the incentive to self-deal.”
Among those advocating for stronger antitrust enforcement in the digital economy is the Open Markets Institute, headed by Barry Lynn, author of Cornered: The New Monopoly Capitalism and the Economics of Destruction. Shortly after the Cambridge Analytica revelations, the Institute proposed that Facebook be required to spin off its ad network as well as Instagram and WhatsApp, the two competing social networks it acquired between 2012 and 2014. It also recommended that Facebook be prohibited from making any further acquisitions for at least five years. And in a Yale Law Journal paper entitled Amazon’s Antitrust Paradox, Lina Khan, the Institute’s Director of Legal Policy, called for a new approach to antitrust in the digital age, arguing that “the current framework in antitrust—specifically its pegging competition to ‘consumer welfare,’ defined as short-term price effects—is unequipped to capture the architecture of market power in the modern economy.” The final sections of Khan’s paper “consider two potential regimes for addressing Amazon’s power: restoring traditional antitrust and competition policy principles or applying common carrier obligations and duties.”
In a 2018 paper published in Telecommunications Policy, Natascha Just, Associate Professor in the Department of Media and Information at Michigan State University, also calls for an updated approach to antitrust, citing the challenges facing efforts to control “market dominance and anticompetitive behavior in times of platformization.” These challenges, Just suggests, “are forcing a paradigm change in the area of competition policy.”
[T]heoretical advances and new market conditions require (1) a shift in attention from traditional price-oriented analyses to systematic inclusions of non-price competition factors like quality, innovation, and privacy, (2) due consideration of attention markets and the acknowledgement of markets in the absence of price, as well as (3) alertness to the role of user data and big data that has become a new asset class in digital economies.
Creating a civic infrastructure of checks & balances for the digital economy
In addition to antitrust, Rahman considers a range of other potential approaches to restraining the power of dominant platforms.
- The creation of independent oversight and ombudsman bodies within Facebook, Google and other tech platforms. To be effective and legitimate, Rahman says, these “would need to have significant autonomy and independence” and “engage a wider range of disciplines and stakeholders in their operations.”
- The development of “more explicit professional and industry standards of conduct,” perhaps facilitated by third-party scoring systems (e.g., similar to the LEED program that certifies green building practices).
- The creation of new interdisciplinary governmental institutions for oversight of “algorithms, the use of big data, search engines, and the like, subjecting them to risk assessments, audits, and some form of public participation.” The risk here, as with any government regulation, Rahman notes, is that “industry is likely to be several steps ahead of government, especially if it is incentivized to seek returns by bypassing regulatory constraints.” A related issue is that of regulatory capture, especially in a governmental system lacking strong safeguards against such capture.
- Privacy-related restrictions and/or a “big data tax” as “structural inhibitors of some kinds of big data and algorithmic uses.”
To close his article, Rahman ties these digital age issues of platform power back to the challenges and lessons of the Progressive Era:
A key theme for Progressive Era critics of corporate power was the confrontation between the democratic capacities of the public and the powers of private firms. Today, as technology creates new forms of power, we must also create new forms of countervailing civic power. We must build a new civic infrastructure that imposes new kinds of checks and balances…
Moving fast and breaking things is inevitable in moments of change. The issue is which things we are willing to break—and how broken we are willing to let them become. Moving fast may not be worth it if it means breaking the things upon which democracy depends.
In the following post I’ll be discussing Marjorie Kelly’s perspective on corporate power and governance. In my view, her diagnosis of the problem and suggested remedies overlap to a significant degree with the analysis presented in Rahman’s article. And both inform my own suggestions for strategies to constrain harms from the operation of digital platforms while encouraging their benefits. These suggestions are discussed in later posts in this series.
Below is an outline, with links, to all the posts in this series. Unless otherwise noted, bolding in quotations is mine, added for emphasis.
- Digital Platforms & Democratic Governance: Standing at an Historic Crossroads
- The digital anthropocene: a pivotal & high-risk phase of human history
- Empathy + technology: a powerful recipe for shared prosperity & peace
- More (and more effective) democracy as part of the solution
- The tech sector can help lead the next phase in democracy’s evolution
- The Facebook F-Up as a Wake-Up Call
- A growing awareness of problems
- Where to look for solutions?
- Serving Users (to Advertisers to Benefit Shareholders)
- An IPO + mobile ads: 2012 as a turning point for Facebook
- Too busy driving growth to focus on privacy?
- Serving users or serving users to advertisers?
- Understanding & addressing social harms
- Data as Power: Approaches to Righting the Balance
- Our data is tracked & locked in a “black box” we don’t control or understand
- The EU tightens privacy protections amid mixed signals in the U.S.
- Platforms as “information fiduciaries”
- Reallocating power & benefits when users share their data
- Shifting from an “Attention Economy” to a more efficient “Intention Economy”
- Who owns and controls the data used to develop AI?
- Data as labor that should be financially compensated
- Data as an infrastructural public good
- A “data tax” that generates a “data dividend” we all share
- Data portability as means to enhance competition & consumer choice
- The Power of Dominant Platforms: It’s Not Just About “Bigness”
- New forms of concentrated power call for new remedies
- Platforms wield transmission, gatekeeping & scoring power
- Antitrust needs an updated framework to address platform power
- Creating a civic infrastructure of checks & balances for the digital economy
- Democracy & Corporate Governance: Challenging the Divine Right of Capital
- A “generative” or “extractive” business model?
- Dethroning kings & capital
- Moving beyond capitalism’s aristocratic form
- Embracing economic democracy as a next-step Enlightenment
- Platform Cooperativism: Acknowledging the Rights of “Produsers”
- Reclaiming the Internet’s sharing & democratizing potential
- Scaling a platform co-op: easier said than done
- The #BuyTwitter campaign as a call for change
- Encouraging the wisdom of crowds or the fears of mobs?
- Interactions Between Political & Platform Systems
- Feedback loops reinforce strengths & weaknesses, benefits & harms
- Facebook’s role in the election as an example
- If we don’t fix government, can government help fix Facebook?
- A Purpose-Built Platform to Strengthen Democracy
- Is Zuck’s lofty vision compatible with Facebook’s business model?
- Designed to bolster democracy, not shareholder returns
- Democratic Oversight of Platform Management by “Produsers”
- Facebook, community and democracy
- Is Facebook a community or a dictatorship?
- Giving users a vote in Facebook’s governance
- Technology can help users participate in FB governance
- Evolving from corporate dictatorship toward digital democracy