As I write this, Facebook is facing what is likely the most intense crisis of confidence in its roughly 14 years of existence. While obviously important for the company, this development also raises broad and increasingly urgent questions about the role of digital platforms in modern life, including their interactions with the principle and practice of democratic self-governance and the individual freedoms it relies upon and supports.
Facebook’s confidence crisis was triggered by the discovery in 2017 that the world’s largest social network had been used to distribute deeply dishonest and highly targeted negative advertising and other content during the 2016 campaign. Making matters worse was evidence that “fake news” had apparently been shared more widely than actual news, and that some of the “fake news” and targeted negative ads came from a Moscow-based organization with ties to the Kremlin, as explained in a federal indictment issued in February 2018. This, in turn, contributed to events that led to a Special Counsel investigation into these Russian efforts to interfere with U.S. elections, including whether they involved cooperation from members of the Trump campaign. As one might expect, these developments have intensified the polarized acrimony and governmental dysfunction already deeply entrenched in the American political system.
The crisis in public confidence grew even more severe when a whistleblower revealed that Cambridge Analytica (CA), a UK-based data analytics and political consulting firm that had worked for the Trump campaign, had gained access, without users’ authorization, to personal data on 50 million Facebook users (a figure Facebook later updated to 87 million). In the days that followed, Christopher Wylie—the young pink-haired whistleblower who had been deeply involved in CA’s efforts (funded and overseen by Steve Bannon and the Mercer family) to obtain and politically weaponize this huge cache of Facebook data—could be seen on virtually every news show, and was called to testify before the UK Parliament. Making matters even worse for Facebook (and especially CA) was the release by the UK’s Channel 4 broadcaster of clandestinely recorded videos in which CA executives bragged about their role in the Trump campaign and other campaigns around the world, including how the company augmented its big-data targeting with more traditional campaign dirty tricks such as videotaped entrapments of opposing candidates with offers of financial or sexual favors.
These developments prompted an unprecedented wave of media and governmental scrutiny of Facebook’s business practices, including two days of congressional testimony by company founder and CEO Mark Zuckerberg. The CA revelations also triggered a Federal Trade Commission (FTC) investigation of whether Facebook had violated a 2011 FTC consent decree requiring it to better protect customer data privacy. The consent decree provided for penalties of up to $40,000 per violation, a potentially massive amount given that the unauthorized release involved data for 87 million Facebook users.
A growing awareness of problems
The sense of public outrage triggered by the Cambridge Analytica revelations was an intensified expression of long-simmering concerns about how Facebook, other online platforms, the government, and our increasingly digitized society as a whole are managing a range of issues, including:
- the need for more effective privacy protections against unwanted surveillance by governments and private companies, and the use of personal data to misinform and/or manipulate online users as they engage economically as consumers and in civil society as citizens and voters;
- the enormous and still growing market power of online platforms, and existing and potential abuses of that power in ways that reduce healthy competition and choice in our economy and weaken the overall health of society;
- a hostile foreign government’s use of digital platforms like Facebook to further weaken our already financially corrupted and dysfunctional democracy, and its ability to undertake cyberattacks that threaten our power grid and other key infrastructure;
- the use of our ever-present digital devices to hack our brains: maximizing our “engagement” and mining it for data and algorithms that ever more effectively trigger the addictive user behaviors driving many of today’s digital-economy business models, while generating social harms we need to better understand and address;
- the use of data collected by digital platforms and other entities to develop Artificial Intelligence (AI) systems, and the potential of these systems to cause consequences that are difficult to predict and control, some of them harmful on a massive scale.
The Cambridge Analytica revelations also added to the body of evidence suggesting that the health of democracy may be in decline, both in the U.S. and around the world, and that we face a growing risk of socially destructive dynamics along the following lines:
- weakened democracies will be unable to design and execute policies that effectively constrain, and may even adopt policies that aggravate, harms arising from what is likely to be an accelerating evolution of digital technologies;
- these harms will, as the Cambridge Analytica and Russian-interference revelations illustrate, feed a further degradation of already fragile systems of democratic self-governance;
- the result will be a destructive two-way feedback loop that aggravates rather than alleviates the many challenges we face in this country and the world.
Heightened awareness of this risk has, in turn, fed:
- public fears that the optimistic vision of digital technology enhancing democracy and expanding freedom and opportunity for all may be replaced by a harsh reality characterized by further concentration and abuse of economic and political power; and
- a growing consensus that the evolution of digital platforms has reached a point where “move fast and break things” to achieve “domination,” and “get right up to the creepy line and not cross it” are no longer suitable steering mechanisms for the giant digital platforms that have carved out a central, powerful and growing role in modern society.
Where to look for solutions?
What is needed, but has yet to emerge, is a consensus about what guiding principles and governance systems should be applied to an industry that is extremely complex, dynamic, fast-growing, socially valuable, economically and politically powerful, and deeply entrenched in virtually every sector of modern life; and where all of these characteristics are likely to intensify in the future.
Their recent actions suggest that Congress and the FTC are, at least temporarily, taking more seriously their responsibilities to help address these challenges. But the former’s overall dysfunction and lack of technical expertise (as evidenced in recent hearings), and the latter’s failure to adequately enforce its consent decree until a whistleblower spoke out, raise serious questions about the extent to which they can be relied on to develop, update and enforce policies that constrain the socially harmful tendencies of a tech sector that is extremely dynamic and increasingly central to the function and future evolution (or devolution) of our social, economic and political systems.
At the other extreme from faith in government regulation as the preferred remedial strategy is the combination of techno-utopian vision and entrepreneurial hubris driving the self-regulation approach favored by many tech industry leaders. In my view, Zuckerberg manifests one version of this perspective, along with the blind spots that seem to accompany it. With two billion users and a CEO who is also the controlling shareholder, it’s not too big a stretch to argue that Facebook operates as the world’s largest dictatorship. Yet Zuckerberg seems to view the company as a potential savior of global democracy, public safety and civil society. As he put it in a long February 16, 2017 Facebook post a few months after the 2016 presidential election:
“the most important thing we at Facebook can do is develop the social infrastructure to give people the power to build a global community that works for all of us…for supporting us, for keeping us safe, for informing us, for civic engagement, and for inclusion of all.”
Though revelations about Facebook’s privacy-related missteps over the years raise legitimate questions about its commitment to prioritize the lofty intentions reflected in this statement, I’m going to assume in this series of posts that Zuckerberg is sincere in expressing them, and that, regardless of the internal and external pressures that may pull him and the company in other directions, he and his top management team would like their legacy to be Facebook’s success in achieving these admirable and important social goals.
With this in mind, later posts in this series will outline strategies that Facebook and other digital platforms can support (whether voluntarily or by government mandate) to help build the more inclusive, informed, safe and civically engaged global community envisioned in Zuckerberg’s early 2017 online essay. As these later posts will explain, the core of these strategies is:
- the application of democratic principles and functionality in the internal management of dominant digital platforms, coupled with a rebalancing of rights, power and benefits related to data, to better protect the welfare of platform users and reflect the value they contribute to these platforms through their usage;
- leveraging the financial and technical resources of digital platforms and the broader tech sector to develop one or more online platforms specifically designed to improve the healthy function of political democracy.
To help provide context for these later posts, the next post in this series will provide a brief outline of Facebook’s history and the evolution of its business model.
********
Below is an outline of all the posts in this series, with links. Unless otherwise noted, bolding in quotations is mine, added for emphasis.
- Digital Platforms & Democratic Governance: Standing at an Historic Crossroads
- The digital anthropocene: a pivotal & high-risk phase of human history
- Empathy + technology: a powerful recipe for shared prosperity & peace
- More (and more effective) democracy as part of the solution
- The tech sector can help lead the next phase in democracy’s evolution
- The Facebook F-Up as a Wake-Up Call
- A growing awareness of problems
- Where to look for solutions?
- Serving Users (to Advertisers to Benefit Shareholders)
- An IPO + mobile ads: 2012 as a turning point for Facebook
- Too busy driving growth to focus on privacy?
- Serving users or serving users to advertisers?
- Understanding & addressing social harms
- Data as Power: Approaches to Righting the Balance
- Our data is tracked & locked in a “black box” we don’t control or understand
- The EU tightens privacy protections amid mixed signals in the U.S.
- Platforms as “information fiduciaries”
- Reallocating power & benefits when users share their data
- Shifting from an “Attention Economy” to a more efficient “Intention Economy”
- Who owns and controls the data used to develop AI?
- Data as labor that should be financially compensated
- Data as an infrastructural public good
- A “data tax” that generates a “data dividend” we all share
- Data portability as means to enhance competition & consumer choice
- The Power of Dominant Platforms: It’s Not Just About “Bigness”
- New forms of concentrated power call for new remedies
- Platforms wield transmission, gatekeeping & scoring power
- Antitrust needs an updated framework to address platform power
- Creating a civic infrastructure of checks & balances for the digital economy
- Democracy & Corporate Governance: Challenging the Divine Right of Capital
- A “generative” or “extractive” business model?
- Dethroning kings & capital
- Moving beyond capitalism’s aristocratic form
- Embracing economic democracy as a next-step Enlightenment
- Platform Cooperativism: Acknowledging the Rights of “Produsers”
- Reclaiming the Internet’s sharing & democratizing potential
- Scaling a platform co-op: easier said than done
- The #BuyTwitter campaign as a call for change
- Encouraging the wisdom of crowds or the fears of mobs?
- Interactions Between Political & Platform Systems
- Feedback loops reinforce strengths & weaknesses, benefits & harms
- Facebook’s role in the election as an example
- If we don’t fix government, can government help fix Facebook?
- A Purpose-Built Platform to Strengthen Democracy
- Is Zuck’s lofty vision compatible with Facebook’s business model?
- Designed to bolster democracy, not shareholder returns
- Democratic Oversight of Platform Management by “Produsers”
- Facebook, community and democracy
- Is Facebook a community or a dictatorship?
- Giving users a vote in Facebook’s governance
- Technology can help users participate in FB governance
- Evolving from corporate dictatorship toward digital democracy