Regulating Big Tech is now a matter of intense public debate. We ask how well Big Tech companies fulfill their role as gatekeepers of the public square. We ponder whether their dominant market positions merit an antitrust response. We assess their culpability and complicity in spreading online misinformation and hate. However, in the many normative debates over how Big Tech should use its power, the source of that power remains largely unexamined.
Big Tech, like Big Tobacco before it, is an industry founded on addiction. Although typically “free” to use, the world’s largest digital platforms exploit users’ dopamine pathways to consume as much of their time and conscious attention as possible. Many of the problems currently gripping the public consciousness would be fundamentally less important if these platforms were not powerfully addictive. They would be private problems—issues to be resolved between the company and its users. A key reason they are significant, public problems that society can no longer ignore is that Big Tech has intentionally addicted billions of people.
The business model of many large technology companies, or significant subsidiaries thereof, is built on maximizing the frequency and duration of use, what the industry refers to as “time on device.” In this, they have been remarkably successful. The average American spends just over 40% of their waking hours online, a figure that approaches 60% for American teens. A growing body of research problematizes the “choice” to spend this much time online. Today’s dominant digital platforms are intentionally designed to produce structural and functional changes in various regions of the brain and to trigger the same brain reward pathways as nicotine and other addictive drugs. Thus far, such platforms have almost entirely avoided liability for harms associated with their use.
This Article surveys existing tools that may help to combat Big Tech’s addictive design practices. It finds that existing laws and legal duties fail to protect users from exploitation. Accordingly, this Article proposes designating the largest manipulative technology platforms as “systemically important platforms.” Platforms so designated would be legally required to open themselves to middleware, a type of software that can modify how data is presented to users. Such middleware would offer a control panel of tools enabling users to curate their digital experience. In addition, these platforms would be subject to increased tax burdens and enhanced regulatory scrutiny, with the goal of curbing manipulative design practices and providing billions of users with greater agency.
To read this Article, please click here: Systemically Important Platforms.