Mr. Harris Zooms to Washington

May 10, 2021

Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, "Algorithms and Amplification: How Social Media Platforms' Design Choices Shape Our Discourse and Our Minds." He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School's Shorenstein Center on Media, Politics and Public Policy, and the heads of policy from Facebook, YouTube, and Twitter. The senators' animated questioning demonstrated a deeper understanding of how these companies' fundamental business models and design choices fuel hate and misinformation, and many of the lawmakers expressed a willingness to take regulatory action. But there's still room for a more focused conversation. "It's not about whether they filter out bad content," says Tristan, "but really whether the entire business model of capturing human attention is a good way to organize society." In this episode, a follow-up to last year's "Mr. Harris Goes to Washington," Tristan and Aza Raskin debrief about what was different this time and what work lies ahead to pave the way for effective policy.

Major Takeaways

  • In this hearing, members of the Senate Judiciary Subcommittee on Privacy, Technology, and the Law demonstrated a more sophisticated understanding of social media's impact on our society. Several senators focused on the role of the tech platforms' business models and design in shaping the companies' decisions, and repeatedly redirected questioning back to that frame.
  • Lawmakers are politically ready, and in some cases hungry, for regulation and action, but they may need more guidance on policy specifics. Changes to Section 230 of the Communications Decency Act may be necessary, but on their own they will not be sufficient to repair the harms caused by these tools.
  • In future hearings, the line of questioning should stay focused on the platforms' fundamental design and business model. Discussion of the steps the tech companies are taking to mitigate the spread of misinformation and toxic content is a diversion when those harms are generated by a broken business model.
  • The misinformation challenges in the U.S. may seem daunting, but the same tools are operating with even less oversight around the world. The same problems are unfolding at greater scale, and with more serious impacts, in countries where language barriers, political and economic instability, and limited infrastructure make a safe social media environment even harder to ensure.
  • Will the future be run by digital open societies or digital closed societies? Right now, in open societies such as the U.S., our digital infrastructure is in many ways undermining our capacity to solve difficult problems. How do we entirely re-envision that infrastructure to create stronger open societies? What does "Open Society 2.0" look like in a post-digital age?
