Your Undivided Attention Podcast

Technology companies are locked in an arms race to seize your attention, and that race is tearing apart our shared social fabric. In this inaugural podcast from the Center for Humane Technology, hosts Tristan Harris and Aza Raskin expose the hidden designs that have the power to hijack our attention, manipulate our choices, and destabilize our real-world communities. They'll explore what it means to become sophisticated about human nature by interviewing hypnotists, magicians, and experts on cult dynamics, election hacking, and the powers of persuasion. How can we escape this unrelenting race to the bottom of the brain stem? Learn more with our new podcast, Your Undivided Attention.

Episode 4: Down the Rabbit Hole by Design

When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that keeps us glued to the screen. Because of its advertising-based business model, YouTube's top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city — it's to keep us staring at the screen for as long as possible, regardless of the content. This episode's guest, AI expert Guillaume Chaslot, helped write YouTube's recommendation engine and explains how those priorities spin up outrage, conspiracy theories, and extremism. After leaving YouTube, Guillaume made it his mission to shed light on these hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes YouTube's recommendations of controversial content. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control. [Download Transcript]

We’d love to hear your thoughts. Join a conversation on this episode July 11 or July 15.

Episode 3: With Great Power Comes… No Responsibility?

Aza sits down with Yaël Eisenstat, a former CIA officer and former White House advisor. When Yaël noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed that as danger rose at home, her public-sector experience could help fill a gap in Silicon Valley's talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn't go as planned. Yaël shares the lessons she learned and her perspective on government's role in regulating tech, while Aza and Tristan raise questions about our relationships with these companies and the balance of power. [Download Transcript]

We’d love to hear your thoughts. Join a conversation on this episode here.

Episode 2: Should’ve Stayed in Vegas

In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are often really after, and it's not money. It's the same thing we're looking for when we mindlessly open Facebook or Twitter. How can we design products that don't exploit these universal urges and vulnerabilities but instead use them to help us? Tristan, Aza, and Natasha explore ways we could shift our thinking about making and using technology. [Download Transcript]

Episode 1: What Happened in Vegas

Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound in an endless loop of play. She never imagined that the addictive designs she first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, she offers a prescient warning to users and designers alike: how far can the attention economy go toward stealing another moment of your time? Farther than you might imagine. [Download Transcript]

Podcast Trailer

Technology is a force that’s pushing our culture in a certain direction. We can predict what that direction is, and we can steer it in a more humane direction.

Subscribe Now for Free