Principles of Humane Technology
ALPHA VERSION – MAY 2020
Important Note: This document was written prior to the COVID-19 pandemic. While society is changing rapidly, the need for humane product principles has never been clearer. This crisis is an unprecedented moment for technologists to critically examine how their work is truly serving humanity’s best interests.
Each day, technology plays a bigger role in the functioning of our institutions and social fabric, including shared truth, well-being, democracy, and our ability to tackle complex global challenges like COVID-19. In many contexts, technology companies wield even more power than governments.
The current worldwide pandemic reveals what has always been true: that our existing structures are not working well for the vast majority of citizens in many countries. Even before the pandemic, only 41% of Americans said they could cover a $1,000 emergency with their savings instead of going into debt.
We must not go back to business as usual—we urgently need new systems, and technology is one of our most powerful levers to build them quickly. Those of us who are fortunate to shape the vast reach, capabilities, and efficiency of technology must also embrace a commensurate level of responsibility. These Principles for Humane Technology are meant to help you deeply explore that responsibility to create products that are truly aligned with humanity’s best interests.
While these principles apply to many technologies, they are primarily intended for “persuasive technologies” that interact directly with the minds of users–most prominently, social media.
Our Upcoming Online Course
We’re developing an online course to help technologists become stronger advocates and implementors of humane technology. Sign up here to be notified when the course is available.
Before You Dive In
Before you dive into the principles, it is helpful to understand the context that drives our work.
🤲🏼 Technology is Never Neutral
We are constructing the social world.
Some technologists believe that technology is neutral. But in truth, it never is, for two reasons. First, our values and assumptions are baked into what we build. Anytime you put content or interface choices in front of a user you are influencing them; whether that is by selecting a default, choosing what content is shown and in what order, or providing a recommendation. Since it is impossible to present all available choices with equal priority, what you choose to emphasize is an expression of your values.
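The weight of a default is easy to see in code. As a minimal sketch (the app, setting names, and defaults below are hypothetical, not drawn from any real product), whichever values ship as defaults are the ones most users will live with:

```python
from dataclasses import dataclass

# Hypothetical settings for an imagined video app. Most users never
# change defaults, so whatever ships here is a value judgment made
# on their behalf: there is no "neutral" starting point.
@dataclass
class PlaybackSettings:
    autoplay_next_video: bool = True    # default favors watch time
    nightly_quiet_hours: bool = False   # default favors notifications

def humane_defaults() -> PlaybackSettings:
    # The same settings with defaults expressing different values.
    return PlaybackSettings(autoplay_next_video=False,
                            nightly_quiet_hours=True)
```

Neither configuration is neutral; each quietly steers the behavior of everyone who never opens the settings screen.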
The second way technology is not neutral is that every single interaction a person has, whether with people or products, changes them. Even a hammer, which seems like a neutral tool, makes our arm stronger when we use it. Just as real-world architecture and urban planning influence how people feel and interact, digital technology shapes us online. For example, a social media environment of likes, comments, and shares shapes what we choose to post, and reactions to our content shape how we feel about what we posted. Neutrality is a myth.
Humanity’s current and future crises need your hands on the steering wheel.
🧠 Seeing in Terms of Human Nature
The human brain is inherently vulnerable.
To see the full implications of technology being values-laden, we must consider the vulnerabilities of the human brain. Many books have been written about the myriad cognitive biases evolution has left us with, and our tendency to overestimate our agency over them (see Resources). To quickly understand this, think of the last time you watched one more YouTube video than you had intended. YouTube’s recommendation algorithm is expert at figuring out what makes you keep watching—it doesn’t care what you intend to do with the next minutes of your life, let alone help you honor that intention.
Simple engagement metrics like watch time or clicks often fail to reveal a user’s true intent because of our many cognitive biases. When you ignore these biases, or optimize for engagement by taking advantage of them, a cascade of harms emerges.
Confirmation bias causes us to engage more with content that supports our views, leading to filter bubbles and the proliferation of fake news. Present bias, which prioritizes short-term gains, leads us to binge-watch as self-medication when we’re stressed instead of addressing the source of our stress. The need for social acceptance drives us to adopt toxic behavior we see others using in an online group, even when we would not normally behave that way.
Aggressively optimizing for engagement metrics is like taking your hands off the steering wheel. It puts users’ paleolithic, inherently vulnerable brains in charge of determining what is valuable in your product. This approach, combined with the latest machine learning and A/B testing techniques, results in a broad series of harms unleashed at scale, which we call human downgrading.
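To make the engagement trap concrete, here is a toy sketch (the item data, field names, and the intent_weight parameter are invented for illustration, not from any real recommender): a ranker that maximizes predicted watch time surfaces whatever hooks our biases, while even a crude blend with an explicit “was this worthwhile?” signal can reorder the feed around user intent.

```python
# Two hypothetical videos with a predicted-engagement score and an
# explicit after-the-fact "worthwhile" rating from the user (0 to 1).
videos = [
    {"title": "outrage clip",    "predicted_watch_min": 12.0, "worthwhile": 0.2},
    {"title": "how-to tutorial", "predicted_watch_min": 6.0,  "worthwhile": 0.9},
]

def rank_by_engagement(items):
    # Pure watch-time optimization: exploits cognitive biases,
    # says nothing about what the user actually intended.
    return sorted(items, key=lambda v: v["predicted_watch_min"], reverse=True)

def rank_by_blend(items, intent_weight=0.7):
    # Blend normalized engagement with the explicit intent signal.
    def score(v):
        engagement = v["predicted_watch_min"] / 12.0
        return (1 - intent_weight) * engagement + intent_weight * v["worthwhile"]
    return sorted(items, key=score, reverse=True)
```

On this toy data, the engagement-only ranker puts the outrage clip first, while the blended ranker puts the tutorial first: same users, same content, different values encoded in the objective.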
🧭 Shifting Product Culture
Culture change is necessary and hard.
Our vision is to replace the harmful assumptions that currently shape product development culture with a new mindset that generates humane technology. Integrating this new paradigm will require process changes and an investment of time, resources, and energy within the product organization and beyond.
We realize systemic cultural change is never an easy task, with many opposing forces. Please reach out if you have ideas for how to help move this change forward or specific requests that you think CHT may be positioned to fulfill.
🛒 Creating Market Conditions for Humane Technology
This paradigm shift requires a marketplace that rewards humane technology.
CHT and many other organizations are creating these conditions through a combination of pressures from the media, parents, kids, regulation, investors, shareholders, tech employees like you, and more.
For more on this, read about Our Work.
This new paradigm is for technologists who accept that technology is increasingly shaping our social fabric and want to apply their exceptional skills to realign technology with humanity.
✨ Obsess over Values
Instead of obsessing over engagement metrics.
🌱 Strengthen Existing Brilliance
Instead of assuming more technology is always the answer.
🤝 Make the Invisible Visceral
Instead of assuming harms are edge cases.
🧘 Enable Wise Choices
Instead of assuming more choice is always better.
♻️ Nurture Mindfulness
Instead of vying for attention.
⚖️ Bind Growth with Responsibility
Instead of simply maximizing growth.
Resources
Value Sensitive Design. How to put values at the center of your design process. Also a great card deck to help with red-teaming (the practice of proactively uncovering harms prior to delivering a product or feature).
Nudge. Guidance for how to Enable Wise Choices. Introduced the concept of “choice architecture” (additional support for the fact that technology is never neutral).
From Inform to Persuade. Blog post by Tristan Harris and Aza Raskin offering guidance on being intentionally persuasive instead of simply providing information during the COVID-19 pandemic.
Cognitive Biases Graphic. A comprehensive list of our inherent biases.
Greater Good Science Center. Science-based insights for a meaningful life. Inspiration and research to obsess over values and strengthen natural brilliance.
Design Guide. Our take on organizing common human vulnerabilities and a guide to assess your current product.
Want to share these principles?