Ask a Behavioral Scientist: Nir Eyal on how the “Regret Test” Makes for Good Ethics and Good Business

This article is part of our special issue “Connected State of Mind,” which explores the impact of tech use on our behavior and relationships, and our “Ask a Behavioral Scientist” series. Have a behavioral science question? Ask it here.

Q: Technology is increasingly potent and integrated into people's lives. Do those building the technology need to think differently about how they “hook” users?

The history of technological innovation involves many unintended consequences. As the cultural theorist Paul Virilio said, “The invention of the ship was also the invention of the shipwreck.” We’re suffering through many of the unintended consequences of the information revolution. The same features that make a product engaging, entertaining, and user-friendly can also make it a time-wasting distraction for some and a full-blown addiction for others.

Tech makers should abandon Google’s motto, “Don’t be evil,” because it’s too vague. The Golden Rule, “Do unto others as you would have them do unto you,” leaves too much room for rationalization. Instead, we ought to be saying, “Don’t do unto others what they would not want done to them.” I call this the regret test.

We can ask, “If people knew everything the product designer knows, would they still execute the intended behavior? Are they likely to regret doing this?” If users would regret taking the action, the technique fails the regret test and shouldn’t be built into the product, because it would manipulate people into doing something they didn’t want to do. Getting people to do something they don’t want to do is no longer persuasion; it’s coercion.

So how do we tell if people regret using a product? Simple! We ask them. This testing concept isn’t new to the industry: product designers already test potential features before rolling them out. The regret test would insert one more ethical check by asking a representative sample of people whether they would still take an action knowing everything the designer knows is going to happen next.
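To make the procedure concrete, here is a minimal sketch of how a team might tally regret-test survey responses. The function names, the survey data, and the 10% pass threshold are all hypothetical illustrations, not anything specified in the article:

```python
# Hypothetical sketch: tallying regret-test survey responses.
# The names and the 10% threshold are illustrative assumptions only.

def regret_rate(responses):
    """Fraction of respondents who say they would regret the action.

    responses: list of booleans, True = "I would regret doing this."
    """
    if not responses:
        raise ValueError("need at least one response")
    return sum(responses) / len(responses)

def passes_regret_test(responses, threshold=0.10):
    """A feature passes if the regret rate stays below the threshold."""
    return regret_rate(responses) < threshold

# Example: 2 of 20 surveyed users say they would regret the action.
survey = [True, True] + [False] * 18
print(regret_rate(survey))         # 0.1
print(passes_regret_test(survey))  # False: exactly at threshold, not below
```

Where a team sets the threshold is itself an ethical judgment; the point of the sketch is only that regret can be measured and checked before a feature ships.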

The beautiful thing about the regret test is that it could help weed out some of those unintended consequences, putting the brakes on unethical design practices before they go live to millions of users. The regret test could also be used for regular check-ins. Like many people, I’ve uninstalled distracting apps like Facebook from my phone because I regret having wasted time scrolling through my feed instead of being fully present with the people I care about. Wouldn’t it be in Facebook’s interest to know about people like me?

Any company, Facebook or otherwise, that doesn’t listen to users who increasingly resent it risks more people ditching its service altogether. And that’s exactly why understanding regret is so important. Ignoring people who regret using your product is not only bad ethics but also bad for business.

Further Reading & Resources

  • Eyal, N. (2018). Want to Design User Behavior? Pass the 'Regret Test' First. Nirandfar.com.
  • Eyal, N. (2014). Hooked: How to Build Habit-Forming Products. New York, NY: Portfolio.
  • Eyal, N. (2016). The Morality of Manipulation. Nirandfar.com.
  • Eyal, N. (2016). Here’s How to Ethically Manipulate Other People. Nirandfar.com.
  • Habit Summit Conference