Building the Behavior Change Toolkit: Designing and Testing a Nudge and a Boost

Changing behavior is challenging, so behavioral scientists and designers need a large toolkit. Nudges—subtle changes to the choice environment that don’t remove options or offer a financial incentive—are perhaps the most widely used tool. But they’re not the only one.

More recently, researchers have advocated a different type of behavioral intervention: boosting. In contrast to nudges, which aim to change behavior through changing the environment, boosts aim to empower individuals to better exert their own agency.

Underpinning each approach are different perspectives on how humans deal with bounded rationality—the idea that we don’t always behave in a way that aligns with our intentions because our decision-making is subject to biases and flaws.

A nudge approach generally assumes that bounded rationality is a constant, a fact of life. Therefore, to change behavior it is best to change the decision environment (the so-called choice architecture) to gently guide people in the desired direction. Boosting holds that bounded rationality is malleable: people can learn to overcome their cognitive pitfalls. Therefore, to change behavior we must focus on the decision maker and increase their agency.

In practice, a nudge and a boost can look quite similar, as we describe below. But their theoretical distinctions are important and useful for behavioral scientists and designers working on behavior change interventions, as each approach has pros and cons. For instance, one criticism of nudging targets the paternalism in Thaler and Sunstein’s “libertarian paternalism”: some worry that nudges remove decision makers’ autonomy (though the extent to which nudges are paternalistic, and the extent to which this is solvable, are debated). Additionally, if the goal of an intervention is not just to change behavior but to change the individual’s cognitive process, nudges are unlikely to be the best tool. Boosts, in contrast, require some motivation and teachability on the part of the boostee, so there may well be contexts unfit for boosting interventions where nudges come in handy.

While there have been many discussions and debates about nudging and boosting, there is hardly any scientific evidence on the implementation and effectiveness of a nudge compared with a boost.

Recently, we conducted a field experiment to do just this. We tested and compared the effectiveness of both interventions in a preregistered quasi field experiment in a large Dutch hospital on a specific behavior: hand hygiene compliance among nurses.

Hand hygiene is crucial in preventing infections among hospital patients. However, health care workers have a lot of protocols and rules to follow, and complying with all of them can be quite a challenge. Not surprisingly, studies show many health care workers comply with hand hygiene protocols only about half of the time.

We know quite a lot about the biases that prevent health care workers from complying with hand hygiene protocols. For one, they fall prone to optimism bias: they underestimate the risks associated with not cleaning their hands, which leads to lower compliance. People’s innate sense of autonomy can also reduce compliance: showing why a protocol is useful has been shown to be more effective than rigidly enforcing it.

Building on this work, we identified in a prestudy two primary behavioral barriers specific to the hospital nurses: the negative perception of the hand hygiene protocol as burdensome, and a lack of understanding of the consequences of noncompliance.

To improve hand hygiene compliance, we developed a nudge and a boost, each tailored to these specific behavioral barriers. Both interventions were distributed primarily via new posters hung in 10 highly visible locations in the hospital wards and via flyers distributed in the break rooms. Supervisors notified nurses via email that there would be a focus on the hospital’s hand hygiene protocol.

For the nudge, we targeted the perceived burdensome and complicated nature of the hand hygiene protocol. Using the tagline “in good hands,” we reframed the message of hand hygiene from a negative frame (an extra burden for the nurses) to a positive one: hand hygiene as a moment of care for the patient.

For the boost, we targeted nurses’ misunderstanding of the risks of not complying with the protocol, using a risk literacy boost to improve their decision-making. The new protocol posters used the tagline “prevent infections” and presented facts about infection risks. We presented these risks as frequencies (“1 in every 20 hospital patients receives a hospital infection”) rather than probabilities (“5 percent of hospital patients”), as people have been shown to estimate risks better when they are presented as frequencies.

Posters used in the hand hygiene intervention. On the left, the nudge, which reads: “In good hands,” with the caption “Good care for your patient starts with clean hands. With below hand hygiene moments you contribute to this.” On the right, the boost, which reads: “Prevent infections” with the caption “Good hand hygiene prevents infections. With below hand hygiene moments you contribute to this.” The five icons reflect the five hand hygiene moments as defined by the WHO: before touching a patient, before clean/aseptic procedures, after body fluid exposure/risk, after touching a patient, and after touching patient surroundings.

We selected three hospital wards, randomly assigning each to receive one of the two treatments or to serve as the control group. To assess compliance, we followed a validated observational method, observing hand hygiene at set time periods. We measured protocol compliance before and during the intervention, and again a week after the intervention was removed.

In two pretests, hand hygiene compliance ranged between 44 and 57 percent across the three wards. Our results show that the nudge initially increased compliance to almost 90 percent, but the effect wore off after the intervention was removed (compliance decreased to 75 percent). The risk literacy boost led to a compliance rate of 77 percent, which remained stable for at least one week, even after the intervention was removed (80 percent). Compliance in the control ward was between 55 and 61 percent in the posttests.

The informational flyer that accompanied the boost intervention. It reads: “Did you know?” and then: “One in every twenty patients receives a hospital-induced infection.”; “Research shows that in two American hospitals, the number of cases with MRSA (Methicillin-resistant Staphylococcus aureus) infections decreased by half after healthcare employees improved their hand hygiene.”; “Good hand hygiene of healthcare workers is the most effective way to prevent infections.”

Our study indicates both nudges and boosts are effective behavioral interventions to improve hand hygiene. We find evidence for a stronger direct effect of the nudge, and preliminary evidence for a longer-lasting effect of the boost. These findings are intriguing, yet we should emphasize that our experiment was fairly small. The timeline and scale we maintained are common in these types of experiments, but studies with more observations and measurement periods may provide more conclusive evidence.

Also, while in theory the differences between nudges and boosts are clear, in practice both interventions may have had side effects that resembled the other intervention. For example, the boost may have included nudge-like mechanisms. If a participant reacted solely to the tagline “prevent infections” and did not learn from the information on risks, they may have been nudged rather than boosted. Additionally, we designed a boost that did not require much effort from the nurses. This could have contributed to the positive impact it had on compliance rates; in the busy work environment, nurses may not have had much time to read or process the boost. This exemplifies that field experiments offer more realistic but also messier tests.

We want to close with two observations based on our experience conducting this study and our outlook as behavioral scientists. First, we should make a conscious effort to keep innovating behavioral interventions rather than only repeating the classics. For example, how can we go beyond targeting biases and include recent insights on noise in the toolkit? How can we intervene more effectively and precisely, for instance through personalization?

Second, if we employ the behavior change toolkit primarily to stimulate compliance with rules and regulations, we are missing out on an urgent opportunity. Employee burnout is common in health care. At the moment, we are working to see whether behavior change interventions may help alleviate work pressure and increase occupational well-being. In our view, the promise of developing and testing behavioral interventions lies not in providing universal knowledge, but in building a toolkit that offers solutions for as many situations as possible.