The Nudge That Didn’t
Why “conflict disclosures” don’t work
It’s hard to think of a more successful recent policy innovation than the nudge. The simple idea is to create policy that’s “built for people.” For too long, it felt like policy interventions were built for every group except the people. Laws were hard to read; they assumed that human decision-making is completely rational; regulation was overly formal, rigid, and anything but intuitive. Today, for every law that still adopts this traditional approach to policymaking, there is a nudge to challenge it.
I’m a huge fan of nudges. I think the core values they represent are the future of effective policymaking. In fact, I think we’re only at the very beginning of policymaking that’s “built for people.” Eventually, all successful policy will meet people where they are, instead of where policymakers imagine them to be.
That said, I’m increasingly bearish on one category of nudging: the use of information disclosures to combat conflicts of interest. These generally involve an advisor (or expert) informing an advisee (or novice) of some financial or personal interest that might bias the advice or information being conveyed.
For example, many top-tier universities require faculty to disclose any “outside interests”:
As a researcher at the University of Michigan, you are expected to disclose any financial or management interest in an outside company or other entity as it relates to your employment. Your disclosure ensures that you and U-M are compliant with the federal and state regulations designed to safeguard objectivity in research.
At Harvard Law, faculty pages include a disclosure stating any “related outside interests and activities.” Some faculty have none; others have many. John Coates, for example, lists about 30 outside financial interests.

The theory behind these disclosures is that students, scholars, and the public will use them to better scrutinize researchers’ work. If Coates, for example, writes a law review piece about “large public companies” (the first item on his list), we should keep that in mind as we read the piece.
Disclosure is often described as a “behaviorally sensitive” intervention. It’s supposed to meet people where they are. But my sense is that disclosure rarely does so. In fact, it may not only fail to work as intended; it often appears to have the opposite effect. More disclosure makes an expert seem more credible.
Want evidence? OK. Compare Coates’ lengthy list of disclosures with this:

No activities to report. In theory, this should make Goldberg more credible. But at a gut level, who seems more credible to you — Coates (30+ disclosures) or Goldberg (zero disclosures)?
You can do this experiment on Mechanical Turk in an hour. Redact the names, and ask participants which scholar seems more credible.
My bet is that a majority of folks would say Coates.
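If you wanted to check whether such a split reflects more than chance, a simple exact binomial test would do. Here’s a minimal sketch in Python, using entirely hypothetical counts (68 of 100 respondents picking the scholar with the long disclosure list):

```python
from math import comb

def binomial_p_value(k, n, p=0.5):
    """Two-sided exact binomial test: the probability, under chance p,
    of any outcome no more likely than observing k successes in n trials."""
    p_obs = comb(n, k) * p**k * (1 - p)**(n - k)
    return sum(
        comb(n, i) * p**i * (1 - p)**(n - i)
        for i in range(n + 1)
        if comb(n, i) * p**i * (1 - p)**(n - i) <= p_obs + 1e-12
    )

# Hypothetical results: 68 of 100 participants call the heavily
# disclosed scholar more credible. A small p-value would suggest
# the preference is unlikely to be a coin flip.
print(binomial_p_value(68, 100))
```

A p-value well below 0.05 on such a split would support the hypothesis that more disclosures read as more credibility, not less.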
If I’m right about this — that more disclosures make you seem more credible — then we have probably been misusing the tool of disclosure.
Since nudges came to prominence, disclosure has been treated as a seminal intervention. In Thaler and Sunstein’s Nudge, disclosure is given pride of place. Globally, “nudge units” describe disclosure as a key tool in the nudge toolbox.
But if Coates’ list makes him seem more credible than Goldberg’s, all of this is wrong. Conflict disclosure isn’t a “behaviorally sensitive” intervention at all. It doesn’t meet people where they are. It assumes a type of rationality we don’t possess. And in that way, it’s no better than laws that assume we’re rational automatons.
A recent New York Times piece provides further evidence that conflict disclosures often fail. There, Sunita Sah explains that a conflict disclosure can increase pressure on the advisee to accept the advisor’s recommendation, because declining to do so signals distrust. In other contexts, such as a radiation oncologist recommending radiation treatment, the advisor may oversell the advice in order to wash away the effect of the disclosure.
All of these discrete effects can be summarized as follows: advocates of conflict disclosure assume that informed parties will process the information in a certain way. In reality, they often process it differently.
And this is just about the definition of an anti-nudge.