How Do We Know?

A few sizable credit losses of late have created headlines and caused ripples of concern, although spreads remain tight and covenants loose, so worries aren’t widespread.  At least not yet.

There has been talk of cockroaches — when will the next unpleasant example show up? — and participants in different parts of the credit markets are claiming that the problems can be found elsewhere, certainly not in their house.

Given the huge increase in assets in fairly new structures and strategies (and the eternal tendency to press bets that are working), it would not be a surprise to see an infestation of problem positions, but we’ll leave the prognostication about the likely outcomes to others.

Our concern relates to the quality of due diligence, not just when trouble reveals itself, but on an ongoing basis.  And to the discipline required to stick to a standard of quality as the norms of market behavior wax and wane.

Fact checking

An entertaining and informative New Yorker article by Zach Helfand, “The History of the New Yorker’s Vaunted Fact-Checking Department” (which appeared in print with the simpler headline “Vaunted”), provides a ready analogy:

It’s difficult to check facts, or to talk about fact checking, without coming off as a know-it-all, a fussbudget, or a snob.  But knowing things is hard.  Checking is a practice.  It’s not omniscience.

So,

How do you confirm a fact?  You ask, over and over, “How do we know?”

There is no formula (and lots of ambiguity):

Sometimes one source is enough.  Sometimes ten aren’t.  Checking is a forced humility.  The longer you check, the more you doubt what you think you know.

Investment organizations generally don’t have a fact-checking ethos or the structure to support one.  Most research is the product of an individual who puts forth an idea for others to act upon and is thereby accountable for it if they do.

As with authors, investment decision makers prize their autonomy and can bristle at the oversight of others, wanting their work to stand on its own as conceived.  Doubts about the process behind an idea can mar the story being told, erode confidence in it, and open up questions about what is known and unknown.

The New Yorker article references the “checker’s paradox”:

The more you know, the more you know that there is more you don’t know.

No one likes to admit that.

Standards

History shows that due diligence practices shift in response to the environment.  A tailwind of easy money begets looser standards until the sloppiness that has crept into the process results in big problems.  Then never-again promises are made, only to be broken during the next cycle.

Actually, including the word “standards” in that description is misleading, since its use should be reserved to indicate absolute expectations and actions, not relative ones.  Standards that slide around aren’t really standards.

In structuring a due diligence process, a certain amount of specification should come into play — we do this, we don’t do that — but overdoing it can lead to a check-the-box notion of completeness, which is not what due diligence should be about.

Instead, it is the quality of the effort that matters.  Does it measure up?  The New Yorker is “vaunted” with regard to its fact checking because of its long-standing obsessiveness about the process and the tales and examples passed from generation to generation that attest to it.

While an assigned fact checker carries the bulk of the load on a story, many others have eyes on the piece, including other fact checkers who may be consulted, the head of the department, authors, editors, and the editor-in-chief.  It is everyone’s business to get things right.  Learning about the “how” of the checking that has been done is to be expected.

Contrast that with recommendations that stem from investment due diligence work.  It is rare to see much, if anything, in a written recommendation about the diligence process that was employed — and presentations and committee meetings focus on the conclusions offered before quickly moving to an exchange of opinions between those proposing an idea and those deciding its merits.  The evidence as presented gets the attention, not the methods behind its acquisition.

Ultimately, the recommendations — and the people who make them — are judged on outcome, not process.  And, as long as things go well, they are left to their own devices (and compensated for their calls).  It is no wonder standards slide.

Another contrast between the New Yorker milieu and the investment world comes with how outsiders judge the effort.  As the author writes, “People like finding errors in the magazine, probably because the magazine is so smug about its fact checking.”  To that point, in a subsequent issue a letter to the editor pointed out a factual error in the article on fact checking.  There are always readers reviewing the work and nitpicking.

Compare that to the ultimate users of a due diligence recommendation (consultants, asset owners, advisors, whomever), who mostly want to get the bottom line and don’t ask much about methods.  They are primarily interested in performance and predictions, and don’t seem to care how they come about.

Weak links

The investment ecosystem is made up of a vast network of differentiated agents.  For the implementation of any particular investment program there is a chain of those agents that connects the owner of an asset to the organization and individuals that invest it.

They say that a chain is only as strong as its weakest link, and yet the quality of the due diligence and analysis in investment chains is usually obscured or unexamined.  It is — unfortunately and dangerously — often assumed to be better than it actually is.

To illustrate the point, just sketch the investment chains for some of the credit losses that are making headlines, note where the diligence failures have occurred, consider the work that could have been done to prevent the calamities, and ask why it didn’t occur.  Then think about how much those chains are reliant on assumptions about due diligence being rigorous rather than checks to find out whether it actually is.

We can’t know everything.  There are boundaries and barriers and, inevitably, gaps of knowledge which even the most diligent of analysts won’t be able to fill.  That doesn’t mean that all is lost; uncertainty comes with the territory.  But judging how to deal with those gaps should involve mapping the landscape of our understanding.

Not nearly enough time is spent detailing what we don’t know.

What we think we know gets most of the attention and drives the narratives that lead to decision making.

The overlooked question is:  “How do we know (what we think we know)?”

Excellence in due diligence comes from living that question day after day.


This theme will continue with the next essay, which will explore the reality that a great deal of what we think we know we have only learned from what other people have told us.  If you’d like to get postings via email as they are published, you may subscribe here.

Published: November 13, 2025

To comment, please send an email to editor@investmentecosystem.com. Comments are for the editor and are not viewable by readers.