"I don't understand you, therefore you are wrong"
I call this "apprehension bias" and you need to know about it

In 1912, Alfred Wegener proposed his “continental drift” theory, the precursor to tectonic plate theory. He was ridiculed and dismissed until half a century later, when the discovery of undersea rifts vindicated his theories.
This leads to the saying:
Being right too early is the same as being wrong
As a communicator, researcher, and public intellectual, I encounter this all the time. Not everything I say or do is correct, but most interestingly, when I have been “right too early,” people tend to gloss over it, saying “yeah, but we didn’t have evidence,” or they simply forget that I said it first. Then, once the Overton window catches up, they assume it was the default view all along.
My early work on cognitive architectures for language models was a prime example of this. I was mostly ignored, sometimes ridiculed, but some of my work was foundational to the memory systems that chatbots use today. Fortunately, I have GitHub repos with timestamps to prove what I said, and when I said it.
However, even when discussing things that are not pioneering or cutting edge, there’s a sort of cognitive bias that I’ve encountered that has yet to be named. The reaction many people have to new ideas (and the messengers who bear them) seems to be “I don’t understand you, therefore you are wrong.”
Now I have a name for this: apprehension bias.
To put it in more clinical terms, we can define it thus:
Apprehension bias is the mental shortcut whereby one equates intelligibility with truth and unintelligibility with falsehood. It’s a metacognitive miscalibration: confusing one’s ability to understand a claim with the actual validity of that claim.
From a neurological standpoint, it makes sense. Our brains are, at least partly, pattern matching machines. If a new piece of information arrives that doesn’t neatly map onto pre-existing patterns and knowledge, it throws an exception flag.
My wife, as a librarian, calls this the “exception bin problem”—any categorization system generally has a few neat boxes to slot things into. Many public libraries have several such bins for missing and misplaced items:
Books that are to-be-shelved
Books that are damaged
Books that were left that aren’t ours
Children’s toys that were left
Coats and umbrellas
That sort of thing. But she once had to deal with a truly unique object: a giraffe onesie.

The negative side of apprehension bias can be represented by the giraffe onesie problem. Essentially: “I don’t have a neat way of explaining, understanding, or classifying this… so I just throw it away.” In the case of many public libraries, such an item might end up in the general “exceptions” bin of lost-and-found for a while, or it might go straight into the dumpster.
This apprehension bias is similar to other cognitive biases and logical failures, such as the availability heuristic (the tendency to apply recent or vivid information, inappropriately, to novel scenarios). In other words, if you just watched Terminator 2, AI doomsday scenarios feel more plausible. There are a few other concepts that are similar to, but distinct from, what I’m talking about, so let’s unpack them for clarity’s sake:
Argument from Incredulity. This is a logical fallacy: “I can’t imagine how this could be true, therefore it must be false.” It matches the sentiment of “I don’t understand you, so you must be wrong.”
Illusion of Explanatory Depth. People often believe they understand something until they are asked to explain it in detail. When they cannot, they discount the other party’s explanation as confused or incorrect, even when the problem lies in their own limited grasp.
Naïve Realism. The belief that one’s own perception is the most accurate representation of reality. From this stance, if someone says something that seems unintelligible, the reflex is to treat it as wrong rather than consider the limits of one’s understanding.
The Dunning–Kruger Effect. While usually framed as the incompetent overestimating themselves, the inverse also appears: highly competent individuals may be underestimated because their reasoning exceeds what the average person can easily follow. This dynamic creates the “you’re either a moron or too far ahead” dilemma: either way, most people don’t understand you.
Epistemic Injustice. Philosopher Miranda Fricker describes how people are wronged specifically in their capacity as knowers. If a hearer dismisses a speaker’s testimony because it is not easily intelligible, that is a form of testimonial injustice—discounting knowledge claims unfairly.
To put it even more simply: apprehension bias is a preference for that which we grasp most intuitively.
The term maps cleanly to the literature on legibility and sense‑making while keeping the focus on the listener’s cognition rather than the speaker’s clarity. It highlights that illegibility is often local and provisional—what is opaque to one mind at one moment can be perfectly clear to another or to the same mind after scaffolding—so treating “opacity as disproof” is a bias.
It can also be a logical fallacy:
When apprehension bias is put forward as a reason in argument, this shows up as the argument from personal incredulity: “I cannot see how this could be true; therefore it is false.” It overlaps with, but is distinct from, the broader argument from ignorance (“not proven true, therefore false”).
Key Takeaways
Having wrangled with this problem as a public communicator and thinker, here are a few strategies that can help:
Speak to the Lowest Common Denominator: It’s not possible to head off every argument at the pass, but it’s worth trying. This can be done by couching language tentatively, as a thought experiment or a personal viewpoint, rather than as a hard fact. Even if you know something to be empirically true, and have done the research, presenting it as a fact (in a way that would pass muster with an educated audience) is often enough to trigger the “epistemic immune system” of lay audiences. It’s like leading horses to water: drinking has to be their idea.
Being Right Too Early Is The Same As Being Wrong: This, seemingly, is just a fact of life. In my personal experience, several people who really mattered saw the value of my early work, such as top-level engineers at Microsoft and Google. However, they were a slim minority of my audience, and because it took several years for the rest of the world to catch up, I was somewhat discouraged from continuing my frontier cognitive architecture work. Especially since I had published everything open source and never really received credit for it (aside from comments and feedback from people saying they had implemented my work and that it was game-changing for them). Without that recognition, it was difficult to justify continuing.
People Always Project Their Limitations First: Whenever people say “that’s not possible” or “that’s not how things work,” I have learned to interpret this as personal projection. I reframe it as them saying “that is not possible for me” and “I don’t know how that works.” To that end, I came up with the mantra “never define yourself by other people’s limits.” If they cannot apprehend your thought process, it’s often not worth trying to explain it to them. This probably comes across as haughty and superior, but it’s just a fact of life: different people have different levels of education, intelligence, and other skills. For many reasons, it is politically unpopular to point this out.
Plenty of sharp folks have pointed this out, most famously Isaac Asimov:
“There is a cult of ignorance in the United States, and there always has been. The strain of anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that ‘my ignorance is just as good as your knowledge.’”
There you have it: now you know what apprehension bias is.

