Maybe.
Any group that seeks advantage needs a model of the world to interpret cause and effect. This is true post-Dunbar (when a group is made up of more than ~150 people). Once behavior depends on symbols, the group is no longer responding to the world directly, but to its model of the world (this is consistent with the predictive processing model of human behavior). So what matters now is model error (and what happens to it)…not truth.
Do you treat predictive error as signal, or noise? This is the fracture.
One group encounters contradiction, failure, discomfort, and says, “We misunderstood something.” They adjust their model.
Another group encounters the same and says, “This isn’t a real error.” Their model is preserved and signal is suppressed. Then the compounding begins.
Every time the world returns unexpected feedback and you refuse to update, you embed the error into the structure. You reframe the contradiction as a test, or anomaly, or enemy action (think Trump). You revise the interpretation of feedback, not the model itself. You build insulation layers to protect the model from reality.
Each move makes the model more coherent internally, but less aligned with the world. The simulation becomes smoother and the fit becomes worse. And because each act of suppression makes the model harder to question next time, the cost of correction increases exponentially.
What’s being “compounded,” exactly? Error, because each misfit is hidden rather than corrected. Confidence, because the model appears to keep “working” internally. Power, because the system selects for those who uphold the model. And fragility, because the longer the feedback is ignored, the harsher its return.
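The loop above can be made concrete with a toy sketch (my own illustration, not anything from the argument itself: the `lr` learning rate, the doubling correction cost, and the whole numeric framing are assumptions for the sake of intuition). Two groups face the same stream of prediction errors; one revises its model, the other adds an insulation layer each time, and each suppression makes eventual correction more expensive:

```python
def simulate(update: bool, steps: int = 20, lr: float = 0.5):
    """Toy model: a group repeatedly compares its belief to the world."""
    world = 1.0          # what the world actually returns
    model = 0.0          # the group's belief
    insulation = 0       # layers built to protect the model
    for _ in range(steps):
        error = world - model
        if update:
            model += lr * error   # treat error as signal: revise the model
        else:
            insulation += 1       # treat error as noise: add an insulation layer
    misfit = abs(world - model)
    correction_cost = 2 ** insulation  # each suppression compounds the cost
    return misfit, correction_cost

print(simulate(update=True))   # misfit shrinks toward zero; cost stays trivial
print(simulate(update=False))  # misfit never moves; cost grows exponentially
```

The updater's misfit decays geometrically; the insulator's misfit is frozen while its correction cost doubles every round. That asymmetry is the whole point: the insulated model looks stable right up until the bill comes due.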
This is how collapse becomes inevitable, not from evil or chaos, but from a feedback loop about feedback itself.
Collapse begins the first time a group decides that a predictive error is not worth adjusting for. The cause is this treatment of error, and the decision to protect the model rather than let it break where it no longer fits.
A man dances. It rains. It happens again. And again.
He (and eventually the group) builds a model: “Dancing causes rain.”
So far, this is rational…based on a perceived pattern. This is just pattern sensitivity, not delusion. Everyone does this. Animals do it too. The brain is a pattern detector, not a truth detector. No problem yet.
Others begin to believe. The dancer is now “Rainbringer.” His status rises and the ritual becomes culturally encoded. It’s a model with structure. It’s a social artifact now, not just a belief. And still no collapse. This can all exist within feedback sensitivity if error remains possible to acknowledge.
He dances and it doesn’t rain. Or it rains with no dance. The group now faces a contradiction between model (dance = rain) and feedback (it didn’t work). This is the first point of model failure, and it opens two paths.
If the group treats the error as a signal, it says, “Hmm. Maybe the connection wasn’t causal. Maybe dancing helps, but doesn’t guarantee it. Maybe something else matters too…clouds, season, soil. Maybe we were wrong.” The model updates. Maybe the ritual stays as a tradition, but it loses its literal power claim. Now the worldview remains tethered to feedback.
If the group treats the error as noise, it says, “He mustn’t have danced correctly. Someone in the group was impure. The gods are angry about something else. Rain did come, it’s just coming later. Don’t question the Rainbringer.” The model is preserved. But now, additional structures must be created to explain away the contradiction. And those structures will have their own failures, requiring even more insulation. This is compounding error in action. The model survives at the cost of truth.
So the arc has a curvature. In the first path, the model continues to reflect the world, even if imperfectly. In the second path, the model begins to simulate reality, and each new contradiction deepens the simulation.
Eventually, rain becomes something that doesn’t just happen…it becomes something that has to be narrated. And the system becomes a feedback-sealed loop. Until the drought is too long, belief no longer sustains coherence, and collapse forces the signal through.
The divergence between sustainable worldview and collapsing worldview is not belief itself. It’s how the group responds when the pattern breaks.
But why does one group treat error as signal, and another as noise? What’s the difference between the two?
Is it in the quality of a group’s pattern detection? Maybe. But both groups saw a pattern where one didn’t exist. That’s normal…it’s how learning starts. So pattern detection alone doesn’t explain the difference. It might influence the likelihood of correction, but not the structural response to error. Everyone sees false patterns, but not everyone protects them.
Is it how long the pattern appears to work? Maybe. The longer a pattern appears to be true, the higher the social and symbolic cost of abandoning it. If the rain-dancer’s model “works” for 20 years before failing, the group’s going to have a hell of a time letting go of it. It’s now embedded in ritual, hierarchy, identity, morality, and possibly even infrastructure. So when error comes, it’s no longer a mere contradiction, but a threat to the entire structure. The longer false coherence holds, the more catastrophic its loss becomes. Still, this is a compounding factor, not the root cause.
Is it a group’s tolerance for uncertainty? This feels closer. Some groups may be more willing to live inside ambiguity…to say, “Maybe we don’t know.” Others require certainty, especially when power, identity, or survival are at stake. When uncertainty is seen as dangerous, contradiction is repressed. But even this is downstream of a deeper variable.
So what’s the root difference?
I’d say it has something to do with the group’s willingness to let its model break. In other words, a group’s relationship to truth. Some sort of functional truth orientation…a cultural posture that says: “Our model exists to serve reality, not the other way around. We are allowed to be wrong. The map is not the territory.”
Groups that survive over time have ritualized humility at the model level. They embed model-breakability into the structure and build a bit of slack around belief. Maybe collapse becomes inevitable when belief becomes non-negotiable. When a group treats its model as the world itself instead of something that’s subordinate to the world.
And none of that word salad comes even close to satisfying me. I still can’t locate the inherent difference in people that would explain why a group would choose fictions over reality…fictions that lead to destruction.
Even when we level the playing field…no genetic difference, a shared environment, same cognitive equipment, same feedback events…one group loosens its grip when the model breaks, and the other tightens it. It feels like a difference that came from nowhere, and my brain doesn’t tolerate that well. I want a mechanism.
I’m not willing to say, “Some people are just wiser.” Or, “Some cultures are born better.” And definitely not, “Some mystical essence preserved them.” It’s lazy and just names the difference instead of explaining it. And it’s not agriculture. Or symbolic thought. Or state-formation. Or a very precise list of environmental conditions at a very precise time. I’ve thought these through for months, and I just don’t see it.
Maybe the difference isn’t in the people, but in the first error and how it interacts with attention.
Let’s go back to the dancer.
Two groups experience the same failed rain-dance. The only difference is in one group, someone notices and the group listens. In the other group, the same doubt arises…but it’s silenced, ignored, or never spoken. The system begins to shape attention instead of truth. Maybe.
If this were true, we could say that the divergence doesn’t begin with different kinds of people. It begins with different positions within the social system…or different degrees of attentional slack. Small variations in who’s allowed to speak, who’s believed, how disagreement is treated, and how closely someone is still tracking the world (hunters, children, women, outsiders) can determine whether the group detects error when it first appears. Maybe it’s the structure that lets signal in (or doesn’t).
But I don’t buy it. I think it comes close (it does have something to do with WHO is listened to)…but the structural argument feels too top-heavy. Too contrived. It’s something about the people. It has to be.
And I keep coming back to that silly rain dance example.
“Oh, he moved his left leg differently last time. The dance is off this time. That must be why the rain isn’t coming.” Is this where it begins? With compounding error? A first act of model preservation over model revision?
It’s like an inoculation against contradiction. The dancer failed to bring rain, and instead of letting the model break, the group makes a seemingly reasonable micro-adjustment that preserves its frame. But it proves to be anything but reasonable. It’s the beginning of something else entirely.
Because it says, “The model is still valid. The error lies in execution…not in assumption.” I think that distinction is everything. Because once you decide the model must be true, every contradiction becomes a problem to explain away, not learn from. You start adjusting the dancer’s position, the timing, the offerings, the purity of the audience, the phase of the moon, the moral status of dissenters. Each change adds complexity without re-examining the core claim…each layer distances you further from reality and makes it harder to walk back.
The “left-leg hypothesis” might feel like a natural progression of curiosity…but I don’t think it is. Because it isn’t asking, “What’s true?” It’s asking, “How can we keep the model intact?” And that’s compounding error in its earliest, most innocent form. It starts as protective curiosity, evolves into explanatory gymnastics, and ends in systemic delusion. In constantly mowing 40,000,000 acres of grass for no sane reason.
A wall begins to rise…error becomes a problem to solve inside the model, those who name it become threats, and the signal is no longer heard. And eventually you’re living in reference only to the model (the dance, the roles, the rituals, the scapegoats) while the sky goes dry. “He moved his leg wrong. And so began the drought.”