Is abstraction to blame?

Let’s make some assumptions. Let’s assume that, at the outset, there are no genetic factors significant enough to account for one entire group’s remaining connected to its environment and another choosing disconnection. Let’s assume that individuals (and groups) will seek advantage where they can find (or create) it. Let’s assume that Dunbar’s number is a hard limit (~150 people), and that scale beyond it demands abstraction. Let’s assume “worldviews” emerge to maintain cohesion in groups beyond 150 people. Let’s assume worldviews exist on a spectrum of fidelity to the world…some more grounded, others more distorted. And let’s assume that collapse risk increases as worldview diverges from world…a direct correlation between realism and resilience. Let’s do our best to let go of our “civilization vs. tribe” bias and see the whole thing as feedback fidelity across scale.

At ~150 individuals, a group’s relational coherence (previously maintained by direct sensory, ecological, and social feedback) fragments…the prehistoric keyboard warriors appear. Shared stories start to replace shared experience. Symbols replace presence. And roles, laws, and systems emerge as prosthetics for lost immediacy. Now we have a fork: fidelity vs. simulation.

The group with the high-fidelity worldview uses myth, ritual, and language to model the world as closely as possible. Symbols are tethered to reality, authority is distributed (and accountable to ecology and relational norms), growth is still limited by feedback and encoded in story, and abstraction is used with care and periodically re-grounded (e.g. vision quests, initiation, seasonal rituals). These are stories that serve to remind the group of how the world works.

This group persists. Its worldview preserves adaptive behavior even at scale. It may never become a “civilization” in the classic sense, because it resists the abstraction that enables runaway scale.

The group with the low-fidelity worldview uses abstraction to model desire, not the world. Symbols become detached from feedback…power, wealth, status grow by internal logic. Authority is centralized and increasingly self-referential. Growth is pursued independent of ecological context. And simulation becomes self-sustaining…a loop that no longer checks against the world. These are stories that tell the group it’s right, even when the world says otherwise.

This group expands faster, but only by deferring feedback…and deferred feedback is deferred collapse. The tighter the internal simulation, the longer it can suppress reality…until reality returns with interest.

And so this gives us a nice, simple predictive model: collapse is the repayment of feedback deferred by a low-fidelity worldview. The greater the distortion, the greater the build-up, the harder the crash. You could almost graph it: worldview distortion on the X-axis, time to collapse on the Y-axis, and you’d see a steep decay curve…each increment of distortion shortens the runway and sharpens the eventual crash.
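To make that shape concrete, here’s a minimal numeric sketch in Python. Everything quantitative in it is an assumption for illustration…the 5% “interest” on deferred feedback, the collapse threshold of 100, and the idea that a worldview absorbs a fixed fraction of reality’s signal each step…not an estimate of anything historical.

```python
# Toy model: each step the world emits one unit of feedback. A worldview with
# a given fidelity (0..1) absorbs that fraction of it; the remainder is
# deferred, and the deferred pile compounds. Collapse is the step where the
# pile exceeds what the system can absorb at once. All parameters are
# illustrative assumptions.

def time_to_collapse(fidelity, threshold=100.0, interest=0.05, max_steps=10_000):
    debt = 0.0
    for t in range(1, max_steps + 1):
        debt = debt * (1 + interest) + (1 - fidelity)  # suppressed feedback accrues interest
        if debt > threshold:
            return t  # reality returns, with interest
    return max_steps  # never crossed the threshold within the window

for f in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(f"fidelity {f:>4}: collapse at step {time_to_collapse(f)}")
```

In this toy, raising fidelity buys surprisingly little runway until you get very close to 1.0, where collapse stops arriving at all…which is the argument in miniature: partial realism delays the bill, and only full-fidelity feedback cancels it.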

This model has falsifiable (testable) implications.

If accurate, you should see high-fidelity groups maintaining ecological balance over time, resisting large-scale empire formation, embedding taboos, rituals, and stories that enforce ecological or social limits, and proving harder to conquer ideologically but easier to conquer militarily. And we do see that, don’t we?

If accurate, you should see low-fidelity groups expanding rapidly and dominating others, delaying collapse through buffering, abstraction, and extraction, pathologizing feedback-sensitive individuals, and eventually failing suddenly and systemically. And we see that as well, don’t we?

If accurate, collapse events will often mark the point where simulation becomes completely unmoored from reality, and the return of feedback becomes catastrophic rather than adaptive. And this is exactly what we see in the dramatic phenomenon we call “the collapse of a great civilization,” as well as collapse events we feel around us every day in our own spectacularly unmoored simulation.

What we arrive at isn’t just a description of how civilizations fall…it’s a redefinition of what scale itself demands. Scale isn’t the problem. The problem is simulation without feedback.

Collapse isn’t inevitable because of size. It’s inevitable when scale is managed through simulation that suppresses reality. So the real challenge isn’t to reject abstraction (that’s here to stay)…it’s to embed continuous feedback into abstract systems. Otherwise, they’re on a one-way street to delusion.
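What might “embedding continuous feedback” look like in miniature? Here’s one more hypothetical sketch: two groups track a drifting world through an internal model. One re-grounds the model against the world at a fixed cadence (think of seasonal ritual as error correction); the other never checks. The dynamics and numbers are invented purely for illustration.

```python
import random

def worst_divergence(reground_every, steps=200, seed=7):
    """Largest gap between model and world; reground_every=0 means never check."""
    rng = random.Random(seed)
    world = model = 0.0
    worst = 0.0
    for t in range(1, steps + 1):
        drift = rng.gauss(0, 1)
        world += drift                  # reality moves
        model += drift * rng.random()   # the simulation only partially registers it
        if reground_every and t % reground_every == 0:
            model = world               # continuous feedback: re-anchor symbols to world
        worst = max(worst, abs(model - world))
    return worst

print("re-grounded every 10 steps:", round(worst_divergence(10), 1))
print("never re-grounded:         ", round(worst_divergence(0), 1))
```

The design point isn’t the numbers…it’s that the re-grounding step is cheap, periodic, and non-negotiable, which is exactly the role the high-fidelity group assigns to ritual.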

I can’t emphasize this enough: collapse isn’t a moral or technological failure. It’s a delayed feedback event. It’s about worldview fidelity. Does a symbolic order track reality, or replace it?
