Tag: consciousness

  • Tribalism, Consensus Reality, and Domestication

    I think it’s safe to say that feedback-sensitive (neurodivergent) people are less susceptible to tribalism. Under most circumstances (I’ll get to the exceptions), we’re less likely to be a conservative, a democrat, a fundamentalist, etc.

    Tribalism (the tendency to take your group’s beliefs as your own and enter into conflict with other groups) depends on high-precision social priors…shared norms, loyalty cues, in-group/out-group boundaries. But we rely more on direct sensory evidence or logical consistency than on these socially constructed signals. The pull of group identity isn’t as strong for us.

    I don’t experience the same automatic emotional reward for aligning with group opinion. I don’t get that warm and fuzzy feeling called “patriotism,” for example. I’ve never understood it. If an in-group belief contradicts observed reality, I can’t help but question it…even if it costs social standing. And tribal systems clearly punish that.

    Tribalism thrives on broad, simplified narratives (“they’re all like that”), which smooth over (or blind people to) exceptions. But exceptions are what catch my attention the most. Outliers that break the spell of group generalizations stand out to me, and I’m constantly stupefied that this isn’t the case for most people.

    Neurodivergents aren’t always resistant to tribalism, of course. In environments where belonging feels existentially necessary (which is just about every fucking environment in 2025), we can certainly conform strongly…overcompensate even.

    But I’d argue that for most of human history, where the “opt-out” option was real, if the group became oppressive, coercive, or incompatible with our temperament, we simply….left. And that escape valve probably served as a check on group conformity and control.

    We’d leave for a number of reasons. If group norms are arbitrary or contradictory, sticking around creates constant prediction error. As hard as it is, departure is often the path of least resistance for someone like me, and probably would have been for people like me in the past. We were also probably self-reliant in certain domains. Many autistic skill sets (deep knowledge in specific areas, tracking environmental patterns) would translate to survivability outside rigid social structures. And a drive for integrity over belonging means that physical separation would have been (and still is) preferable to constant self-betrayal.

    Wherever civilization spreads, the option to leave disappears. Agricultural and industrial societies lock people into fixed territories, usually controlled by central authorities. Survival becomes tied to participating in a single dominant system (currency, markets, legal structures), removing just about every parallel option. And instead of many small groups to choose from, there’s effectively one “tribe” (the society and its cultural apparatus). The option to “simply leave” is gone now, I’d say.

    Without the option to leave, those of us who would naturally walk away face a stark choice…either overcompensate (learn to mimic, mask, and fit despite the cost) or withstand isolation (remain noncompliant and absorb the social/economic consequences). I think this goes a long way in explaining why in the modern era autistic burnout and mental health crises are more visible (Breaking News: Autism Epidemic!!). The evolutionary safety valve is….gone.

    I’m not wired for tribalism. It looks ridiculous to me. I hate that people have this sort of neediness for it. Especially when, in 2025, we recognize it as the greatest barrier to humanity effectively addressing global crises…from planet destruction to systematic inequality and democratic collapse. It creates moral elasticity, where harm is justified by group loyalty. It creates rivalries purely for identity’s sake (beating the shit out of each other over a fucking soccer match). It makes coordinated responses impossible. It’s fucking dumb.

    This is all deeply bound up with what I call consensus reality (the shared social priors/expectations that a group holds about “what is real,” “how the world works,” and “what matters”). In that context, I see tribalism as the emotional and identity-binding mechanism that keeps people committed to those social expectations, defends them from contradiction, and rejects competing models from out-groups. Consensus reality (what people call “the real world”) gives tribalism content…stories, values, and assumptions the group agrees on. And tribalism, in turn, gives consensus reality teeth…social rewards for conformity and penalties for deviation. You could see it as the social immune system that suppresses any error signal that might disrupt the shared model people consider “reality.”

    Which brings me back to a core idea of my book: every degree of separation from reality (unmediated feedback) creates space in which lies and manipulation can be used to control human behavior. Symbolic representation, bureaucracy, technology, propaganda…all stand between an action and its consequence. From a predictive processing perspective, these separations replace sensory precision with social priors. Once your perception of consequence is dominated by priors handed to you by others, your model of reality can be steered by whoever’s controlling the narrative (regardless of what’s actually happening right outside your window).

    In the human domestication frame I’m building, this explains how a control system matures. Reduce a person’s direct contact with feedback, fill the gap with consensus reality (shared fictions), and use tribalism to keep the consensus coherent and defended.

    It also explains why nothing changes. Everyone’s wondering why humanity can’t seem to course-correct, but this isn’t rocket science. Mainstream “solutions” operate entirely inside mediated spaces, accepting layers of separation from reality as normal or inevitable. They try to optimize those spaces…fact-checking information, creating better messaging, nudging behavior with persuasive campaigns. They ignore the gap itself. The underlying problem (that people’s models of reality are no longer tethered to direct, embodied, ecological feedback) is left untouched.

    In other words, people just swap one set of social priors for another, without increasing the precision of sensory input from the real world. Their brains are still mostly being updated by other people’s models instead of reality itself. That’s like improving the entertainment or fairness of the feedlot without questioning why the fuck the animals can’t simply graze in the field anymore. The control system remains intact (strengthened, maybe) because the medium of control (the gap) is preserved.

    This is why giving people access to more information isn’t solving anything. We may have created gaps between action and consequence, but evolution hasn’t removed the cognitive biases and drives that were calibrated for direct feedback. Those drives still operate, but now they need something to work on in the absence of reality’s immediacy…and that “something” becomes bullshit. Shared fictions.

    Why? Why do people need so much bullshit?

    For one, I think hard-wired biases still expect input. Traits like negativity bias, advantage-seeking, and status monitoring evolved to process real-time environmental cues. Without direct cues, they grab onto representations (bullshit) because the brain just can’t seem to tolerate an informational vacuum.

    Bullshit comes in to fill prediction gaps. Without high-precision sensory input, shared social fictions are used to predict outcomes. Those fictions become the scaffolding (myths, ideologies, propaganda) that keep the model “stable” even if it drifts from reality (like when it starts baking the planet).

    Next thing you know, manipulation rides in on necessity. Because social fictions are now the only widely shared basis for action, whoever controls them effectively controls the behavior of the group. Domestication leverages this (the feed is always narrative feed…never the real field).

    The further the separation from unmediated feedback, the more elaborate the fiction has to be to sustain group coordination and suppress error signals. Fast forward to 2025, and people are acting entirely on their group’s fictions…with reality surfacing only in the form of the most immediate and extreme crises (which then get reabsorbed into new narrative).

    Where are feedback-sensitive people in this story? Where’s that autistic guy?

    Well, if your brain assigns low precision to social priors, then the fictions that fill feedback gaps for most people feel…jarring, flimsy, or outright hostile. My brain gives more weight to sensory or logical evidence, and that means constant prediction error when I interact with a model that’s running entirely on narrative rather than reality.

    In domestication terms, that makes me an outlier in a control system that depends on narrative compliance. For the “typical” person, the fiction is not only tolerable but necessary for coordination. For me, it’s a constant source of friction (because the group’s stabilizing story is exactly where I detect the misalignment most vividly).

    I’m heavy into predictive coding literature (Friston, Clark, etc.) right now, so I’ll try to frame some of my main arguments in those terms. (I’ll probably get it wrong.)

    Consensus reality is….encultured hyperpriors. Culture installs hyperpriors (very high-level expectations about “how things are”). They’re learned and built up by language and institutions, and they set the precision economy (which signals get trusted).

    Human domestication is a sort of niche construction. One that rewards minds able to thrive in symbol-dense, delayed-feedback environments. The effect is a recalibration of precision…social model-based predictions gain weight and raw sensory error loses weight. This is the flattening of the error landscape, so to speak.
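
    To make that recalibration concrete, here’s a minimal sketch of the precision-weighting idea, assuming simple Gaussian beliefs (the function name and the numbers are mine, purely illustrative…not anything lifted from the predictive coding literature): a percept is a precision-weighted blend of prior and sensory evidence, so shifting precision away from the senses and onto the social prior means the same observation barely moves the estimate.

        # Minimal precision-weighted (Gaussian) belief update...illustrative only.
        def perceive(prior_mean, prior_precision, observation, sensory_precision):
            """Posterior mean = precision-weighted average of prior and evidence."""
            total_precision = prior_precision + sensory_precision
            return (prior_precision * prior_mean
                    + sensory_precision * observation) / total_precision

        # Same observation (10.0), two precision economies:
        # a feedback-rich niche trusts the senses; a domesticated niche trusts the social prior.
        print(perceive(prior_mean=0.0, prior_precision=1.0, observation=10.0, sensory_precision=4.0))   # ~8.0
        print(perceive(prior_mean=0.0, prior_precision=4.0, observation=10.0, sensory_precision=0.25))  # ~0.6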

    Social priors are what let us coordinate at scale (which is rarely necessary…unless you’ve fucked up at scale), but trouble starts when precision sticks to bad priors in rapidly shifting or bullshit-heavy niches (media, bureaucracy), drowning out any errors that might have resulted in correction.

    An autistic person’s low tolerance for fictions is a different precision setting. It continually surfaces mismatches others smooth over. Which largely feels like shit (derealization, friction) for the autistic person, but is epistemically valuable (less “consensus-blindness,” wink wink, Peter Vermeulen).

    I’ve been talking about human domestication as selection against reactivity, but I know I have to be careful with single-trait stories like that. Maybe what’s selected are policies that minimize expected free energy in a given niche. In dense, rule-ridden societies, that means predictability-friendly (?) minds. Compliance. Delayed reward. Role fluency. Some kind of energy-efficient inference under control niches.

    This is where I’d be on my own, I think. This is the final “gap” where most of the highest-level thinkers are sort of playing…the control niche as a given. Someone like Andy Clark (were he to agree with my line of reasoning so far) might say the solution is about tuning the system to balance priors and sensory input more adaptively. But in a domesticated, control-oriented society, “tuning” quickly becomes prescription…setting parameters so people remain useful to the system, not so they reconnect with (unmediated) reality.

    The more fundamental problem is that any centrally managed adjustment to perception keeps people inside a mediated model. It doesn’t restore autonomy…it optimizes compliance. And the last thing I want is to be compliant in a system that is clearly out of touch with reality.

  • Compliance vs. Resilience (to Incoherence)

    I know the civilizing process / civilized systems select for both…but are they really the same thing? Are they both forms of attenuation (of feedback sensitivity)?

    Compliance is the willingness / tendency to conform to external demands, rules, or expectations (regardless of your internal state or of the environmental logic). It’s a behavioral adaptation. Your behavior becomes externally guided, socially enforced. You’re rewarded for obedience, predictability, and following rules. The more compliant you are, the better you’ll function in hierarchical or symbolic systems. But this adaptation (necessarily) suppresses agency, spontaneity, and moral resistance. I think of compliance as a way to survive within incoherence, by submitting to its logic…even when it contradicts reality.

    Resilience to incoherence is a bit different. I see it as the ability to tolerate cognitive dissonance, sensory overwhelm, moral contradiction, or systemic absurdity (without breaking down). Unlike compliance, this is a cognitive/emotional adaptation that’s internalized over time. You get rewarded for emotional detachment (“thick skin”), optimism, and stability. This adaptation enables prolonged function under conditions that would distress a more sensitive person. But the adaptation (becoming resilient) suppresses emotional fidelity, sensory reactivity, and ethical boundaries. Think of it as the dampening of error signals…it allows the dysfunctional systems you participate in to keep running even when they no longer map to reality.

    They both suggest selection for high attenuation (reduced capacity to detect, register, and act on biologically meaningful feedback). That includes sensory attenuation (tolerating noise, crowds), emotional attenuation (suppressing distress, grief, anger, empathy), moral attenuation (compromising truth for harmony or success), and relational attenuation (roleplay instead of reciprocity).

    So I see attenuation as the core functional trait being selected for in civilization. Not intelligence, strength, or adaptability, but attenuation…especially in domains that would otherwise threaten systemic continuity. That’s the falsifiable hypothesis I’m running with. That civilization (as both a process and a system) is runaway selection for attenuation.

    But attenuation is relative, isn’t it? I can’t say something is “attenuated” without specifying what signal or input has been diminished, and relative to what baseline/context.

    So in the context of domestication/civilization…what signals are being attenuated (and compared to what prior or natural baseline)? I listed some above and I keep adding more.

    Again, it’s not that “civilized” or “neurotypical” people are less capable in general. But we need to acknowledge that they’ve been conditioned (or selected) to attenuate very specific categories of feedback (categories that threaten the coherence of symbolic, hierarchical, or artificial systems they depend on). It isn’t hard to see when you think of how and why we domesticate animals…attenuation is the system’s way of silencing disruptive signals (and only “adaptive” in relation to a system in which truth is inconvenient).

  • Is there such a thing as a “baseline human?”

    I describe the configuration of the human nervous system known as “neurotypical” as being divergent from an adaptive baseline. But is there such a thing as a “baseline” human? A “baseline” wolf? After all, every organism is the result of ongoing evolution. Am I just comparing one phase of adaptation to another?

    If I were talking about evolutionary drift, or ecological selection within an intact system, then yes…I’d be fucking up. But the civilizing / domesticating process isn’t that.

    Domestication is artificial selection, not natural selection. In wild systems, traits are selected by feedback…what works, persists. In domesticated systems, traits are selected by suppression…what submits, survives. That’s a forced bottleneck, not an evolutionary trajectory. A wolf doesn’t become a dog by evolving, but by being confined, starved, bred, and rewarded into compliance. Same with us.

    And I’m comparing different conditions, not forms. This isn’t wolf vs. dog, or Paleolithic vs. modern human…it’s organism regulated by coherent feedback loops vs. organism surviving in a distorted, feedback-inverted environment. This isn’t some kind of nostalgia for prehistory…it’s about system integrity.

    It’s laughable that we live in a “world” where we have to be reminded that there is a functional baseline…you could call it feedback coherence, I guess. Coherent behavior is maintained through timely, proportionate, meaningful feedback. That’s the baseline…it’s a system condition (not a species). When a system becomes functionally closed, symbolically governed, and/or predictively trapped, it loses that baseline (even if it survives in the short term).

    You might respond that evolution got us here. But evolutionary processes don’t “justify” maladaptive systems. Saying there’s no baseline is a post hoc rationalization for harm. And I hear that all the time. People justifying obesity in dogs because it’s common in the breed. Or calling office work “adaptive” because it pays well. Or saying modern humans are just “evolved” for abstraction and control…even as the world burns and mental illness becomes the new norm.

    Evolution doesn’t care about health or coherence. It simply tracks what survives. But feedback is what sustains life, and it’s being severed.

    Ask yourself: what is selected for in society, as you know it? If you had to name one thing? Honesty? Hard work? Ambition?

    I think it’s compliance. I think the civilizing/domesticating process replaces selection for survival with selection for compliance.

    Let’s look at wild systems first. There, the selection pressure is for ecological coherence. Traits are favored because they enhance survival in a feedback-rich environment (keen senses, strong affective bonds, situational learning, pattern recognition, adaptability). An organism has to remain in sync with reality, or it dies.

    But in civilized systems, it’s easy to see that traits are favored because they enable success within an artificial, abstracted system (obedience, rule-following, role performance, suppression of emotion and instinct). You have to fit the symbolic structure, or you’re punished, excluded, pathologized, or discarded.

    It sucks because what was adaptive (sensitivity, integrity, etc.) is maladaptive in this odd place we call “civilization.” And what was dangerous (passivity, abstraction, dissociation) is rewarded.

    Think: selecting for people who can function without reality (instead of people who thrive in it).

    It’s not far-fetched. At all. Sickly animals that can’t survive in the wild. Office workers who ignore chronic pain and emotional numbness (and get promoted). An entire species driving itself toward collapse while calling it “progress.”

    This whole trainwreck we’re on is a case of runaway selection, but instead of selecting for extravagant traits like peacock feathers, it selects for compliance with abstraction and resilience to incoherence. And like all runaway selection processes, it becomes self-reinforcing, decoupled from reality, and ultimately self-destructive.

    Don’t believe me? Let’s track it.

    Quick review of the basic concept. In biology, runaway selection occurs when a trait is favored so intensely within a closed feedback loop (e.g. mate choice, social signaling) that it amplifies beyond functional limits (it doesn’t serve survival anymore…it just signals compatibility with the system).

    Peacocks grow huge, draggy tails because other peacocks think it means they’re fit (not because it helps them survive). Humans undergo surgeries, wear restrictive clothes, or starve themselves for “attractiveness” under runaway cultural ideals. Same dynamic. And civilizations grow more complex, abstract, and self-referential not because it’s sustainable, but because “Complexity” signals legitimacy and control.

    Let’s run through it again.

    Civilization creates a system (think classrooms, corporations, governments) where success depends on suppressing natural feedback. Then it rewards those most tolerant of abstraction, delay, hierarchy, and contradiction. This filters out feedback-sensitive traits. That keeps happening until the system becomes so self-referential that it can’t correct course anymore…it’s bred out the ability to perceive correction.

    So it’s a runaway selection for dissociation. For the kind of human who can survive it (even if it clearly can’t survive the world).

    Like all runaway systems, the trait (in this case, compliance) accumulates beyond adaptive range. The system grows more fragile and less correctable. Feedback from the real world becomes too painful or too late. And collapse happens from the inability to stop succeeding at being disconnected (not from a single failure).
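
    Here’s a toy sketch of that runaway loop (my own construction with arbitrary parameters, not a model from any literature): a population with varying feedback sensitivity, inside a system that rewards the least sensitive half each generation. Mean sensitivity collapses within a handful of generations, and with it whatever capacity the group had left to register an external error signal.

        import random

        # Toy runaway-selection sketch: the system rewards insensitivity to feedback.
        random.seed(1)
        population = [random.uniform(0.0, 1.0) for _ in range(1000)]  # 1.0 = highly feedback-sensitive

        for generation in range(1, 11):
            # "Success" inside the system goes to the least sensitive half, which reproduces.
            survivors = sorted(population)[: len(population) // 2]
            population = [min(1.0, max(0.0, s + random.gauss(0, 0.05)))
                          for s in survivors for _ in (0, 1)]
            mean_sensitivity = sum(population) / len(population)
            print(f"gen {generation:2d}: mean feedback sensitivity {mean_sensitivity:.2f}")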

    We’re not evolving.

    We’re overfitting. Civilization is a runaway selection loop for traits that thrive in unreality.

    And the “neurotypical” configuration is a collection of those traits. It’s not a neutral or natural norm…it’s a phenotypic outcome of this runaway selection.

    A configuration that is tolerant of contradiction (doesn’t break down where reality and narrative diverge). That is emotionally buffered (can perform even when distressed). That is low in sensory vigilance (can endure loud offices, artificial lights, social facades). That is socially adaptive (mirrors norms, infers expectations, suppresses authenticity). That complies with rules even when rules are nonsensical. That’s able to delay gratification, ignore bodily needs, and maintain appearances.

    I’m not saying these traits are bad per se…but I think we can all agree that they’re not the “baseline human.” They’re the domesticated phenotype, selected over generations to survive in systems where truth no longer matters.

    And, of course, the more a system rewards these traits, the more they proliferate (socially, genetically, culturally). It becomes harder for feedback-sensitive individuals to survive. Reality has to be increasingly suppressed to preserve the illusion of normalcy. Eventually, the only people who appear “well-adjusted” are the ones most disconnected from feedback…and the entire system becomes incapable of detecting its own failure. That’s the endpoint of runaway selection.

    I have a hard time with the dominant narrative…that the neurotypical profile is some kind of gold standard of human functioning. To me, it’s clearly the domesticated outcome of a system that rewards compliance (and “stability,” such as it is) over coherence or contact with reality.

    * When I say “neurotypical,” it’s not meant as some kind of medical category. I think of it as the cognitive-behavioral phenotype most rewarded by civilization (modern society, yes, but also throughout the history of civilization). I don’t see it as a person. Not every “neurotypical person” fits this mold. I’m almost certain no one fits it perfectly. I’m describing a directional pressure, not a binary condition. And it isn’t “bad.” It’s simply optimized for the wrong environment (one that destroys life). Neurotypicality isn’t unnatural…it’s civilizationally adaptive (in a system that’s maladaptive to life).

  • Is compounding error to blame?

    Maybe.

    Any group that seeks advantage needs a model of the world to interpret cause and effect. This is true post-Dunbar (when a group is made up of more than ~150 people). Once behavior depends on symbol, the group is no longer responding to the world directly, but to its model of the world (this is consistent with the predictive processing model of human behavior). So what matters now is model error (and what happens to it)…not truth.

    Do you treat predictive error as signal, or noise? This is the fracture.

    One group encounters contradiction, failure, discomfort, and says, “We misunderstood something.” They adjust their model.

    Another group encounters the same and says, “This isn’t a real error.” Their model is preserved and signal is suppressed. Then the compounding begins.

    Every time the world returns unexpected feedback and you refuse to update, you embed the error into the structure. You reframe the contradiction as a test, or anomaly, or enemy action (think Trump). You revise the interpretation of feedback, not the model itself. You build insulation layers to protect the model from reality.

    Each move makes the model more coherent internally, but less aligned with the world. The simulation becomes smoother and the fit becomes worse. And because each act of suppression makes the model harder to question next time, the cost of correction increases exponentially.

    What’s being “compounded,” exactly? Error, because each misfit is hidden rather than corrected. Confidence, because the model appears to keep “working” internally. Power, because the system selects for those who uphold the model. And fragility, because the longer the feedback is ignored, the harsher its return.
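
    A toy sketch of that compounding dynamic (again my construction; the drift rate and suppression factor are arbitrary): two groups model a world that keeps changing. One treats prediction error as signal and updates. The other explains a little more of the error away each time, so its effective learning rate quietly decays…and the misfit doesn’t just persist, it grows.

        # Toy sketch of compounding error: a drifting world, two ways of treating surprise.
        def simulate(treat_error_as_signal, years=100, drift=0.1):
            world, model, trust_in_error = 0.0, 0.0, 1.0
            total_misfit = 0.0
            for _ in range(years):
                world += drift                        # reality keeps moving
                error = world - model                 # the unexpected feedback
                if treat_error_as_signal:
                    model += 0.5 * error              # "we misunderstood something"
                else:
                    model += 0.5 * error * trust_in_error  # contradiction partly explained away
                    trust_in_error *= 0.9             # each suppression makes the next one easier
                total_misfit += abs(world - model)
            return total_misfit

        print("treats error as signal:", round(simulate(True), 1))
        print("treats error as noise: ", round(simulate(False), 1))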

    This is how collapse becomes inevitable, not from evil or chaos, but from a feedback loop about feedback itself.

    Collapse begins the first time a group decides that a predictive error is not worth adjusting for. The cause is this treatment of error, and the decision to protect the model rather than let it break where it no longer fits.

    A man dances. It rains. It happens again. And again.

    He (and eventually the group) builds a model: “Dancing causes rain.”

    So far, this is rational…based on a perceived pattern. This is just pattern sensitivity, not delusion. Everyone does this. Animals do it too. The brain is a pattern detector, not a truth detector. No problem yet.

    Others begin to believe. The dancer is now “Rainbringer.” His status rises and the ritual becomes culturally encoded. It’s a model with structure. It’s a social artifact now, not just a belief. And still no collapse. This can all exist within feedback sensitivity if error remains possible to acknowledge.

    He dances and it doesn’t rain. Or it rains with no dance. The group now faces a contradiction between model (dance = rain) and feedback (it didn’t work). This is the first point of model failure, and it opens two paths.

    If the group treats the error as a signal, it says, “Hmm. Maybe the connection wasn’t causal. Maybe dancing helps, but doesn’t guarantee it. Maybe something else matters too…clouds, season, soil. Maybe we were wrong.” The model updates. Maybe the ritual stays as a tradition, but it loses its literal power claim. Now the worldview remains tethered to feedback.

    If the group treats the error as noise, it says, “He mustn’t have danced correctly. Someone in the group was impure. The gods are angry about something else. Rain did come, it’s just coming later. Don’t question the Rainbringer.” The model is preserved. But now, additional structures must be created to explain away the contradiction. And those structures will have their own failures, requiring even more insulation. This is compounding error in action. The model survives at the cost of truth.

    So the arc has a curvature. In the first path, the model continues to reflect the world, even if imperfectly. In the second path, the model begins to simulate reality, and each new contradiction deepens the simulation.

    Eventually, rain becomes something that doesn’t just happen…it becomes something that has to be narrated. And the system becomes a feedback-sealed loop. Until the drought is too long, belief no longer sustains coherence, and collapse forces the signal through.

    The divergence between sustainable worldview and collapsing worldview is not belief itself. It’s how the group responds when the pattern breaks.

    But why does one group treat error as signal, and another as noise? What’s the difference between the two?

    Is it in the quality of a group’s pattern detection? Maybe. But both groups saw a pattern where one didn’t exist. That’s normal…it’s how learning starts. So pattern detection alone doesn’t explain the difference. It might influence the likelihood of correction, but not the structural response to error. Everyone sees false patterns, but not everyone protects them.

    Is it how long the pattern appears to work? Maybe. The longer a pattern appears to be true, the higher the social and symbolic cost of abandoning it. If the rain-dancer’s model “works” for 20 years before failing, the group’s going to have a hell of a time letting go of it. It’s now embedded in ritual, hierarchy, identity, morality, and possibly even infrastructure. So when error comes, it’s no longer a mere contradiction, but a threat to the entire structure. The longer false coherence holds, the more catastrophic its loss becomes. Still, this is a compounding factor, not the root cause.

    Is it a group’s tolerance for uncertainty? This feels closer. Some groups may be more willing to live inside ambiguity…to say, “Maybe we don’t know.” Others require certainty, especially when power, identity, or survival are at stake. When uncertainty is seen as dangerous, contradiction is repressed. But even this is downstream of a deeper variable.

    So what’s the root difference?

    I’d say it has something to do with the group’s willingness to let its model break. In other words, a group’s relationship to truth. Some sort of functional truth orientation…a cultural posture that says: “Our model exists to serve reality, not the other way around. We are allowed to be wrong. The map is not the territory.”

    Groups that survive over time have ritualized humility at the model level. They embed model-breakability into the structure and build a bit of slack around belief. Maybe collapse becomes inevitable when belief becomes non-negotiable. When a group treats its model as the world itself instead of something that’s subordinate to the world.

    And none of that word salad comes even close to satisfying me. I still can’t locate the inherent difference in people that would explain why a group would choose fictions over reality…fictions that lead to destruction.

    Even when we level the playing field…no genetic difference, a shared environment, same cognitive equipment, same feedback events…one group loosens its grip when the model breaks, and the other tightens it. It feels like a difference that came from nowhere, and my brain doesn’t tolerate that well. I want a mechanism.

    I’m not willing to say, “Some people are just wiser.” Or, “Some cultures are born better.” And definitely not, “Some mystical essence preserved them.” It’s lazy and just names the difference instead of explaining it. And it’s not agriculture. Or symbolic thought. Or state-formation. Or a very precise list of environmental conditions at a very precise time. I’ve thought these through for months, and I just don’t see it.

    Maybe the difference isn’t in the people, but in the first error and how it interacts with attention.

    Let’s go back to the dancer.

    Two groups experience the same failed rain-dance. The only difference is that in one group, someone notices and the group listens. In the other group, the same doubt arises…but it’s silenced, ignored, or never spoken. The system begins to shape attention instead of truth. Maybe.

    If this were true, we could say that the divergence doesn’t begin with different kinds of people. It begins with different positions within the social system…or different degrees of attentional slack. Small variations in who’s allowed to speak, who’s believed, how disagreement is treated, and how closely someone is still tracking the world (hunters, children, women, outsiders) can determine whether the group detects error when it first appears. Maybe it’s the structure that lets signal in (or doesn’t).

    But I don’t buy it. I think it comes close (it does have something to do with WHO is listened to)…but the structural argument feels too top-heavy. Too contrived. It’s something about the people. It has to be.

    And I keep coming back to that silly rain dance example.

    “Oh, he moved his left leg differently last time. The dance is off this time. That must be why the rain isn’t coming.” Is this where it begins? With compounding error? A first act of model preservation over model revision?

    It’s like an inoculation against contradiction. The dancer failed to bring rain, and instead of letting the model break, the group makes a seemingly reasonable micro-adjustment that preserves its frame. But it proves to be anything but reasonable. It’s the beginning of something else entirely.

    Because it says, “The model is still valid. The error lies in execution…not in assumption.” I think that distinction is everything. Because once you decide the model must be true, every contradiction becomes a problem to explain away, not learn from. You start adjusting the dancer’s position, the timing, the offerings, the purity of the audience, the phase of the moon, the moral status of dissenters. Each change adds complexity without re-examining the core claim…each layer distances you further from reality and makes it harder to walk back.

    The “left-leg hypothesis” might feel like a natural progression of curiosity…but I don’t think it is. Because it isn’t asking, “What’s true?” It’s asking, “How can we keep the model intact?” And that’s compounding error in its earliest, most innocent form. It starts as protective curiosity, evolves into explanatory gymnastics, and ends in systemic delusion. In constantly mowing 40,000,000 acres of grass for no sane reason.

    It’s a wall that begins…error becomes a problem to solve inside the model, a threat to those who challenge it, and a signal no longer heard. And eventually you’re living in reference only to the model (the dance, the roles, the rituals, the scapegoats) while the sky goes dry. “He moved his leg wrong. And so began the drought.”

  • Is abstraction to blame?

    Let’s make some assumptions. Let’s assume that, at the outset, there are no genetic factors significant enough to account for one entire group’s remaining connected to its environment and another choosing disconnection. Let’s assume that individuals (and groups) will seek advantage where they can find (or create) it. Let’s assume that Dunbar’s number is a hard limit (~150 people). Scale beyond that demands abstraction. Let’s assume “worldviews” emerge to maintain cohesion of groups beyond 150 people. Let’s assume worldviews exist on a spectrum of fidelity to the world…some more grounded, others more distorted. And let’s assume that collapse risk increases as worldview diverges from world…an inverse correlation between distortion and resilience. Let’s do our best to let go of our “civilization vs. tribe” bias and see the whole thing as feedback fidelity across scale.

    At ~150 individuals, a group’s relational coherence (previously maintained by direct sensory, ecological, and social feedback) fragments…prehistoric keyboard warriors appear. Shared stories start to replace shared experience. Symbols replace presence. And roles, laws, and systems emerge as prosthetics for lost immediacy. Now we have a fork: fidelity vs. simulation.

    The group with the high-fidelity worldview uses myth, ritual, and language to model the world as closely as possible. Symbols are tethered to reality, authority is distributed (and accountable to ecology and relational norms), growth is still limited by feedback and encoded in story, and abstraction is used with care and periodically re-grounded (e.g. vision quests, initiation, seasonal rituals). These are stories that serve to remind the group of how the world works.

    This group persists. Its worldview preserves adaptive behavior even at scale. They may never become “civilizations” in the classic sense, because they resist the abstraction that enables runaway scale.

    The group with the low-fidelity worldview uses abstraction to model desire, not the world. Symbols become detached from feedback…power, wealth, status grow by internal logic. Authority is centralized and increasingly self-referential. Growth is pursued independent of ecological context. And simulation becomes self-sustaining…a loop that no longer checks against the world. These are stories that tell the group it’s right, even when the world says otherwise.

    This group expands faster, but at the cost of delayed collapse (feedback). The tighter the internal simulation, the longer it can suppress reality…until reality returns with interest.

    And so this gives us a nice, simple predictive model: collapse is the repayment of feedback deferred by a low-fidelity worldview. The greater the distortion, the greater the build-up, the harder the crash. You could almost graph it…degree of distortion on the X-axis, time to collapse on the Y-axis…and you’d see a steep exponential decay.
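
    A minimal sketch of that curve (my construction; the buffer and repair rate are arbitrary illustrative numbers): deferred feedback piles up in proportion to distortion, whatever fidelity remains works some of it off, and collapse is the year the backlog overruns the buffer. Sufficiently grounded groups settle below the threshold and persist; the rest collapse, and the more distorted the worldview, the sooner.

        # Toy model: time to collapse as a function of worldview distortion.
        def years_to_collapse(distortion, buffer=10.0, repair_rate=0.05, horizon=10_000):
            """Years until deferred feedback exceeds the buffer (None = persists)."""
            backlog = 0.0
            for year in range(1, horizon + 1):
                backlog += distortion                                # consequences pushed out of view
                backlog -= repair_rate * (1 - distortion) * backlog  # what remaining fidelity corrects
                if backlog > buffer:
                    return year
            return None

        for d in (0.05, 0.2, 0.5, 0.7, 0.9):
            print(f"distortion {d:.2f} -> collapse at year {years_to_collapse(d)}")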

    This model has falsifiable (testable) implications.

    If accurate, you should see high-fidelity groups maintaining ecological balance over time, resisting large-scale empire formation, embedding taboos, rituals, and stories that enforce ecological or social limits, and being harder to conquer ideologically, but easier to conquer militarily. And we do see that, don’t we?

    If accurate, you should see low-fidelity groups expanding rapidly and dominating others, delaying collapse through buffering, abstraction, and extraction, pathologizing feedback-sensitive individuals, and experiencing sudden systemic failure. And we see that as well, don’t we?

    If accurate, collapse events will often mark the point where simulation becomes completely unmoored from reality, and the return of feedback becomes catastrophic rather than adaptive. And this is exactly what we see in the dramatic phenomenon we call “the collapse of a great civilization,” as well as collapse events we feel around us every day in our own spectacularly unmoored simulation.

    What we arrive at isn’t just a description of how civilizations fall…it’s a redefinition of what scale itself demands. Scale isn’t the problem. The problem is simulation without feedback.

    Collapse isn’t inevitable because of size. It’s inevitable when scale is managed through simulation that suppresses reality. So the real challenge isn’t to reject abstraction (that’s here to stay)…it’s to embed continuous feedback into abstract systems. Otherwise, they’re on a one-way street to delusion.

    I can’t emphasize this enough: collapse isn’t moral or technological failure. It’s a delayed feedback event. It’s about worldview fidelity. Does a symbolic order track reality, or replace it?

  • Is civilization inevitable?

    Civilizations don’t collapse the same way they start, but the seeds of collapse are there from the beginning.

    A group finds a way to defer natural consequences by storing surplus, centralizing control, pushing ecological costs elsewhere, and inventing narratives that justify it all. There’s a perceived solution (to scarcity, conflict, unpredictability). But that solution involves suppressing or overriding immediate feedback from the environment or community.

    What begins as a trickle becomes a system. Civilization grows through abstraction (money, law, religion, bureaucracy), extraction (from land, people, animals, future), and simulation (symbolic authority replaces direct experience). These allow expansion…but only by removing consequences from perception. The forest is gone, but we import lumber. The soil is dead, but we buy fertilizer. The people are angry, but we broadcast unity.

    Eventually, the deferred feedback piles up. The buffers and simulations fail. Aquifers dry up, crops fail, and the dominant narrative becomes even more performative than usual. Collapse isn’t the reversal of civilization’s birth. It’s the reassertion all at once of real conditions that had been suppressed for generations. What was delayed arrives, compounded.

    So it begins with the severing of feedback loops and ends when those same loops snap back into place…violently, suddenly, and usually too late to adapt. You might ignore the soil for 300 years…but not for 301.

    And whereas the rise of a civilization is cumulative and self-congratulatory, its collapse is rapid, cascading, and disorienting. Because civilized systems depend on delayed feedback, they can’t detect failure until it’s already terminal. The signals that might have saved the group were suppressed by the system. Not incidentally…the civilizing process IS suppression. It can’t be tweaked or repurposed.

    The conventional view is that civilizations rise because of progress (agriculture, technology, governance, and trade). They bring order to chaos, domesticate nature, and elevate humanity. They fall due to external shocks (invasion, drought, plague) or internal corruption (moral decay, bad leadership, inequality). Their collapse is usually portrayed as a breakdown of order, requiring some sort of reform. This is a linear, human-centric narrative…civilization as a heroic ascent occasionally interrupted by tragedy.

    But civilization clearly doesn’t emerge from progress. It emerges from disconnection…a break from ecological and social feedback loops. It thrives by delaying, distorting, or outsourcing consequences. It doesn’t solve problems. It manages perception and concentrates control. And collapse isn’t a fluke…it’s the logical outcome of the system’s internal logic reaching its thermodynamic and informational limits. Not bad luck or bad people, but a system that treats feedback as an externality.

    What do you believe? That the most advanced societies in history collapsed by accident? That despite their power, intelligence, and complexity, they simply had some unfortunate lapse in judgment? In mismanaging resources? By ignoring obvious problems? By overreaching a little? And, oops!, collapsed? And that we’re smarter now? More self-aware and made better by the lessons of history? Let’s think about that.

    The idea that civilizations “accidentally” overshoot, centralize too much power, or destroy their ecologies…every…single…time…is absurd, unless that pattern is intrinsic. If every plane crashes after 300 kilometers, you don’t need better pilots, you need a new kind of plane. But the civilizational narrative blames the pilot. Every time.

    Blaming barbarians, climate, disease, natural disaster, or Donald Trump ignores that systems capable of adaptation should adapt. Resilient systems bend…only brittle ones break. So if collapse keeps happening, the system simply isn’t resilient. It’s designed to avoid adaptation until it’s too late. We use our intelligence to formulate brilliant ways of resisting feedback. But resisting feedback is suicidal.

    The conventional story of civilization is weirdly moralistic. Rome fell because of decadence. Egypt succumbed to opportunistic invaders. But we’re exceptional and immune? It’s a childish blurring of causality with character, turning collapse into some sort of cautionary tale rather than a systems failure. They bad / we good.

    If collapse is a repeated outcome across cultures, time periods, continents, and resource bases, it’s not an exception. It’s a rule. Look at the actual system, this process we call “civilization”…not the environment. Not leaders. Not outliers. Not comforting nonsense.

    Forget you even know the word “civilization” for a moment. You just have a pattern. What is that pattern?

    A group discovers how to buffer feedback. They find a way to delay or distort the natural consequences of their actions. Storing food beyond the season. Building structures to insulate from climate. Using tools or fire to override bodily limits. Creating language or ritual to manage fear and uncertainty. It feels like control and progress.

    Then they scale the buffer. More buffering means more predictability. Population growth, specialization, hierarchy. But the buffers aren’t neutral…they begin to shape the system. Authority centralizes, roles solidify, and the environment is seen as raw material instead of relationship.

    Symbolic structures replace direct experience. Land is replaced by maps, relationships by law, patterns by gods, and functionality by performance and titles. People start responding to the simulation rather than the world.

    People who remain sensitive to real feedback are suppressed. If you can’t ignore real signals, question too much, or resist simulation, you’re sidelined (at best). Deviant. Sick. Subversive. Disposable. A system of feedback suppression enforces coherence by silencing signal. Sensitivity is a threat to its structure.

    Consequences accumulate outside awareness. The environment is sucked dry and so is social cohesion. But warning signs are noise. Reaction is blamed. If you suffer, the problem is you.

    Reality reasserts itself when accumulated feedback overwhelms the civilized system’s capacity to manage it. And that’s all collapse is…it’s the return of feedback.

    Is this pattern inevitable? This particular (and exceptional) form of human stupidity? Maybe not, but it’s highly probable under certain conditions.

    The impulse to buffer feedback is natural…all organisms buffer. A bear builds fat before winter. A bird builds a nest. A human puts on a raincoat. That’s adaptive buffering. That’s survival in a fluctuating world. But buffering becomes dangerous when it’s no longer a response to feedback, but a way to avoid it. Less “how do I stay warm?” and more “how do I avoid ever feeling cold again?”

    Once buffering becomes centralized and scaled, surplus becomes status, control becomes virtue, symbols become sacred, and feedback becomes a threat. At that point, the system protects itself instead of life. Any signal that challenges its narrative is neutralized, pathologized, or hidden.

    But there are cultures, both historical and current, that didn’t follow this path. Where feedback is revered (through ecology, ritual, and story), where people live with limits, and where lifeways use buffering as a temporary strategy, not an overarching structure. It’s about constant relationship with feedback and avoiding permanent insulation.

    But in what we call modern systems, the pattern is inevitable. Because now we’ve added fossil fuels (infinite buffering, for a while), digital simulation (infinite symbol manipulation), globalization (outsourcing all consequences), institutions that treat feedback as failure, and a cultural narrative that equates comfort with success. At this level of complexity and detachment, feedback has no way in except collapse.

  • Dominoes

    The whole fucking thing comes down to feedback. Unmediated feedback. The kind you can’t spin, delay, or edit. When an organism senses the world clearly, it can adjust, survive, and thrive. But once you drop a layer between the organism and reality (call it language, ideology, bureaucracy, or just plain bullshit), you’re on borrowed time. Eventually, something breaks.

    For most people, the break is delayed. Their nervous systems are better at ignoring subtle signals, overlooking contradictions, smiling politely at insanity. But not everyone is built that way. Some of us (call it autism, ADHD, or whatever label feels comfortable) are wired to notice when reality no longer makes sense. We register the noise, the contradictions, the meaningless loops, and we can’t just ignore them. Our bodies won’t allow it. So we start to collapse. And what gets diagnosed as pathology is a nervous system screaming that the feedback loop is broken.

    From the very beginning (even in the womb), this sensitivity registers environmental incoherence. Prenatal studies show clear links between maternal stress, inflammation, and immune activation and later diagnoses of autism. Does sensitivity emerge as the fetus adapts to distorted biochemical signals? Other evidence points to differences in fetal movements, heightened responsiveness to sensory input, and physiological issues present from birth…feeding difficulties, gastrointestinal problems, connective-tissue disorders. Are what clinical medicine calls “comorbidities” (conditions like Ehlers-Danlos Syndrome, POTS, immune dysregulation) actually somatic reverberations of a system built to sense and react vividly to its environment? Are they dysfunctions? Or the body’s early protests against misalignment?

    My whole life’s been an exercise in adaptive mimicry, tracking the subtle shifts in other people’s expectations, moods, and preferences, adjusting my accent, my mannerisms, even my damn opinions…not out of manipulation but from an inescapable instinct to stabilize the feedback loop. Coral reefs do it. They adjust constantly, subtly, responding to every tiny environmental shift. Every feedback-sensitive form of life does it. And when we see it in “nature” (reality), we celebrate it as symbiosis. But in humans, it’s dismissed as social mimicry or conflated with other strategies to mesh with incoherent systems…masking, people-pleasing, and others. We pathologize the sensitivity instead of questioning why the environment is so hostile to genuine responsiveness.

    This isn’t personal. It’s structural. Civilization runs on simulation. It replaces direct, responsive feedback with symbols (money, status, language) and treats those symbols as reality. Dominance, transient and responsive in the natural world, becomes permanent and unquestionable. Submission signals, which in other animals lead to de-escalation and mutual benefit, become invitations to exploitation in humans because power has become abstracted, detached from consequence.

    These truths surface in our art and entertainment. The nonverbal humans in Planet of the Apes (especially in the reboot trilogy) aren’t primitive or diseased. They’re people who’ve fallen out of the symbolic order. They’ve stopped simulating. They’ve lost their language, their narrative, their ability to pretend. And that terrifies the verbal humans, who see this not as honesty…but as infection. RFK Jr. and those like him talk about an autism epidemic. They’re terrified of the collapse of the simulation. They’re terrified of feedback-sensitive bodies that can’t pretend anymore.

    There’s a brutal, beautiful irony here. Wherever civilization diagnoses autism, it diagnoses itself. Wherever it diagnoses ADHD, it diagnoses itself. These are biological signals registering polluted feedback loops that we’ve all been forced to accept.

    Life doesn’t survive the civilizing process. It never has. Indigenous people in deep relationship with the land? Gone. Coral reefs? Bleached ghost towns. Rainforests? Razed for palm oil and burgers. Every morning, 150 fewer species wake up. Civilization spreads across the Earth knocking over every form of life in its path, starting with the most deeply rooted in reality and working its way up the chain. Like a row of dominoes, the more connected you are to the truth of the world, the sooner you fall.

  • Premises

    1. Life depends on feedback. Touch a hot stove, you pull your hand back. Miss a meal, your stomach growls. That’s the cost of staying alive. No feedback, no adjustment. No adjustment, no survival.
    2. Coherent systems return meaningful feedback. The message gets back to you…fast, clear, and close to the source. Late, vague, or secondhand? That’s not feedback. That’s noise.
    3. Feedback sensitivity is a life strategy. The sooner you feel the shift, the sooner you adjust. Birds don’t wait to see flames…they leave the forest when the smoke changes. That’s how they survive. And if others are paying attention, that’s how they survive too.
    4. Feedback sensitivity is adaptive…except in systems that stop listening. In coherent environments, early response keeps things from falling apart. In incoherent ones, the early responder looks like the problem. Coral reefs bleach faster than open oceans. Sensitive species die off before generalists. The ones that feel first go first—not because they’re weak, but because they’re on time.
    5. Civilization is a recurring failure mode. In this book, it doesn’t refer to a culture, a stage, a place, or a people. It’s not a noun. It’s a verb-process, like pacificATION, colonizATION, industrializATION. CivilizATION is what happens when feedback loops are systematically severed. It doesn’t start with malice. It starts with a simple desire to feel safer, more stable, more in control. It is a systemic overlay that offers short-term solutions to risk, discomfort, and unpredictability—by replacing feedback with control. Over time, that control becomes structure. The structure becomes ideology. And pretty soon, you’re draining rivers to grow cotton in the desert. The system begins to preserve itself at the expense of the reality it was meant to navigate.
    6. Civilization sustains unsustainable behavior by muting the alarms. It silences the very signals that would restore balance. The soil thins, the insects vanish, the forests catch fire…but you still get strawberries in February. Grievance is branded as incivility. Burnout as poor performance. Illness as mindset. As long as it looks fine from a distance, the system says, “Carry on.”
    7. Civilization replaces feedback with simulation. It doesn’t listen…it models. It swaps real signals for proxies: dashboards instead of dirt, sentiment scores instead of rage, GDP instead of wellbeing. The field is dry, but the chart looks good. The hunger is real, but the algorithm says engagement is up. The system isn’t responding to life anymore…it’s managing a story about itself.
    8. Power concentrates where feedback can’t reach. Without constraints, influence flows toward those who are least responsive to consequence. Oil execs don’t drink from poisoned rivers. Tech billionaires don’t live by the cobalt mines.
    9. Systems reward what they need to survive. Civilization needs denial, so it promotes the people best at it. The ones insulated from the heat, from the alarm, from the sound of coughing. Empathy doesn’t scale here. Disconnection does. Power concentrates in feedback-insensitive actors. CEOs who can’t answer a question and leaders who can’t finish a sentence…and still win. Here, insensitivity to consequence looks like advantage. Confidence untethered from accuracy looks like competence. Detachment from ecological and emotional reality looks like strength. The less you notice, the farther you go.
    10. Civilization doesn’t care who builds it. It doesn’t care what you believe, what you promise, or what flag you fly. Power concentrates anywhere feedback is severed. The pattern repeats across time, across geography, and across ideologies. This isn’t a capitalism problem. It isn’t a Western problem. It’s a systems problem. Socialist dreams turn authoritarian. Forest tribes become human-sacrificing empires. The Age of Reason ends with Donald Trump. Good intentions don’t stop it. Neither do labels, revolutions, or reforms. When systems stop responding to signals, they start rewarding those who can operate without them. Power doesn’t corrupt…it collects where correction can’t reach.
    11. Collapse is a positive feedback loop. Every missed signal makes the next one easier to ignore. Like turning up the music to drown out that weird noise your car’s been making. Like watching a field fail year after year and blaming the weather…while doubling down on herbicides. The more insulated you are, the more in control you feel…right up to the moment the wheels come off.
    12. The sensitive fall first. We break down in response to signals others no longer perceive. We scream or cry at the news while everyone else shrugs and scrolls. We burn out while they call it “business as usual.” But our suffering is timely, not excessive.
    13. Our breakdown gets framed as the problem. Systems that depend on silence treat sensitivity as a threat. Call out harm? We’re unstable. Refuse to adapt? We’re defiant. Break down? We’re disordered. Say it’s too loud to think? We have attention issues. Easier to medicate signals than fix systems.
    14. Try to bring feedback back in, and the system pushes you out. Telling the truth is disruptive. Showing distress is personal failure. Refusing to play along is insubordination. Whistleblowers are prosecuted. Protestors are kettled. Burnout is a performance issue. The system’s fine with collapse…unless you name it out loud.
    15. In polite systems, feedback doesn’t get crushed…it’s ignored with a smirk. We’re not punished, we’re “too intense.” We’re not silenced, we’re just “not a good fit.” Say something real and we’re laughed at, labeled unstable, dramatic, extremist, naïve. We’re reduced to identity (“just a kid,” “just a woman,” “just autistic,” “just rationalizing failure”) and treated as if we’re making people uncomfortable, not making sense. Greta stood in front of the UN, said exactly what needed to be said, and got turned into a punchline. If we can’t be diagnosed, we’re mocked. If we can’t be mocked, we’re ghosted. In systems built on image, truth is just bad optics.
    16. The more civilization rewards disconnection, the more power flows to the least sensitive. This is part of collapse’s positive feedback loop. The people rising to the top of institutions are those least responsive to feedback, while the people most responsive to it are burning out in classrooms, boardrooms, and waiting rooms. One side gets elected. The other gets diagnosed. It’s not just a mismatch…it’s a systemic inversion. The people who feel what’s wrong are told that feeling is the problem. We’re difficult. We’re rigid.
    17. The sensitive don’t go numb. Not because we’re defiant, but because we’re still connected. Neurologically. Physically. Emotionally. What looks like defiance is just coherence in a system that can’t tolerate it. But we’re not rebelling. We’re responding.
    18. To survive, we’re asked to suppress our perception. Masking, burnout, and self-ostracization become survival strategies. Not for thriving, but for staying tolerable to others. We start to believe that the problem is us. The traffic isn’t too loud to think, after all. I’m just difficult. The flickering fluorescent lights aren’t too bright, after all. I’m just too sensitive. As systems drift further from reality, the gap between what we feel and what we’re told keeps widening. That gap has a name. It’s called suffering.
    19. Our suffering is the last internal signal the system still returns. When all other loops are broken, our distress is the only thing left telling the truth. Exhaustion means stop…not toughen up. A lie means not-true…it doesn’t come in colors. But the system calls it a malfunction.
    20. The system can’t hear us. It reads accuracy as instability. Refusal as defiance. Collapse as personal failure. It doesn’t register signal…only disruption.
    21. Collapse isn’t sudden. It’s the final message from every signal the system refused. Every warning mocked. Every breakdown misread. Every truth sidelined. Dry wells. The teacher who quits mid-year. The kid who stops talking. They weren’t disruptions…they were course corrections. Collapse is the feedback that happens when you silence all the others.
    22. What the system calls dysfunction is often diagnostic. Autistic shutdown in a world of meaningless activity. ADHD “hyper”activity in environments devoid of species-appropriate novelty. “Pathological demand avoidance” in the face of relentless, arbitrary demands. “Hyper”fixation in a culture that interrupts everything. “Rigidity” in a world cut off from natural cycles. These labels don’t describe us. They describe conditions. Conditions that no longer support life.
    23. Collapse is never a glitch. It’s the return of feedback in force. What got silenced comes back louder. What got ignored shows up everywhere.
    24. Our distress isn’t a flaw. It is the cost of staying real in a system that rewards denial. Not by choice, but by the configuration of our nervous systems.
    25. Civilization unfolds as an amplifying oscillation between feedback severance and forced return. Pick up a history book. Each time it suppresses feedback, the eventual correction comes with more force, more velocity, less predictability. Like pushing little Timmy on the swing: each shove sends him higher, and each return is faster, harder to catch, more dangerous to stop. Each push moves the system further from coherence, until collapse is not a break, but a long-overdue arc completing itself.

    “Life depends on feedback.”

  • Civilization as a Process

    I’ll try to sell you on my redefinition of “civilization.”

    I don’t use the word to mean culture, or cities, or institutions (per se), or human flourishing. I use it more like a verb…a process, like pacification, colonization, industrialization. Something directional, something that happens to people and places, rather than something they just are.

    It’s a pattern.

    To me, it’s what emerges when a group starts suppressing feedback loops…not necessarily out of malice…out of a desire to feel safer, more stable, more in control. It starts with buffering risk, avoiding discomfort, stretching growth, the usual. And at first, those choices help. Of course they do. They solve short-term problems. But the structure that builds around those solutions eventually starts to depend on not feeling.

    The system grows by keeping certain signals out. Overriding ecological cues, social tension, moral contradiction, bodily distress. The more successful it is at doing that, the more vulnerable it becomes when feedback inevitably returns.

    Whether through collapse, revolt, exhaustion, or ecological breakdown…whatever was suppressed or severed doesn’t disappear. It just builds up behind the dam. You see this clearly in human-driven desertification, for example, but also pretty much ANYWHERE this “civilization” process tends to wander (including in your own body…ignore its signals for long enough and that feedback returns all at once as cancer, diabetes, etc.).

    So the pattern becomes this kind of oscillation: first, the severing of feedback, then the return of that feedback in the form of collapse. Then the rebuilding (new tools, new methods, maybe even new ideals), but the same structure at the core…suppress the signal, preserve the behavior.

    Each cycle gets a little more elaborate. A little more buffered. A little more ambitious. Of course it does. It’s able to build on the previous iteration’s feedback severances. Rome builds all kinds of cool shit. Rome collapses. But we don’t need to reinvent its successes. We pick up where it left off.

    When it breaks, it breaks harder. Every time. Because the feedback loops that were broken were bigger ones. More crucial ones. And they were severed for longer. More effectively.

    It’s not a linear rise-and-fall story. It’s more like an amplifying spiral…same pattern, but each swing goes wider, each crash digs deeper. Pushing a kid on a swing….every push goes higher, is a little easier, and comes back stronger.
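
    If it helps to see the shape I mean, here’s a throwaway Python sketch…purely illustrative, every number invented, nothing measured. It just encodes the structure: a suppression phase where signals pile up behind a buffer, a return phase where the whole backlog comes back at once, and a rebuild phase that starts the next cycle with a bigger buffer.

      # Toy sketch of the amplifying oscillation described above.
      # All parameters are made up; only the shape of the output matters.
      def amplifying_oscillation(cycles=5, initial_buffer=1.0,
                                 buffer_growth=1.6, signal_per_step=0.1):
          buffer_capacity = initial_buffer
          for cycle in range(1, cycles + 1):
              pressure = 0.0
              ignored = 0
              # Suppression phase: signals keep arriving, nothing responds,
              # and pressure quietly accumulates behind the buffer.
              while pressure < buffer_capacity:
                  pressure += signal_per_step
                  ignored += 1
              # Return phase: the buffer fails and everything that built up
              # comes back at once.
              print(f"cycle {cycle}: {ignored} ignored signals, "
                    f"correction of {pressure:.2f} returns all at once")
              # Rebuild phase: the next cycle starts from this one's
              # severances, i.e. with a bigger buffer.
              buffer_capacity *= buffer_growth

      amplifying_oscillation()

    Run it and the corrections grow cycle over cycle…that’s the whole point. Wider swings, harder crashes.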

    That’s why I don’t see “civilization” as the inevitable endpoint of human social evolution. It’s not the natural form of scaled human life. It’s just one possible configuration. But it’s the one we’re in, which makes it bloody hard to question. I think it was Shaw who said patriotism is believing your country is the best because you were born in it? Civilization as the best (or the only) because you’re in it. Presentism, or something.

    There are other ways groups can grow. Other ways people can organize complexity…ways that don’t require suppressing sensation, displacing consequence, or overriding the living world. Obviously. Every group in history that lived adaptively without being part of this process I’m talking about is saying “duh” from the pages of old books and in the oral traditions of their descendants.

    This process….this civilizATION process…isn’t the default. No one I know would actually do, with their own hands, the things they let civilization do for them. So the pattern itself is the divergence. And any living thing still sensitive to real feedback becomes divergent to IT. Necessarily. The more the system diverges from feedback, the more of those living things look divergent within it. But they didn’t diverge. It did. Christ, I really managed to make that confusing, didn’t I? It’s late.

    Anyways, if you can start to see civ that way…not as some culmination of humanity, but as a particular coping mechanism that’s gotten out of hand…it becomes a lot easier to realize its explanations for things like cognitive divergence are just….ass-backwards. Those explanations aren’t just a little off…they’re delusional. I don’t expect you to be convinced…I’m still developing the language for this (and the ideas themselves, frankly). But think on it, maybe. Test it. I walk around seeing feedback loops now…where they’re broken, why, and who and what that affects.

  • I’m “divergent” from WHAT, exactly?

    Civilization is a system that diverges from reality. Its function is to preserve unsustainable human behavior against natural feedback. It accomplishes this by suppressing, distorting, and severing ecological and biological feedback loops. As it becomes more effective at doing so, the living systems that depend on feedback to remain coherent (forests, animals, people, ALL of life, ultimately) begin to break down.

    Feedback sensitivity, like every trait, exists on a scale. So it’s no surprise that the organisms most sensitive to feedback are the first to suffer when that feedback is polluted or withheld.

    Civilization gaslights by portraying feedback sensitivity as the deviation, when in fact it is the system itself that has broken from reality. Clearly. The evidence is everywhere it touches life: destroyed species, destroyed ecosystems, destroyed peoples.

    But within its dominant framework, “neurodivergent” becomes a catchall for anyone whose nervous system fails to function “normally” within an environment that is fundamentally maladaptive.

    It bears repeating: the system you grieve being excluded from is maladaptive to ALL life. This isn’t a contentious statement. Turn on the news. You know it’s true. You feel it.

    The “norm,” the neurotypical person, is a hypothetical construct. It describes someone who can survive and thrive outside of reality, inside civilization’s distortions. But that person doesn’t exist. There are only people who appear to tolerate those distortions in the moment. Their bodies and minds are in deep distress, but the feedback doesn’t register right away. It shows up later as depression. Anxiety. Diabetes. Chronic inflammation. Autoimmune disorders. Panic attacks. Doomscrolling. Dissociation. Insomnia. And they look to their captor for solutions. Plastic surgeries. Weight-loss drugs. Self-help. Workplace wellness seminars. Sugar. Alcohol. Netflix. Adderall. SSRIs. Ambient music. Mindfulness apps. Therapy dogs.

    We need to stop speaking civilization’s language. We need reality again as a context. I’m so tired of validating the mass psychosis of broken systems.