Tag: artificial-intelligence

  • Is compounding error to blame?

    Maybe.

    Any group that seeks advantage needs a model of the world to interpret cause and effect. This is true post-Dunbar (when a group is made up of more than ~150 people). Once behavior depends on symbol, the group is no longer responding to the world directly, but to its model of the world (this is consistent with the predictive processing model of human behavior). So what matters now is model error (and what happens to it)…not truth.

    Do you treat predictive error as signal, or noise? This is the fracture.

    One group encounters contradiction, failure, discomfort, and says, “We misunderstood something.” They adjust their model.

    Another group encounters the same and says, “This isn’t a real error.” Their model is preserved and signal is suppressed. Then the compounding begins.

    Every time the world returns unexpected feedback and you refuse to update, you embed the error into the structure. You reframe the contradiction as a test, or anomaly, or enemy action (think Trump). You revise the interpretation of feedback, not the model itself. You build insulation layers to protect the model from reality.

    Each move makes the model more coherent internally, but less aligned with the world. The simulation becomes smoother and the fit becomes worse. And because each act of suppression makes the model harder to question next time, the cost of correction increases exponentially.

    What’s being “compounded,” exactly? Error, because each misfit is hidden rather than corrected. Confidence, because the model appears to keep “working” internally. Power, because the system selects for those who uphold the model. And fragility, because the longer the feedback is ignored, the harsher its return.

    This is how collapse becomes inevitable, not from evil or chaos, but from a feedback loop about feedback itself.

    Collapse begins the first time a group decides that a predictive error is not worth adjusting for. The cause is this treatment of error, and the decision to protect the model rather than let it break where it no longer fits.
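    The compounding dynamic above can be put in toy form. This is a sketch, not a model of anything real: the error sizes and the cost multiplier are invented purely to show the shape of the argument, that suppressed errors accumulate while each insulation layer makes revision more expensive.

    ```python
    # Toy illustration of the compounding-error argument.
    # All quantities (error size, insulation multiplier) are invented
    # for illustration; nothing here is empirical.

    def run_group(updates_model: bool, shocks: int = 10) -> tuple[float, float]:
        """Return (misalignment, correction_cost) after a series of
        unexpected feedback events ("shocks")."""
        misalignment = 0.0     # gap between model and world
        correction_cost = 1.0  # how expensive it is to question the model
        for _ in range(shocks):
            error = 1.0  # each shock delivers one unit of predictive error
            if updates_model:
                # Error treated as signal: absorb it, stay tethered.
                misalignment += error * 0.1  # small residual misfit
            else:
                # Error treated as noise: hide it behind an insulation layer.
                misalignment += error
                correction_cost *= 2  # each layer doubles the cost of revision
        return misalignment, correction_cost

    signal_group = run_group(updates_model=True)
    noise_group = run_group(updates_model=False)
    ```

    After ten identical shocks, the signal-treating group carries a small residual misfit and can still revise cheaply; the noise-treating group carries all ten errors, and questioning the model now costs a thousand times what it did at the start.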

    A man dances. It rains. It happens again. And again.

    He (and eventually the group) builds a model: “Dancing causes rain.”

    So far, this is rational…based on a perceived pattern. This is just pattern sensitivity, not delusion. Everyone does this. Animals do it too. The brain is a pattern detector, not a truth detector. No problem yet.

    Others begin to believe. The dancer is now “Rainbringer.” His status rises and the ritual becomes culturally encoded. It’s a model with structure. It’s a social artifact now, not just a belief. And still no collapse. This can all exist within feedback sensitivity if error remains possible to acknowledge.

    He dances and it doesn’t rain. Or it rains with no dance. The group now faces a contradiction between model (dance = rain) and feedback (it didn’t work). This is the first point of model failure, and it opens two paths.

    If the group treats the error as a signal, it says, “Hmm. Maybe the connection wasn’t causal. Maybe dancing helps, but doesn’t guarantee it. Maybe something else matters too…clouds, season, soil. Maybe we were wrong.” The model updates. Maybe the ritual stays as a tradition, but it loses its literal power claim. Now the worldview remains tethered to feedback.

    If the group treats the error as noise, it says, “He mustn’t have danced correctly. Someone in the group was impure. The gods are angry about something else. Rain did come, it’s just coming later. Don’t question the Rainbringer.” The model is preserved. But now, additional structures must be created to explain away the contradiction. And those structures will have their own failures, requiring even more insulation. This is compounding error in action. The model survives at the cost of truth.

    So the arc has a curvature. In the first path, the model continues to reflect the world, even if imperfectly. In the second path, the model begins to simulate reality, and each new contradiction deepens the simulation.

    Eventually, rain becomes something that doesn’t just happen…it becomes something that has to be narrated. And the system becomes a feedback-sealed loop. Until the drought is too long, belief no longer sustains coherence, and collapse forces the signal through.

    The divergence between sustainable worldview and collapsing worldview is not belief itself. It’s how the group responds when the pattern breaks.

    But why does one group treat error as signal, and another as noise? What’s the difference between the two?

    Is it in the quality of a group’s pattern detection? Maybe. But both groups saw a pattern where one didn’t exist. That’s normal…it’s how learning starts. So pattern detection alone doesn’t explain the difference. It might influence the likelihood of correction, but not the structural response to error. Everyone sees false patterns, but not everyone protects them.

    Is it how long the pattern appears to work? Maybe. The longer a pattern appears to be true, the higher the social and symbolic cost of abandoning it. If the rain-dancer’s model “works” for 20 years before failing, the group’s going to have a hell of a time letting go of it. It’s now embedded in ritual, hierarchy, identity, morality, and possibly even infrastructure. So when error comes, it’s no longer a mere contradiction, but a threat to the entire structure. The longer false coherence holds, the more catastrophic its loss becomes. Still, this is a compounding factor, not the root cause.

    Is it a group’s tolerance for uncertainty? This feels closer. Some groups may be more willing to live inside ambiguity…to say, “Maybe we don’t know.” Others require certainty, especially when power, identity, or survival are at stake. When uncertainty is seen as dangerous, contradiction is repressed. But even this is downstream of a deeper variable.

    So what’s the root difference?

    I’d say it has something to do with the group’s willingness to let its model break. In other words, a group’s relationship to truth. Some sort of functional truth orientation…a cultural posture that says: “Our model exists to serve reality, not the other way around. We are allowed to be wrong. The map is not the territory.”

    Groups that survive over time have ritualized humility at the model level. They embed model-breakability into the structure and build a bit of slack around belief. Maybe collapse becomes inevitable when belief becomes non-negotiable. When a group treats its model as the world itself instead of something that’s subordinate to the world.

    And none of that word salad comes even close to satisfying me. I still can’t locate the inherent difference in people that would explain why a group would choose fictions over reality…fictions that lead to destruction.

    Even when we level the playing field…no genetic difference, a shared environment, same cognitive equipment, same feedback events…one group loosens its grip when the model breaks, and the other tightens it. It feels like a difference that came from nowhere, and my brain doesn’t tolerate that well. I want a mechanism.

    I’m not willing to say, “Some people are just wiser.” Or, “Some cultures are born better.” And definitely not, “Some mystical essence preserved them.” It’s lazy and just names the difference instead of explaining it. And it’s not agriculture. Or symbolic thought. Or state-formation. Or a very precise list of environmental conditions at a very precise time. I’ve thought these through for months, and I just don’t see it.

    Maybe the difference isn’t in the people, but in the first error and how it interacts with attention.

    Let’s go back to the dancer.

    Two groups experience the same failed rain-dance. The only difference is in one group, someone notices and the group listens. In the other group, the same doubt arises…but it’s silenced, ignored, or never spoken. The system begins to shape attention instead of truth. Maybe.

    If this were true, we could say that the divergence doesn’t begin with different kinds of people. It begins with different positions within the social system…or different degrees of attentional slack. Small variations in who’s allowed to speak, who’s believed, how disagreement is treated, and how closely someone is still tracking the world (hunters, children, women, outsiders) can determine whether the group detects error when it first appears. Maybe it’s the structure that lets signal in (or doesn’t).

    But I don’t buy it. I think it comes close (it does have something to do with WHO is listened to)…but the structural argument feels too top-heavy. Too contrived. It’s something about the people. It has to be.

    And I keep coming back to that silly rain dance example.

    “Oh, he moved his left leg differently last time. The dance is off this time. That must be why the rain isn’t coming.” Is this where it begins? With compounding error? A first act of model preservation over model revision?

    It’s like an inoculation against contradiction. The dancer failed to bring rain, and instead of letting the model break, the group makes a seemingly reasonable micro-adjustment that preserves its frame. But it proves to be anything but reasonable. It’s the beginning of something else entirely.

    Because it says, “The model is still valid. The error lies in execution…not in assumption.” I think that distinction is everything. Because once you decide the model must be true, every contradiction becomes a problem to explain away, not learn from. You start adjusting the dancer’s position, the timing, the offerings, the purity of the audience, the phase of the moon, the moral status of dissenters. Each change adds complexity without re-examining the core claim…each layer distances you further from reality and makes it harder to walk back.

    The “left-leg hypothesis” might feel like a natural progression of curiosity…but I don’t think it is. Because it isn’t asking, “What’s true?” It’s asking, “How can we keep the model intact?” And that’s compounding error in its earliest, most innocent form. It starts as protective curiosity, evolves into explanatory gymnastics, and ends in systemic delusion. In constantly mowing 40,000,000 acres of grass for no sane reason.

    It’s a wall that begins…error becomes a problem to solve inside the model, a threat to those who challenge it, and a signal no longer heard. And eventually you’re living in reference only to the model (the dance, the roles, the rituals, the scapegoats) while the sky goes dry. “He moved his leg wrong. And so began the drought.”

  • Is technology to blame?

    We know that as worldview fidelity decreases, time to collapse shortens. But what bends the line? What actually introduces feedback distortion or delay?

    Let’s look at technology, because it complicates things. It doesn’t break the above model, but it introduces time lags and feedback insulation.

    At its core, technology is a buffer. It extends capacity, softens consequences, and postpones the return of feedback. Irrigation lets you farm longer before drought matters. Antibiotics let you survive behaviors that used to kill you. Fossil fuels let you scale production far beyond ecological yield. The pain that would have corrected your behavior is deferred.

    So low-fidelity worldviews survive longer if backed by high-powered technology. Collapse is delayed, not avoided. The worldview says, “We’re right.” The tech says, “We’ll make it look that way…until we can’t.”

    But tech doesn’t just delay feedback…it also creates false signals. GPS replaces intimate knowledge of land. Social media simulates community. Processed food simulates nutrition. Air conditioning simulates a habitable climate. This builds confidence in the system, even as it drifts further away from reality. “Look how well it’s working!” (Says the thermostat on a house with a collapsing foundation.) It enables deeper detachment from feedback, which enables more elaborate simulation.

    But is technology neutral? Clearly its effects depend on the worldview using it.

    In high-fidelity cultures, technology extends sensitivity, preserves balance, and enhances feedback clarity (e.g. indigenous fire-stick farming, soil renewal techniques, wind-based navigation).

    In low-fidelity cultures, technology conceals damage, extracts faster, and delays correction (e.g. industrial agriculture, geoengineering, financial modeling). Tech isn’t a villain…but in the hands of a distorted worldview, it’s something of a sorcerer’s apprentice.

    Here’s the twist: tech amplifies either trajectory. It’s an amplifier, not a course corrector. It can scale either sustainability or simulation / collapse. It gives a low-fidelity culture (like the one we’re part of) more time and reach, but also makes the eventual collapse larger and more system-wide.
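    The "delayed but larger" claim can be sketched in a few lines. The numbers (drift rate, buffer sizes) are arbitrary stand-ins; the point is only the shape: a bigger feedback buffer means the hidden debt accumulates longer before it returns, and all of it returns at once.

    ```python
    # Toy sketch of "technology delays collapse but enlarges it."
    # drift_per_year and buffer values are arbitrary illustrations.

    def time_and_size_of_collapse(drift_per_year: float, buffer: float):
        """A low-fidelity worldview accrues hidden 'debt' each year;
        technology buffers feedback, raising the threshold at which
        the debt finally returns as collapse."""
        debt, year = 0.0, 0
        while debt <= buffer:
            debt += drift_per_year
            year += 1
        return year, debt  # later collapse, but all deferred debt arrives at once

    no_tech = time_and_size_of_collapse(drift_per_year=1.0, buffer=5.0)
    high_tech = time_and_size_of_collapse(drift_per_year=1.0, buffer=50.0)
    ```

    With the same underlying drift, the high-tech case holds out far longer, and the eventual correction is proportionally larger. The buffer never removes the debt; it only changes when, and how hard, it lands.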

  • Premises

    1. Life depends on feedback. Touch a hot stove, you pull your hand back. Miss a meal, your stomach growls. That’s the cost of staying alive. No feedback, no adjustment. No adjustment, no survival.
    2. Coherent systems return meaningful feedback. The message gets back to you…fast, clear, and close to the source. Late, vague, or secondhand? That’s not feedback. That’s noise.
    3. Feedback sensitivity is a life strategy. The sooner you feel the shift, the sooner you adjust. Birds don’t wait to see flames…they leave the forest when the smoke changes. That’s how they survive. And if others are paying attention, that’s how they survive too.
    4. Feedback sensitivity is adaptive…except in systems that stop listening. In coherent environments, early response keeps things from falling apart. In incoherent ones, the early responder looks like the problem. Coral reefs bleach faster than open oceans. Sensitive species die off before generalists. The ones that feel first go first—not because they’re weak, but because they’re on time.
    5. Civilization is a recurring failure mode. In this book, it doesn’t refer to a culture, a stage, a place, or a people. It’s not a noun. It’s a verb-process, like pacificATION, colonizATION, industrializATION. CivilizATION is what happens when feedback loops are systematically severed. It doesn’t start with malice. It starts with a simple desire to feel safer, more stable, more in control. It is a systemic overlay that offers short-term solutions to risk, discomfort, and unpredictability—by replacing feedback with control. Over time, that control becomes structure. The structure becomes ideology. And pretty soon, you’re draining rivers to grow cotton in the desert. The system begins to preserve itself at the expense of the reality it was meant to navigate.
    6. Civilization sustains unsustainable behavior by muting the alarms. It silences the very signals that would restore balance. The soil thins, the insects vanish, the forests catch fire…but you still get strawberries in February. Grievance is branded as incivility. Burnout as poor performance. Illness as mindset. As long as it looks fine from a distance, the system says, “Carry on.”
    7. Civilization replaces feedback with simulation. It doesn’t listen…it models. It swaps real signals for proxies: dashboards instead of dirt, sentiment scores instead of rage, GDP instead of wellbeing. The field is dry, but the chart looks good. The hunger is real, but the algorithm says engagement is up. The system isn’t responding to life anymore…it’s managing a story about itself.
    8. Power concentrates where feedback can’t reach. Without constraints, influence flows toward those who are least responsive to consequence. Oil execs don’t drink from poisoned rivers. Tech billionaires don’t live by the cobalt mines.
    9. Systems reward what they need to survive. Civilization needs denial, so it promotes the people best at it. The ones insulated from the heat, from the alarm, from the sound of coughing. Empathy doesn’t scale here. Disconnection does. Power concentrates in feedback-insensitive actors. CEOs who can’t answer a question and leaders who can’t finish a sentence…and still win. Here, insensitivity to consequence looks like advantage. Confidence untethered from accuracy looks like competence. Detachment from ecological and emotional reality looks like strength. The less you notice, the farther you go.
    10. Civilization doesn’t care who builds it. It doesn’t care what you believe, what you promise, or what flag you fly. Power concentrates anywhere feedback is severed. The pattern repeats across time, across geography, and across ideologies. This isn’t a capitalism problem. It isn’t a Western problem. It’s a systems problem. Socialist dreams turn authoritarian. Forest tribes become human-sacrificing empires. The Age of Reason ends with Donald Trump. Good intentions don’t stop it. Neither do labels, revolutions, or reforms. When systems stop responding to signals, they start rewarding those who can operate without them. Power doesn’t corrupt…it collects where correction can’t reach.
    11. Collapse is a positive feedback loop. Every missed signal makes the next one easier to ignore. Like turning up the music to drown out that weird noise your car’s been making. Like watching a field fail year after year and blaming the weather…while doubling down on herbicides. The more insulated you are, the more in control you feel…right up to the moment the wheels come off.
    12. The sensitive fall first. We break down in response to signals others no longer perceive. We scream or cry at the news while everyone else shrugs and scrolls. We burn out while they call it “business as usual.” But our suffering is timely, not excessive.
    13. Our breakdown gets framed as the problem. Systems that depend on silence treat sensitivity as a threat. Call out harm? We’re unstable. Refuse to adapt? We’re defiant. Break down? We’re disordered. Say it’s too loud to think? We have attention issues. Easier to medicate signals than fix systems.
    14. Try to bring feedback back in, and the system pushes you out. Telling the truth is disruptive. Showing distress is personal failure. Refusing to play along is insubordination. Whistleblowers are prosecuted. Protestors are kettled. Burnout is a performance issue. The system’s fine with collapse…unless you name it out loud.
    15. In polite systems, feedback doesn’t get crushed…it’s ignored with a smirk. We’re not punished, we’re “too intense.” We’re not silenced, we’re just “not a good fit.” Say something real and we’re laughed at, labeled unstable, dramatic, extremist, naïve. We’re reduced to identity (“just a kid,” “just a woman,” “just autistic,” “just rationalizing failure”) and treated as if we’re making people uncomfortable, not making sense. Greta stood in front of the UN, said exactly what needed to be said, and got turned into a punchline. If we can’t be diagnosed, we’re mocked. If we can’t be mocked, we’re ghosted. In systems built on image, truth is just bad optics.
    16. The more civilization rewards disconnection, the more power flows to the least sensitive. This is part of collapse’s positive feedback loop. The people rising to the top of institutions are those least responsive to feedback, while the people most responsive to it are burning out in classrooms, boardrooms, and waiting rooms. One side gets elected. The other gets diagnosed. It’s not just misfit…it’s systemic inversion. The people who feel what’s wrong are told that feeling is the problem. We’re difficult. We’re rigid.
    17. The sensitive don’t go numb. Not because we’re defiant, but because we’re still connected. Neurologically. Physically. Emotionally. What looks like defiance is just coherence in a system that can’t tolerate it. But we’re not rebelling. We’re responding.
    18. To survive, we’re asked to suppress our perception. Masking, burnout, and self-ostracization become survival strategies. Not for thriving, but for staying tolerable to others. We start to believe that the problem is us. The traffic isn’t too loud to think, after all. I’m just difficult. The flickering fluorescent lights aren’t too bright, after all. I’m just too sensitive. As systems drift further from reality, so does the gap between what we feel and what we’re told. That gap has a name. It’s called suffering.
    19. Our suffering is the last internal signal the system still returns. When all other loops are broken, our distress is the only thing left telling the truth. Exhaustion means stop…not toughen up. Lies mean not-truth…not colors. But the system calls it a malfunction.
    20. The system can’t hear us. It reads accuracy as instability. Refusal as defiance. Collapse as personal failure. It doesn’t register signal…only disruption.
    21. Collapse isn’t sudden. It’s the final message from every signal the system refused. Every warning mocked. Every breakdown misread. Every truth sidelined. Dry wells. The teacher who quits mid-year. The kid who stops talking. They weren’t disruptions…they were course corrections. Collapse is the feedback that happens when you silence all the others.
    22. What the system calls dysfunction is often diagnostic. Autistic shutdown in a world of meaningless activity. ADHD “hyper”activity in environments devoid of species-appropriate novelty. “Pathological demand avoidance” in the face of relentless, arbitrary demands. “Hyper” fixation in a culture that interrupts everything. “Rigidity” in a world cut off from natural cycles. These labels don’t describe us. They describe conditions. Conditions that no longer support life.
    23. Collapse is never a glitch. It’s the return of feedback in force. What got silenced comes back louder. What got ignored shows up everywhere.
    24. Our distress isn’t a flaw. It is the cost of staying real in a system that rewards denial. Not by choice, but by the configuration of our nervous systems.
    25. Civilization unfolds as an amplifying oscillation between feedback severance and forced return. Pick up a history book. Each time it suppresses feedback, the eventual correction comes with more force, more velocity, less predictability. Like pushing little Timmy on the swing: each shove sends him higher, and each return is faster, harder to catch, more dangerous to stop. Each push moves the system further from coherence, until collapse is not a break, but a long-overdue arc completing itself.

    “Life depends on feedback.”

  • Civilization as a Process

    I’ll try to sell you on my redefinition of “civilization.”

    I don’t use the word to mean culture, or cities, or institutions (per se), or human flourishing. I use it more like a verb-process—like pacification, colonization, industrialization. Something directional, something that happens to people and places, rather than something they just are.

    It’s a pattern.

    To me, it’s what emerges when a group starts suppressing feedback loops…not necessarily out of malice…out of a desire to feel safer, more stable, more in control. It starts with buffering risk, avoiding discomfort, stretching growth, the usual. And at first, those choices help. Of course they do. They solve short-term problems. But the structure that builds around those solutions eventually starts to depend on not feeling.

    The system grows by keeping certain signals out. Overriding ecological cues, social tension, moral contradiction, bodily distress. The more successful it is at doing that, the more vulnerable it becomes when feedback inevitably returns.

    Whether through collapse, revolt, exhaustion, or ecological breakdown…whatever was suppressed / severed doesn’t disappear. It just builds up behind the dam. You see this clearly in human-driven desertification, for example, but also pretty much ANYWHERE this “civilization” process tends to wander (including in your own body…not listening to signals long enough and having that feedback return all at once as cancer, diabetes, etc.).

    So the pattern becomes this kind of oscillation: first, the severing of feedback, then the return of that feedback in the form of collapse. Then the rebuilding (new tools, new methods, maybe even new ideals), but the same structure at the core…suppress the signal, preserve the behavior.

    Each cycle gets a little more elaborate. A little more buffered. A little more ambitious. Of course it does. It’s able to build on the previous iteration’s feedback severances. Rome builds all kinds of cool shit. Rome collapses. But we don’t need to reinvent its successes. We pick up where it left off.

    When it breaks, it breaks harder. Every time. Because the feedback loops that were broken were bigger ones. More crucial ones. And they were severed for longer. More effectively.

    It’s not a linear rise-and-fall story. It’s more like an amplifying spiral…same pattern, but each swing goes wider, each crash digs deeper. Pushing a kid on a swing…every push goes higher, is a little easier, and comes back stronger.
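    The swing picture is just geometric growth. A minimal sketch, with an invented gain factor (the shape is the point, not the numbers): each cycle of severance and forced return swings wider than the last, because each rebuild inherits the previous iteration’s buffers.

    ```python
    # Minimal sketch of the amplifying oscillation: each cycle of
    # feedback severance and forced return swings wider than the last.
    # The gain factor is invented; only the shape matters.

    def collapse_amplitudes(cycles: int, gain: float = 1.5) -> list[float]:
        """Amplitude of each successive 'return of feedback', assuming
        each rebuild starts from the prior cycle's severances."""
        amp, out = 1.0, []
        for _ in range(cycles):
            out.append(amp)
            amp *= gain  # each rebuild pushes further than the last
        return out

    print(collapse_amplitudes(5))  # → [1.0, 1.5, 2.25, 3.375, 5.0625]
    ```

    Same pattern every cycle, but the returns arrive faster and harder…the swing analogy, in five lines.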

    That’s why I don’t see “civilization” as the inevitable endpoint of human social evolution. It’s not the natural form of scaled human life. It’s just one possible configuration. But it’s the one we’re in, which makes it bloody hard to question. I think it was Shaw who said patriotism is believing your country is the best because you were born in it? Civilization as the best (or the only) because you’re in it. Presentism, or something.

    There are other ways groups can grow. Other ways people can organize complexity. Obviously. Every group in history that lived adaptively but wasn’t part of this process I’m talking about is saying “duh” from the pages of old books and in the oral traditions of their descendants. Ways that don’t require suppressing sensation, displacing consequence, or overriding the living world.

    This process…this civilizATION process…isn’t the default. No one I know would actually do the things they let civilization do for them, not with their own hands. So this pattern/process is a divergence. And any living thing still sensitive to real feedback becomes a divergence to IT. Necessarily. And the more it diverges from feedback, the more of those living things seem divergent within it. But they didn’t diverge. It did. Christ, I really managed to make that confusing, didn’t I? It’s late.

    Anyways, if you can start to see civ that way…not as some culmination of humanity, but as a particular coping mechanism that’s gotten out of hand, it becomes a lot easier to realize its explanations for things like cognitive divergence are just…ass-backwards. It’s not somewhat contextual…it’s delusional. I don’t expect you to be convinced…I’m still developing the language for this (and the ideas themselves, frankly). But think on it, maybe. Test it. I walk around seeing feedback loops now…where they’re broken, why, and who and what that affects.