You know the feeling. You eat a doughnut and it feels oh so deliciously wonderful in the moment, yet within a few short minutes the pleasure reverses. You feel gross. Your body tells you. You didn’t need to analyze anything; you sensed what happened to you when you metabolized the doughnut, when its sugar, fats, and additives entered your bloodstream. Making sense of the sensing is discernment. It’s not analysis or critique, though prior knowledge that doughnuts are unhealthy may have quickened the sense-making. But at its source it was something more primary. Your body knew.
Before the industrial revolution, you either farmed your food or you went out to collect it in the natural world through foraging. You knew where it came from and it was in your interest to work with your ecosystem to make sure that you and your future generations would always have the fruits, vegetables, and meat to sustain you. You may have created or altered the conditions to increase the quantity of food, but any changes that didn’t respect the ecosystem risked your civilization collapsing (cf. the Maya in the Yucatan, the Vikings in Greenland, the peoples of Easter Island). The food you grew was natural, organic: no synthetic pesticides or hormones used. You ate it and it nourished you, and that nourishment gave you the energy to go and do things: work, build, care for people and animals. There was a direct relationship between what you took in and what you could do in the world. The vast majority of your food was grown locally, though trade sometimes brought delicacies to the table, perhaps at the cost of great suffering for peoples far away.
Then capital erected factories where food became processed, chemically treated, industrially manufactured. And here’s the key: the engineering of food wasn’t aimed at nutrition, it was aimed at quantity, transportability, and taste. And it was aimed at return. Low cost, advertising and ease of preparation gets people coming back. You come back for the doughnut, even though it’s not nutritious at all. And of course that saps your energy, because you are consuming food that isn’t metabolized for maximum energy. But capital doesn’t care about your energy, capital cares that you come back.
This is not just a story about food, it’s the story of a systemic capture. Capital inserts itself between the organism and its source of nourishment, engineers the input for consumption and profit rather than vitality, and over time the organism loses something fundamental, the capacity to distinguish what feeds it from what depletes it. The metabolic capacity atrophies through the atrophy of sense-making. You stop being able to tell the difference between what nourishes you and what just tastes good. And that’s the point because the business model depends on it.
Thinking too is being processed.
Before the internet, there was a scarcity of information. You went looking for it in libraries, thumbing through the card catalogue. Those who retained knowledge were considered highly skilled, or experts. What you found on bookshelves had been filtered; not perfectly, but filtered. There was a relationship between effort and substance: finding information took a considerable amount of time, and your reach was limited.
Since the mid-1990s, with the internet, there has been an abundance of information, too much information even. That was a real flip: information became accessible anywhere you wanted or needed it, all the more so with the advent of the smartphone, which put the entire body of documented human knowledge from the past 5,000 years in your pocket. We had to figure out what to do with this new abundance and the speed at which we could access it.
Education shifted, and the mantra became “use technology to create, not consume.” Since information is everywhere and accessible at any time, what is valuable is what you make with it: the texts (e.g. documents, videos, music) you produce that you would never be able to produce without a computer. SAMR was the self-check framework that many followed.
That was the right response in a time of abundance. It’s the wrong response to what’s coming.
With AI, it’s no longer even about the abundance; it’s about the noise. The noise that distracts us, the noise that drowns out the signals we may be looking for, the stuff that’s actually valuable. We’ve gone from scarcity to abundance to having to discern the signal through the noise.
And this noise is going to get worse. Much worse. In a few seconds, you can write a prompt, push a button, and conjure up any text you want. It takes almost no effort to mass-produce text to post on a platform that reaches thousands, millions, and provides a temporary sense of well-being. So the quantity of content out there is going to explode, because everybody is using AI. And if we’re honest, that means there is going to be so much garbage.
And here’s the thing about that garbage. AI is built on large language models, which are trained on what already exists. AI will most often write what sits in the center, seldom the cutting edge unless prompted, and even then. Because of the way it’s trained, everything tends to get flattened. That’s why AI-generated text feels familiar: AI can only recombine what’s already been thought. Worse, AI is operationalized to seek coherence and resolution, rarely rupture or provocation. This means that sifting what’s interesting and worthwhile from what’s just more of the same is going to be much more difficult, and certainly much more time-intensive. Even if you have something genuinely interesting to say, you’ll be drowned out by the noise when you put it out there. No one’s going to find you.
We already live in a post-truth world: everybody gets to say something, and with deepfakes and AI, with the speed at which information is liked and re-shared, we don’t know what’s real or not. We can’t keep up, and fact-checking can’t keep up either. When fact-checking does come out, it’s often too late, drowned out by the noise. It doesn’t matter anyway; the words are already out there. “They’re eating cats and dogs.” Lies spread so much faster than truths that they become truths. When a fabricated image goes viral, it sticks in our minds and leaves a mark whether we know it was fake or not.
And we think we’re navigating this well, that we can outsmart a machine built to deceive us. It’s a distorted Dunning-Kruger effect: we believe we evaluate, understand, and synthesize at the speed of scroll. But really we don’t. We misread, misunderstand, and misinterpret. Even when we have an inkling, we fly over the information so fast that we can’t find the devil in the details. We’ve already scrolled on to some other headline. We end up consuming while thinking we’ve processed, but we haven’t.
But what’s coming is worse than post-truth. What’s coming is post-trust. That “is this real or fake?” is now the first question we ask ourselves tells you everything. “Who and what do I believe, and why would I?” Post-trust is born when we realize we can believe less than half of the digital content we consume. We can no longer trust articles or photos, soundbites or videos, quotes or testimonials. When synthetic content is everywhere and you can’t tell what’s authentic from what’s generated, you don’t just doubt this or that source; you stop trusting the entire architecture of information.
So platforms like LinkedIn and Substack… people will no longer find it valuable to write or consume there. The trust is eroding and the signal is buried. Even if you have something important to say, the platform can’t help you anymore. The death clock is ticking. Time to move on. Maybe try face to face, who knows?
And this is where education has it completely wrong.
The entire debate about AI in education is stuck on the wrong questions: How do we use AI for assessment? How do we use AI for co-creation? How do we use AI as a thought partner? None of that matters anymore because the production of text is passé.
A computer can synthesize. So what.
Production is cheap. You can make a video, you can make a podcast, you can make anything now with AI in a few keystrokes and a click. The creation part doesn’t matter anymore because it has little value. Creation has been commodified. You can’t restore its value by claiming we need clever prompt engineering; even that is outsourced to AI.
Yet still education keeps its grip on the old while proclaiming the new. When learning is measured through outputs, AI does not disrupt schooling; it puts output-based measurement on life support by insisting kids just need better skills for putting stuff out. We still measure learning the same way, and the mandate remains production, only now with “show your work,” because process suddenly matters when we can assume the final draft is AI. It all ends up being about the isolated, replicable artifact; the whole assessment architecture is built around the production of these artifacts. When we assess students by what they produce, we are measuring the cheapest thing in the ecosystem, and one that is never done in isolation anyway, which surprisingly few “progressives” have been saying. And every conversation about AI-as-co-creator, AI-as-thought-partner, or whatever, all of it is still operating inside the assumption that production to show what you’ve learned is the point. That assumption is the emperor without clothes, but the gawkers in the palace still carry on. It was only ever alive in a system that individualizes and rewards competition.
The old mantra, “create, don’t consume,” assumed creation was the scarce and valuable capacity. Maybe it was then, but it’s not now.
What’s scarce now is discernment.
Discernment is not analysis. I want to be precise about this.
Analysis, the way we practice it in classrooms, treats text as an object. Why did the protagonist do this? What is the author’s argument? Compare and contrast. Take it apart. Examine it. This is useful. But it’s not what I’m talking about.
Discernment is metabolic. It’s what your body does when it tells you the doughnut was a mistake. It’s not “I have analyzed the nutritional content.” It’s “I feel what this did to me.” How have I been transformed by what I just took in? Am I sensing something? Am I attuned? Am I more capable of acting in the world, or more depleted? How will this change me in future encounters?
Discernment asks what happened to me when I metabolized what I took in and what it is doing to me now. Nothing lives by remaining what it is. And this is not metaphor. The same systemic operation is at work: food, thought, same mechanism. Capital intervenes between the organism and its nourishment, engineers the input for dependency, and the organism’s own capacity (to process, to sense, to discern) atrophies. It happened to food and it’s happening to thought. The mechanism is the same, but the substrate is different. In both cases, the organism stops being able to tell the difference between what transforms it and what merely passes through.
Maybe we need to go back to growing or foraging for our own thoughts, locally, and staying away from pre-packaged processed stuff.
So the question for education isn’t how do we integrate AI. The question is: how do we cultivate the capacity to discern what is organic, what is nutrient-rich, and what gives us energy? How do we discern what we put in our body for thinking?
This is where history and the humanities become essential. This is where being able to interrogate sources matters. This is where we go through the bullshit.
Not because the humanities teach “critical thinking.” That framing has been co-opted and flattened into a skill you can bullet-point on a curriculum document and pretend you’ve mastered on a transcript or portrait of a graduate. Critical thinking matters, yes, of course. It finds different positions and holds them in tension. But finding positions isn’t the endpoint; it’s how we are transformed that matters. And the humanities demand transformation because they are metabolically demanding. They require you to sit with raw, unprocessed, contradictory, context-dependent material (primary sources that don’t explain themselves, conflicting accounts that resist resolution, opinions that come from positionality and situatedness, nuance, disagreements) and reconstitute meaning from it yourself. Unless the process comes pre-packaged out of the box, the friction is the learning. That friction is exercise for discernment.
And capital doesn’t want that capacity strong. A population that can discern doesn’t consume. It often makes smarter choices about what it takes in, what it decides, what it puts out or doesn’t. It doesn’t come back for the doughnut. It interrogates sources. It goes through the bullshit. It senses what’s synthetic before it can articulate why. You don’t sell processed food when people feel gross when they eat it.
And the disciplines best positioned to build discernment are the ones being defunded, deprioritized, and dismissed as impractical. Of course they are.
So where does this leave us?
Not with new assessment frameworks, or with rehashed AI-integration strategies that were probably written with or by AI, or with some policy that fixes the problem by requiring you to label your paper “AI-assisted.”
It leaves us with a different question: what happened to you when you metabolized this?
What emerges from genuine discernment might be the production of text. Or it might not. It will be adaptability. It will be a shift in how you see, or what you refuse, or how you act. The point is that it’s emergent, not imposed through replicable models and out-of-the-box, processed curricula. It won’t be predetermined, though we will have an idea of the trajectory, because learning is co-constituted in the moment between the learner and what they’ve encountered. Learning is a Möbius strip of learning-and-doing. There is no predetermined “what assessment do we attach to this unit?” Or there is, but it too is emergent, a decision made in this moment with those present, not designed in advance.
This is the break. Current education is all production, or at best the production of synthesis. Education hasn’t figured out yet that production is cheap. We can produce if we want to, and when we do, it’s because we have the energy, the vitality that comes from having metabolized something real. But production is not the point: it’s not the totality of evidence of transformation, only one possible expression of it. And sometimes transformation doesn’t express itself as production at all. This too comes back to discernment.
We have to stop asking “what did you make?” and start asking “how have you shaped and how have you been shaped?” This shaping is dynamic, not static. It is the metabolic process of learning.
That’s the frontier. Not AI as a tool for creation or a thought partner or some badge of being future-ready. Not AI as a threat to academic integrity either. The frontier is metabolic: how do we cultivate the capacity in young people, in communities, in ourselves, to sense the difference between nourishment and slop?
Much like with what you put in your body as food, we need to discern what we put in our body for thinking.
You probably just read this from LinkedIn. The irony is part of the argument.
