Xinye Tao | Blog - Collection - Résumé | RSS
# non-fiction, 2026-03-16
In recent months I noticed a shift in my information feed. More and more people are starting to write; the essay world is booming in quantity. Yet few pieces are well-crafted in the traditional sense. In hindsight, this feels like a natural effect of AI-assisted writing: the barrier between idea and essay has become so low that the community begins to prioritize speed over quality, discovery over appreciation. But it got me worried about what else might be unfolding that we aren’t prepared for.
It is easy to draw a resemblance to the mass production of goods, which put an end to the artisan as a profession. Similarly, AI is causing skill devaluation and a junior crisis. At the forefront, software engineers have generally moved on from denial to agreeing that the future of programming will operate on a completely different paradigm. A role transition from craftsman to manager is required to get us there.
Mass production also changed our relations with everyday things, which is a potential preview of how we’ll approach thoughts differently. In the age of handcrafting, things were irregular. That gave them the capacity to hold context, to communicate the intent of the maker, and to witness the life of the user. Mass-produced items, meanwhile, are homogeneous by design; they are clones of an abstract function, or symbols of a brand and style. They have no origin story, and even when they do, it is full of fabrication. They rarely blend into our lives. We are secretly afraid of these alien species, and always jump at the first chance to dispose of them (for something new and unfamiliar), no regrets. In migrating to mass production, the “thing” is reduced to a hollow concept of itself, severed from people. That is a much more profound impact than job market shifts.
AI is different though, one may think. AI doesn’t do clones; cloning is valueless in the digital realm. AI interpolates. And as AI’s operator, the human stays in the loop by providing context: perspective, bias, personality.
Sadly, I don’t think that is the story we are in. I believe we are only going to become less and less specific in our thinking, and therefore marginalized from the process.
There is always a tendency to be unspecific. In forming and communicating thoughts, specificity is expensive, something we’d love to do without. The human brain stays vague and succinct unless forced otherwise. Anyone working with coding agents knows that it usually takes 90% of the tokens and time to pin down the last 10% of the details. This is not a recipe for exponential growth. I anticipate we will overcome the hurdle eventually, either by finding ways to extract context without manual prompting, and/or by learning to manage and ship herds of under-specified ideas. (The latter is different from adding one layer of abstraction, because as soon as that added layer becomes well-defined, AI will be pushed to build on top of it.) In either case, specification is de facto outsourced to AI. We eventually interact with well-specified artifacts, but find it difficult to verbalize them without help from AI. (It is reminiscent of the relationship between cultural critics and the mass population.)
But we haven’t fully indulged that tendency so far, thanks to incentives that will weaken in the future:
Reality is specific, full of details; hence specific thoughts are also truer thoughts. The community rewards true thoughts because they are useful in practice and present a path to shared understanding. But as it currently stands, the world is becoming more divided than ever, devoid of the concept of a common life. The amount of information being generated globally is simply too much to converge and be absorbed. Specific thought, faithful to a local reality, will resonate poorly with a wider audience. Vague thought, or a fragment of a specific thought, is more likely to be compatible. One form of this is polarization and extremism, which we’ve seen enough of. Another likely form is for AI to break down raw ideas and tailor them for individual readers. In fact, we have already seen it in practice with short videos, the hallmark of shallow thought. They are slop generated by humans but isomorphic to the AI systems that triage and promote them. And precisely because they lack depth, they are easy to adapt and conform to viewers of any kind.
Specificity is also a signal of cognitive superiority. Thoughts with structure and nuance are harder to build and harder to comprehend. And in general, the cost of modifying a pre-existing thought is no cheaper than formulating your own. They are proofs of unforgeable intellectual work. But that is clearly no longer true with AI, which, in addition to being a private tutor, can pirate any original thought and generate infinite variations of it, or simply build an equally complex one from simple prompts.
How will it feel when thinking loses focus and the boundaries between thoughts blur? Zygmunt Bauman used the word “liquid” to describe the general quality of modernity. It is a powerful metaphor that can be borrowed here as well. Liquid thoughts no longer exist in singular, rigid forms; they are perceived and manipulated in a continuum, endlessly altered, distilled, and collaged. People no longer think to express an original view, and no longer listen to understand. What remains, and will prevail, is thinking as a political gesture, because the conviction of thoughts, or the lack thereof, ultimately decides the shape of our physical world.
Every time in history, liquefaction has brought faster circulation. As the line in 007 goes: “The world is arming faster than we can respond.” People who find pleasure in deliberating, readers and writers of long-form, well-structured, coherent pieces, may soon be forced to accept a different landscape. And for that I am sorry.