
Mar 29, 2026 · 1 min read

Claudio Nastruzzi wrote an intriguing article in The Register on why AI-generated and AI-edited text may be becoming generic and uninspired. In the article, he offers the term “semantic ablation” to describe the “algorithmic erosion of high-entropy information”. The result is writing that is mediocre, akin to a “JPEG of thought”: the text remains comprehensible, but it has lost much of its original informational depth. Here is an interesting quote:

When an author uses AI for “polishing” a draft, they are not seeing improvement; they are witnessing semantic ablation. The AI identifies high-entropy clusters – the precise points where unique insights and “blood” reside – and systematically replaces them with the most probable, generic token sequences. What began as a jagged, precise Romanesque structure of stone is eroded into a polished, Baroque plastic shell: it looks “clean” to the casual eye, but its structural integrity – its “ciccia” – has been ablated to favor a hollow, frictionless aesthetic.
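To make the entropy framing a bit more concrete, here is a minimal, self-contained Python sketch. It is purely illustrative and assumes a toy add-one-smoothed unigram model (the article does not describe any model or code): it sums per-word Shannon surprisal, showing how a “polished” sentence built from high-probability words carries fewer bits than a more idiosyncratic one.

```python
import math
from collections import Counter

# Toy background corpus standing in for "what the model expects".
# (Illustrative assumption only; not from the article.)
corpus = (
    "the text is clear and good and the writing is clear and simple "
    "the results are good the method is simple and the idea is clear"
).split()

counts = Counter(corpus)
total = sum(counts.values())

def surprisal_bits(word: str, alpha: float = 1.0) -> float:
    """Shannon surprisal -log2 p(word) under an add-alpha smoothed unigram model."""
    vocab = len(counts) + 1  # +1 bucket for unseen words
    p = (counts[word] + alpha) / (total + alpha * vocab)
    return -math.log2(p)

def total_information(sentence: str) -> float:
    """Sum of per-word surprisal: a crude proxy for information content."""
    return sum(surprisal_bits(w) for w in sentence.lower().split())

original = "the jagged romanesque stone keeps its ciccia"
polished = "the text is clear and the writing is good"

print(f"original: {total_information(original):6.1f} bits")
print(f"polished: {total_information(polished):6.1f} bits")
# The "polished" sentence reuses high-probability words, so its summed
# surprisal is lower: smoother to read, but carrying less information.
```

The numbers are toy values, but the direction of the effect matches the quote: smoothing a text toward the most probable word choices lowers its summed surprisal, i.e., its information content.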

Author
Wouter Van Rossem
Wouter Van Rossem is a researcher working at the intersection of social science and computer science. He previously worked on the European Research Council (ERC)-funded project Processing Citizenship, where he investigated how data infrastructures for population processing co-produce citizens, Europe, and territory. He completed his PhD at the University of Twente in the Netherlands and continues to work on publications stemming from that project. Alongside his academic work, he brings experience as a software engineer, having worked at various companies and at the European Commission’s Joint Research Centre in Italy. This combination of theoretical and hands-on experience reflects his keen interest in the interconnections between technology and society.