Sun, Mar 29, 2026 11:50
Claudio Nastruzzi wrote an intriguing article in The Register on why AI-generated or AI-edited text tends toward the generic and uninspired. In it, he coins the term “semantic ablation” for the “algorithmic erosion of high-entropy information”. The result is writing that is mediocre in the literal sense, regressed toward the mean: a “JPEG of thought” that remains comprehensible but has lost its original depth of information. Here is an interesting quote:
When an author uses AI for “polishing” a draft, they are not seeing improvement; they are witnessing semantic ablation. The AI identifies high-entropy clusters – the precise points where unique insights and “blood” reside – and systematically replaces them with the most probable, generic token sequences. What began as a jagged, precise Romanesque structure of stone is eroded into a polished, Baroque plastic shell: it looks “clean” to the casual eye, but its structural integrity – its “ciccia” – has been ablated to favor a hollow, frictionless aesthetic.
