When tools designed to explain the world become the rules that govern it, art, creativity, and freedom are at stake. Are we doing with Artificial Intelligence what we once did with Aristotle’s Poetics?
An old story: when describing became prescribing
In literary history, Aristotle’s Poetics is one of the earliest and clearest examples of how an analytical tool can be misinterpreted as a mandate. Aristotle wrote that treatise to describe how tragedy functioned in his time—not to impose rules on future creation. Yet centuries later, above all in Renaissance and neoclassical Europe, his work was adopted as canon. For generations, artists no longer created freely; they wrote according to what was expected of them—according to Aristotle.
AI: a new Poetics for the 21st century?
Today, we face a technology that dangerously mirrors that same drift: Artificial Intelligence. Though originally conceived to assist, enhance, suggest, or analyze, it is increasingly treated as an authority. Algorithms define what we see, what we read, what is deemed efficient or creative. Like the Poetics, AI was created to describe; and like the Poetics, it is beginning to dictate.
No philosopher or emperor is imposing it on us. We are doing it to ourselves—out of convenience, social pressure, efficiency, or fear of being left behind. And with each “recommendation,” each auto-generated text, each decision made by a system trained on past data, we surrender a little of our capacity to choose, to question, to create against the grain.
The vertigo of not being able to opt out: techno-exclusion and update anxiety
In this context, a shared but rarely verbalized feeling emerges: that not using technology is no longer an option, and that disconnecting means “falling behind.” This phenomenon—known as technological FOMO (Fear of Missing Out)—intertwines with a modern form of technophobia by exclusion, where the fear stems not from the technology itself but from the risk of being shut out of the social, professional, or cultural system by not adopting it.
It’s a form of adaptive pressure from the digital environment, reinforced by the network effect: the more people adopt a technology, the more costly it becomes to abstain. Just like writers of the past who felt they couldn’t ignore Aristotelian rules if they wanted to be taken seriously, today’s professionals, creators, and businesses feel they can’t refuse AI without risking their place in the contemporary ecosystem.
That illusion of inevitability is perhaps the clearest sign that a tool has ceased to be a tool—and has become a rule. And rules, when accepted unconsciously, stop being allies and become cages.
Epilogue: Looking again at the instrument
The lesson of the Poetics is urgent and clear: do not confuse analysis with mandate. Do not confuse efficiency with truth. Do not confuse a powerful tool with a moral compass.
Artificial Intelligence can open extraordinary paths. But its power should not eclipse our freedom to create, to dissent, to disconnect—and to look again, unfiltered, at the world beyond the algorithm.