Franz Kafka, a lawyer by training and an insurance clerk in Prague, then part of the Austro-Hungarian Empire, was a keen observer of the administrative machinery of his time. His novels, letters, and diaries describe what happens in a world where the organization of things takes precedence over the understanding of people, where procedure replaces dialogue, and where individuals, summoned to comply with incomprehensible injunctions, no longer know whether they are subjects or suspects.
A century later, as artificial intelligence finds its way into the workings of businesses, administrations, and platforms, Kafka is more relevant than ever. We expected AI to simplify tasks, streamline processes, and reduce cognitive load, but in many cases it has instead added a new layer of opacity: a mechanism that decides without explaining, sorts without knowing, and applies rules without trying to understand.
What would Kafka say about these algorithms that punctuate and even dictate our daily lives? What would he recognize in our automated systems of what he repeatedly pointed out: hyper-formalization, the dispossession of “users,” and a machinery that seems to produce powerlessness in the name of efficiency?
So I imagined asking him the question.
In short:
- Kafka already observed a world where procedures replaced dialogue and where individuals found themselves lost in the face of incomprehensible injunctions; artificial intelligence extends this logic by automating decisions without clear explanation.
- AI, which is supposed to simplify tasks, often introduces hidden complexity: forms, implicit rules, invisible decisions, which make systems opaque and difficult to challenge.
- The supposed objectivity of algorithms is called into question: data are approximations, and their use transforms individuals into mere variables within a system that judges without listening.
- Humans do not relinquish power; they are dispossessed of it: they can no longer challenge or engage in dialogue with a machine that makes decisions without a face or identifiable responsibility.
- The fluidity promised by AI also eliminates the friction essential to debate, protest, and humanity itself, establishing a bureaucracy without walls where efficiency takes precedence over expression and meaning.
You have described worlds where people get lost in administrative labyrinths. What do you see today in these systems that integrate artificial intelligence to “simplify” the lives of citizens or employees?
I don’t believe that simplification is what we are really looking for. It’s a facade, a pretext that masks a desire for absolute order. But absolute order is violence in disguise. What I see in your intelligent machines is less simplification than a hidden proliferation. They say that a digital form replaces a queue, but it generates a thousand lines of code, cross-checks, and implicit rules. We remove a counter and create an entire architecture of invisible decisions.
In The Trial, K. does not know what he is accused of. But he receives summonses, he must respond, he must obey. Your algorithms do the same: they do not explain themselves, they notify, they call without saying why. And we feel summoned, not by a human being, but by a logic that has no name, no face, no materiality.
But isn’t it different today, insofar as we talk about data-assisted decisions, objectivity, and the reduction of human bias?
It is dangerous to believe that objectivity is a remedy. This word has become an automatic excuse that dispenses with thinking. Data is not solid ground: it is traces, approximations, mathematical memories. When a system relies on it to judge a person, their potential, their loyalty, their value, it transforms that person into a variable.
In The Penal Colony, the machine engraves the sentence on the condemned man’s body without his knowing it, since it assumes that anyone brought before it is guilty. Your AIs do the same: they grade, they sort, they warn. They write invisible judgments in the file of an employee or an asylum seeker. They don’t listen, they apply.
This ties in with the idea of dispossession. It is often said that humans are losing control. Is that how you feel?
No, they are not losing control. It is being taken away from them, then given back empty.
The most unbearable thing is not losing the power to decide, it’s no longer being able to challenge. A mistake can be contested when you’re dealing with someone face to face, but if the decision reaches you facelessly, if the refusal is signed by a statistical engine, who do you write to? Where does your letter go?
The humiliation doesn’t come from the refusal itself, but from the silence that follows it.
Some businesses claim that AI makes processes “smoother” and “removes friction.” Do you think this is a good thing?
Friction is not always an obstacle. Sometimes it is what allows a conversation to take place or an idea to be born. What you call fluidity is often a removal of resistance, and without resistance, there is no longer any humanity. A button to solve everything, an interface that anticipates your request, a score that precedes you: all this creates a smooth surface where you can no longer hold on.
In The Castle, the surveyor tries to contact the administration. But the further he goes, the further the system retreats. There are no doors, only mediators, interpreters, and hallway gossip. Your era has replaced these mediators with scripts, but the effect is the same: the decision is there, but we don’t know where it comes from. It has been made, but we don’t know by whom, where, when, or how.
AI as a new form of invisible bureaucracy?
Yes, a bureaucracy without walls. Faster, more efficient perhaps, but even more elusive. In the old world, at least, you could get lost in the corridors. There were faces, blunders, pauses. Here, everything is immediate, and in that immediacy, there is no longer any room for complaint, or even surprise.
And yet, many hope that these tools will free up time and reduce mental load… Isn’t that a legitimate aspiration?
It is. But we must ask ourselves: time for what? If we simply reinvest the time we save into a system that constantly evaluates us, then it is not a liberation, but an increase in pressure. We are freed from one task only to be measured on another, and productivity becomes a moral imperative.
I am not against machines. I am against the idea that they know what a life is worth.
To answer your questions…
AI does not actually simplify procedures, but rather shifts complexity behind invisible mechanisms. Whereas traditional bureaucracy still had faces and interlocutors, the automated version strings together implicit rules, codes, and opaque decisions. Users receive notifications rather than explanations and are confronted with abstract logic. This transformation reinforces a feeling of powerlessness similar to that of Kafka’s characters, who seek to understand a system while being caught up in it.
The apparent simplification masks a technical proliferation that is difficult to see. A digital form may appear simpler, but it relies on layers of checks and decisions that are completely hidden from the user. Algorithms notify users without explaining why, leaving them in doubt or under pressure. This lack of visibility makes the procedure more opaque than before and reproduces a Kafkaesque logic where people obey without understanding.
Dispossession comes not only from the loss of control, but from the inability to challenge. A human decision still allows for discussion, but an algorithmic rejection leaves no room for negotiation. The article emphasizes the “silence” that follows the decision and increases the feeling of humiliation. Without anyone to talk to, individuals no longer know where to take their complaints. This lack of recourse becomes a central feature of automated bureaucracy.
The objectivity invoked is misleading. The data are only approximations that do not capture human complexity. Used to sort, evaluate, or judge, they transform people into variables, as in The Penal Colony, where the machine applies a sentence without listening. This supposed neutrality can legitimize unquestionable decisions, reinforcing invisible judgments that affect employees or applicants without them being able to understand them.
There is hope, but there are limits. The time saved may be reinvested in a system that multiplies evaluations, increasing pressure rather than alleviating it. Productivity then becomes a moral standard, rather than a benefit for the individual. This dynamic transforms the promise of relief into intensified constraints, raising the question of the real use of this supposedly freed-up time.
Image credit: Image generated by artificial intelligence via ChatGPT (OpenAI)