AI skills: a catch-all term for a major misunderstanding


Today, everyone is talking about “AI skills.” They are everywhere: in job descriptions, training plans, and even in job interviews, where they have become an evaluation criterion that is as common as it is vague.

A few days ago, a friend told me that she had just been passed over for a job because she didn’t have the so-called AI skills. She wasn’t a data scientist, nor was she an engineer, but an experienced marketing director, well-versed in project management and data analysis. The remark left her perplexed: what exactly was expected of her? To know how to code? To understand how a language model works? Or simply to be able to say, without hesitation, that she “used AI”?

To me, this is a sign of unease. In its efforts to make artificial intelligence a must-have, the business world has created a category that is as broad as it is undefined, a kind of semantic cloud that mixes machine learning engineers, ChatGPT users, and executives in search of modernity. It’s a reassuring category because it seems to say “we’re part of the movement”, but it’s dangerous because it no longer says anything concrete.

In short:

  • “AI skills” has become a vague, ubiquitous catch-all term, used indiscriminately in business, creating unclear and sometimes counterproductive expectations, particularly for non-technical profiles.
  • Attempts at definition distinguish between technical skills (programming, algorithms) and usage skills (critical thinking, understanding bias), but each organization projects its own interpretations, which are often more symbolic than concrete.
  • For executives, AI skills should be strategic: it is not a question of adopting the tool, but of rethinking value creation and decision-making models, which few businesses manage to do fully.
  • For non-technicians, AI skills are more about human capabilities: discernment, responsibility, understanding of uses and their concrete implications in the business.
  • True AI competence does not lie in the use of a tool or mastery of the prompt, but in the ability to articulate humans and machines in a coherent system, translating intention into real organizational transformation.

For years, international reports and consulting firms have been trying to define what this concept encompasses. The OECD sees it as a combination of technical and behavioral skills: you need to know how to handle data, of course, but you also need to think critically, communicate clearly, and collaborate effectively (Skill needs and policies in the age of artificial intelligence).

Salesforce echoes this idea, distinguishing between hard skills such as mastery of algorithms, programming, and data set structuring on the one hand, and so-called usage or “AI literacy” skills on the other, in other words, the ability to understand the limitations, biases, and use cases of a tool (The 10 AI Skills You Need to Thrive in Today’s Job Market).

More recently, Anthropic has attempted an elegant synthesis: delegation, description, discernment, diligence. In four words, the essentials are summed up: knowing when and why to use AI, knowing how to formulate an intelligible intention for it, knowing how to judge the relevance of its response, and above all, knowing how to remain responsible for what we do with it (AI Fluency: Framework & Foundations).

But despite these efforts, I find that there is still a great deal of ambiguity because each organization projects its own level of understanding of the subject onto the term “AI fluency”. Some see it as technical know-how, others as a mindset, and still others as intellectual curiosity. But most, in truth, see it as nothing more than a prestigious label, something you have to display so as not to appear behind the times.

This week, I had some fun asking this question to five friends who work in HR or recruitment. Judging by the silence I got in response, looking for AI skills when recruiting non-technical profiles is shaping up to be a real disaster.

For executives, AI expertise is primarily strategic

Many executives confuse adoption with maturity. They think they have reached a milestone as soon as they have organized an acculturation seminar, launched a few internal pilot projects, or posted an AI ethics charter. All of this is useful, of course, but it often amounts to nothing more than a smoke screen: symbolic adoption without any real change at a deeper level.

AI competence for a leader is not about manipulating a tool, but about seeing beyond it and understanding that the real challenge is not to use AI, but to extract measurable value from it.

Managing AI is not about managing technology, it is about managing the change it imposes: the decisions it shifts, the trade-offs it redistributes, the value creation models it recomposes.

However, most businesses stop halfway. They do a lot to adopt AI, but little to take advantage of it, when the real challenge is to move through three stages: from adoption to productivity, then from productivity to value (Technologies sell productivity, but businesses want revenue and Who is handling your artificial intelligence projects? Probably not the right people.).

The first step, adoption, involves familiarizing yourself, testing, and communicating. The second, productivity, involves integrating AI into processes, adjusting decision cycles, and reducing operational friction. But the third, value, requires something else: a redefinition of priorities and a detailed understanding of how technology is changing the economic chain.

And this is where most people get stuck, because this transition from use to value is uncomfortable, as it requires measuring, prioritizing, and sometimes reinventing one’s offering.

Beyond technical profiles: the human side of artificial intelligence

For most employees, “AI skills” are not a matter of technological learning but rather a form of cognitive and professional maturity. It is the ability to work in an environment where machines reason without understanding, respond without thinking, and assist without judging.

Studies converge and highlight the vagueness of the concept and a lack of “methodological confidence” rather than technical skills (6 barriers to AI skills development). Teams know how to use the tools, but do not always know where to place them in their business logic.

In other words, AI competence does not lie in the manipulation of tools, but in the way we think about their use, in the clarity with which we connect intention, context, and result. These are human skills, skills of judgment, discernment, responsibility, slow skills that are acquired over time, through experience and confrontation with very concrete situations, not through mastery of a tool.

The fleeting myth of prompt engineering

In just one year, “prompt engineering” has become the symbol of this misunderstanding. It has been turned into a skill in its own right, a market, almost a profession. Training courses, “recipes”, and “magic prompts” that are supposed to increase productivity tenfold are being sold, but in reality this is a transitional period, a zone of adaptation between two technological ages.

This fascination reveals less the power of the tools than our inability to grasp their internal logic. As long as interfaces remain textual and context is not natively understood, the art of prompting has a use, but it is destined to fade away as models increasingly understand the user’s intention, role, and profession and, above all, disappear behind everyday tools (Nobody wants to prompt).

What will remain, however, is the ability to translate an intention into something operational. Knowing how to say what you want, why you want it, and how you will judge the result. The rest, the syntax, the tricks, will fade away.

Ultimately, it’s a question of organization

If we strip away the hype, there is only one real issue left: how the organization coordinates humans and machines. AI expertise then becomes a skill in systemic orchestration: understanding how tools change information flows, how they redefine the distribution of tasks, and how they shift responsibilities.

By its very nature, AI forces businesses to ask themselves fundamental questions: what do we expect from humans? From machines? How can we ensure that the two complement each other rather than cancel each other out?

And that’s when we realize that we need a rarer skill, one that connects meaning, technology, and the collective, and that moves the discourse of innovation to the practice of change.

Bottom Line

There is a lot of talk about “AI skills,” but no one ever talks about what they really mean: not isolated expertise, but a collective ability to understand and drive a paradigm shift.

As long as the word “AI” remains an adjective, as in AI skills, AI strategy, or AI leadership, it will continue to be perceived as a separate domain, an appendage of reality. The day it disappears from everyday vocabulary is the day it will finally have been integrated, and that is when we will know that we have reached true maturity (Technology is a word that describes something that doesn’t work yet (Douglas Adams)).

To answer your questions…

What does “AI skills” really mean?

“AI skills” refers both to technical mastery of the tools and the ability to understand their uses, limitations, and impacts. It is not just about knowing how to code, but knowing how to articulate the relationship between humans and machines. Each organization has its own interpretation, which is why there is currently some confusion. True competence lies in discernment, responsibility, and the ability to create value with AI.

Why does this concept create misunderstandings in recruitment?

Because many businesses demand “AI skills” without defining them. Good non-technical candidates are sometimes overlooked in favor of a buzzword. However, understanding how to integrate AI into a job is worth much more than knowing how to manipulate a model. Recruiting on this vague basis risks overlooking real skills that are useful in the workplace.

What AI skills are expected of a manager?

For an executive, AI competence is not about knowing how to use a tool, but about understanding the change it brings about. It consists of steering the transformation, measuring the value created, and integrating AI into the strategy. The challenge is not symbolic adoption, but maturity: moving from technology to measurable value creation.

Is “prompt engineering” a sustainable skill?

No, it’s a transitional phase. While useful today for interacting with models, it will disappear once AI better understands context. The real lasting skill is clearly formulating an intention, a need, and a result criterion. In other words, knowing what you want, why you want it, and how to judge the response.

How can businesses develop real AI skills?

By focusing less on technology and more on understanding. Train teams to think with AI, to connect intention, context, and outcome. AI skills become collective: knowing how to orchestrate human-machine interaction, redefine tasks, and reinforce responsibility. It’s primarily a question of organization.

Visual credit: image generated by artificial intelligence via ChatGPT (OpenAI)

Bertrand DUPERRIN
https://www.duperrin.com/english
Head of People and Business Delivery @Emakina / Former consulting director / Crossroads of people, business and technology / Speaker / Compulsive traveler