IFKHA MISTABRA or the doctrine of contradiction

Free reflections on C2
Operational commitment

On October 6, 1973, a coalition led by Egypt and Syria launched a surprise attack on the territories occupied by Israel, in what would become known as the Yom Kippur War. The Egyptians in particular deceived Israeli military intelligence by using one of their exercises near the border as cover for the attack. The Israelis dismissed every indication of an imminent offensive, bringing the country very close to total defeat before the situation was turned around. This failure, together with the subsequent mismanagement of the crisis, cost Israeli Prime Minister Golda Meir her job.


Israeli military intelligence had fallen victim to a cognitive bias[1], the well-known confirmation bias. When it is at work, only information that confirms the starting assumptions and the dominant thesis is retained. Divergent elements are at best considered negligible, at worst false, the product of errors or of dubious individuals.

The realm of heuristic biases

All organizations are subject to cognitive biases to varying degrees. Armies are no exception, except that the consequences are sometimes paid in blood, or even with the disappearance of a nation. The place where cognitive biases chiefly take root is the organization's operating procedures, what armies call doctrine.

Doctrine exists at several levels, some of which make it possible to identify the particular cognitive biases expressed there, and above all their sometimes dramatic consequences.

At the top of the spectrum we find what could be called general doctrine, expressed in the founding documents of an army at a given time, and which is often the product of a cultural and historical evolution far more than of a technical one. Thus the pre-war cult of the offensive of 1914 (captured by a phrase that was nevertheless almost two centuries old: "the bullet is mad, only the bayonet is wise"[2]) gave birth in 1940 to the pre-eminence of the defensive over the offensive. The trauma of trench warfare weighed heavily on the doctrinal thinking of the inter-war period, leading to a certain lack of interest in offensive capabilities and manoeuvre. This capacity to voluntarily abandon a whole field of one's own knowledge is known as agnotology[3]. The idea is understandable if we consider that the interpretation of history and public opinion weigh heavily on the functioning of armies, which belong to the domain of the res publica, unlike a private company. In the same vein, one can cite the complete abandonment by the Americans, between the Vietnam War and the first Iraq War, of any reflection on counter-insurgency[4]. Closer to home, psychological operations were abandoned by the French army at the end of the Algerian war.

In the field of intelligence, cognitive biases are also expressed in the way one's enemy is considered. Several errors exist in this respect, the best known certainly being projection bias. It consists in projecting one's own intentions and way of fighting onto one's enemy, making him a double of oneself and thereby treating his capacity for initiative and independent thought as nil. This is how, during the Second World War, the Germans came to coat their main tanks with Zimmerit[5], a paste intended to insulate the tank's armour so that magnetic mines could not attach to it and detonate.

In reality, this innovation was useless: only the Germans used magnetic mines en masse, unlike the Allies. The Germans had "projected" their own doctrine onto their enemies.

This projection bias also applies to the way equipment is used in combat. While the battlefield has always been a place of improvisation, the French "système D", conventional armaments are also sometimes diverted from their traditional uses, causing real surprises in combat. In Somalia in 1993, the militiamen of Mohamed Farrah AIDID used anti-tank rocket launchers to shoot down American helicopters[6]; the Americans had considered the ground-to-air threat negligible. Similarly, in 2015, a Daech-affiliated group on the Sinai coast fired at an Egyptian patrol boat with an anti-tank missile diverted from its "normal" use. This is what one of Murphy's famous laws captures: "professionals are predictable, but the world is full of dangerous amateurs".

Sometimes mere hierarchical pressure is enough to create a climate conducive to intellectual blinkers. What is known as conformity bias, which consists in adopting the dominant thinking of one's organization, is all the more powerful as the chain of command tends to push out non-aligned individuals. The American General DEMPSEY said that on the day he received his fourth star, one of his comrades made him realize that no one would dare tell him the truth any more. Marshal LYAUTEY expressed the same idea when he saw brains shutting down as heels clicked[7].

Without being caricatural, and while it is important to be aware of the cognitive biases at work, it is also reasonable to recognize that an army's doctrine is necessary to its effectiveness. What organization could question its rules of operation every morning and still hope to have a long-term vision? Without doctrine there is no capacity to work together, and therefore no joint or combined combat. No confidence in one's subordinates, and therefore no subsidiarity, which is what makes adaptation in combat effective. All the biases induced by the application of a doctrine serve to make intelligible a situation characterized by an infinity of data to be taken into account: war. It is for this reason that we can speak of the realm of heuristic biases.

Systematizing contradiction: red teaming and misfit toys

Doctrine is therefore necessary to the functioning of armies, but it induces cognitive biases. These biases must be eliminated, or at least identified, by making systematic use of constructive criticism.

To this end, the US Army created the concept of the red team to develop this capacity to criticise the cognitive biases at work in staff work. Although the term red team only appeared during the Cold War, the idea of systematic constructive criticism is actually much older.

After the failure of the Israeli intelligence services mentioned in the introduction, they created a new cell called Mahleket Bahara (the department of control, in Hebrew), quickly nicknamed Ifkha Mistabra. This expression, literally meaning "the opposite is true", comes from the Talmudic tradition, and more particularly from the rabbinic courts, the Beth Din. In this framework, if during a trial the ten judges agreed on the verdict and none of them could present any element contradicting it, the court was dissolved without a sentence being pronounced. It was considered that if no judge was able to contradict the decision taken, the work of the court was biased and therefore flawed. This is why this principle is sometimes referred to as the "tenth man rule".

In the same way, in the thirteenth century the Catholic Church created the office of promoter of the faith, also called the devil's advocate. The individual charged with this function intervened in canonization processes, aiming to prevent any error in the choice of future saints by systematically questioning the miracles attributed to the candidates for canonization.

Taking up this idea of reducing cognitive bias in the decision-making process, the American army created during the Cold War a red team cell intended to question the validity of the hypotheses attributed to the enemy, who is represented in red in staff work, hence the name of the cell. However, it was not until the early 2000s that the American armed forces standardised the organisation, goals and functioning of the red team by creating a training organisation dedicated to the subject, the University of Foreign Military and Cultural Studies (UFMCS).

Because of their origin, red teams are often confused, even today, with cells for in-depth analysis of the enemy, a sort of "super" intelligence office, which they are not, even if a red team can contribute to the analysis of the enemy. Nor is the red team a resident contrarian or an outlet for venting, as the collective imagination might conceive it.

The UFMCS defines a red team as "a group of experienced, trained and educated individuals, providing an independent capacity to conduct critical analysis from an alternative point of view". It is therefore a matter of questioning the validity of a decision-making process, or of its outcome, through constructive criticism based on different presuppositions. The aim is to detect and eliminate the consequences of the cognitive biases unconsciously generated by the staff.

Red teams have three approaches for carrying out their missions. The first, and best known, is simulation. It consists of confronting the behaviour of allied and enemy forces in a war game (the famous Kriegsspiel invented by von REISWITZ in the nineteenth century), a confrontation that nowadays culminates in CAX, computer-assisted exercises, and computer-assisted simulations.
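To make the idea concrete, here is a deliberately minimal sketch of such a simulated confrontation: a toy stochastic attrition duel in Python. It is illustrative only; the force sizes, effectiveness rates and the "surprise" tactic are invented parameters, not drawn from any real CAX.

```python
import random

def simulate_engagement(blue, red, blue_eff=0.06, red_eff=0.05,
                        surprise=False, seed=0):
    """Toy stochastic attrition duel between a Blue and a Red force.

    Each round, every surviving unit on one side has a fixed probability
    of destroying a unit on the other side. If `surprise` is set, Red
    opens with an unmodelled tactic that doubles its effectiveness for
    the first five rounds, the kind of assumption a red team injects.
    """
    rng = random.Random(seed)
    rounds = 0
    while blue > 0 and red > 0 and rounds < 1000:
        eff = red_eff * (2.0 if surprise and rounds < 5 else 1.0)
        blue_losses = sum(rng.random() < eff for _ in range(red))
        red_losses = sum(rng.random() < blue_eff for _ in range(blue))
        blue = max(blue - blue_losses, 0)
        red = max(red - red_losses, 0)
        rounds += 1
    return blue, red, rounds

# Same order of battle, two very different outcomes:
print(simulate_engagement(100, 120))                 # Blue's planning assumption
print(simulate_engagement(100, 120, surprise=True))  # the red team's variant
```

Running the same order of battle with and without the unmodelled tactic shows how quickly a planning assumption can collapse, which is precisely what the red team's scenario injection is meant to reveal.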

Soundings, or probe attacks, are the second tool used by red teams. A sounding is a full-scale attack carried out by an opposing team, aimed at bringing to light the vulnerabilities and flaws of a unit. The classic example is that of a group of hackers legally testing the computer networks of a given entity, as sketched below. By extension, training in open terrain against a real opposing force (FORAD) is also akin to the concept of the sounding.
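As an illustration of the hacker example above, a sounding against a network can be as simple as checking which services answer. The sketch below, using only Python's standard library, probes a handful of TCP ports; the address is a reserved documentation placeholder (TEST-NET), and such probes should of course only ever be run against systems one is authorized to test.

```python
import socket

def probe(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    """Return the subset of `ports` that accept a TCP connection on `host`.

    The software equivalent of a sounding: a controlled, authorized
    attempt to find out which doors are open.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

# 192.0.2.x is a reserved documentation address: replace it with a host
# you are explicitly authorized to test.
print(probe("192.0.2.10", [22, 80, 443, 8080]))
```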

The last tool, and the most difficult to implement, is alternative analysis. The aim is to carry out a reflection parallel to the classic decision-making process by adopting a different, or even opposite, point of view. This reflection seeks to neutralize conceptual rigidities by resisting the pressure of the group, which is why the red team has to operate outside the group it supports. The work of the red team can highlight the biases of the group's decision, but it can also confirm that decision by determining that the alternative solutions are not viable.


Thus, in 2007, the Israeli intelligence services transmitted to the United States photos of what they considered to be a nuclear reactor of North Korean origin in Syria, at the locality of Al Kibar. Israel asked for American support to bomb this site, which it considered to be of military use. The CIA (Central Intelligence Agency), which thereby discovered the existence of the site, set up a red team to assess the validity of this information. The group had to determine the nature of the site on the assumption that it was not a nuclear reactor. In the end, despite its work, the red team was unable to provide a viable alternative to the idea that it was a nuclear reactor, thereby confirming the Israeli intelligence. The US Air Force nevertheless did not bomb the site, because the red team had been able to point out that nothing corroborated the military use of the reactor, especially since the United States was at the time still scalded by its intervention in Iraq.

While the use of a red team improves the decisions taken within staffs, it is not always easy to set one up. In addition to the support of the chief to facilitate, or even impose, the existence of this cell, its operation requires particular personnel. This personnel must be impervious to the atmosphere of the staff in which it works and must also have the capacity to reason outside conventions. A UFMCS instructor stated that members of red teams had to stand apart from the functioning of the staff in which they served, going so far as to call them "misfit toys", defective toys. These out-of-the-mould individuals are generally frowned upon by the hierarchy, especially in peacetime.

The creation of a red team also poses more down-to-earth problems. By adding a step to the planning process, the red team demands more time, whereas speed of planning remains one of the keys to success. A red team also means additional personnel for headquarters already burdened by the management of information flows.

In the end, it is difficult to determine exactly when and where to set up a red team. In addition to the Israeli example (for intelligence) and the American one (for the functioning of headquarters) already cited, we can also mention the British case. Set up at the DCDC (the Development, Concepts and Doctrine Centre, the think tank of the British Ministry of Defence), the British red team has determined the prerequisites for its use and its effectiveness. First of all, the red team must have access to the highest-level leader within the unit. Next, the purpose of its employment is to improve the quality of the decision taken, not to validate it. Finally, the red team must be used upstream rather than downstream of the decision, because its mere involvement is not enough to call into question a decision already taken.

If these elements are not respected, it is easy to neutralize the value of a red team. All three of these errors were on display during the large-scale American military exercise Millennium Challenge in 2002. This exercise was intended to validate the new doctrine of the American armed forces resulting from the revolution in military affairs, as well as to study a possible invasion of Iraq. The red team, which also played the role of the enemy (the red cell) in the simulated war game, sank half of the American fleet, including its aircraft carrier, in just a few minutes. After this event, the war game's arbitration cell (the white cell) ordered the red cell to confine itself to the planned scenario, and went so far as to restrict its offensive capabilities to ensure victory for the American invasion force. The red cell thus no longer had access to the arbitration cell, which was also directing the exercise, and the purpose of the exercise was clearly to validate the new doctrine, not to amend it.

What about the French army? The concept of the red team is not yet very developed there. In addition to the misunderstanding of its role already mentioned above, there is the belief that the "French grumbling spirit"[8] will overcome the collective blindness of the general staff. It is interesting to note that this amounts to relying on a cognitive bias (the French supposedly having a more developed critical mind) to correct other cognitive biases.

Internal red teaming: quantum thinking?

Cognitive biases are not exclusive to groups. The dogmatism that often underlies them is also expressed in the individual before spreading to the group (especially if the individual is the leader or has strong charisma). There is therefore an interest in developing a kind of internal red team, what battalion commander DUBOIS calls a "pragmatic doubt" that aims at "confronting reality"[9].

In this context, it is paradoxical to note that the aim of a red team is both to reduce the uncertainty linked to decision-making and to accept this same uncertainty as an irreducible element. Developing this capacity on a personal basis, however, is a long-term task. This capacity to decide while continuing to doubt pragmatically amounts almost to "quantum" thinking[10]. This type of thinking, which allows one to hold both a thing and its opposite, is undoubtedly a form of internal red team. Yet, like Schrödinger's cat[11], both dead and alive, it is difficult to move from classical to quantum thinking. It is probably, however, a way to escape the most common cognitive biases.
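Purely as an illustrative aside, the "both at once" state described in footnote [11] has a standard notation in quantum mechanics: an equal superposition of the two outcomes,

$$|\text{cat}\rangle = \tfrac{1}{\sqrt{2}}\left(|\text{alive}\rangle + |\text{dead}\rangle\right)$$

Opening the box (the measurement) collapses this state onto a single term, just as taking the decision finally collapses the pragmatic doubt that preceded it.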

At a time when the use of artificial intelligence promises decision-support tools free of human error thanks to big data, the concept of the red team may already appear obsolete. However, quite apart from the fact that an artificial intelligence will always depend on the cognitive biases of its creator and on the data necessary for its operation, the moral and political dimension of war guarantees man's place as the final decision-maker for a long time to come. Humility, doubt and alternative analyses still have a bright future ahead of them, provided we accept those who express them. General PETRAEUS thus considered that the armed forces should "create a culture that preserves and protects the iconoclasts".

Bibliography:

Micah ZENKO, Red Team: How to Succeed by Thinking Like the Enemy, Basic Books, 2015;

University of Foreign Military and Cultural Studies, The Applied Critical Thinking Handbook, version 7.0, 2015;

Serge CAPLAIN, Penser son ennemi, modélisations de l'adversaire dans les forces armées, Études de l'IFRI, Focus stratégique, 2018;

Emmanuel DUBOIS, Le doute, vertu fondamentale du militaire, Cahier de la pensée mili-Terre, 2018;

"Biais cognitifs", Wikipedia, https://fr.wikipedia.org/wiki/Biais_cognitif, 2018.

_______________________________________________________

1] "Cognitive bias is a distortion in the cognitive processing of information.

Source WIKIPEDIA .

Land Forces Doctrine Review

[2] This saying comes down to us from the Russian General Alexander SOUVOROV (1730-1800).

[3] Agnotology, a term coined by the historian of science Robert N. Proctor in 1992, is the study of the cultural production of ignorance.

[4] An instructor at the U.S. Army's Command and General Staff College told his students that they were not there to "wash dishes, wash windows, or do counterinsurgency".

[5] Named after the company that produced it, Chemische Werke Zimmer AG.

[6] The famous Black Hawks, the helicopters that gave their name to this episode.

[7] "When I hear heels clicking, I see brains shutting down." Marshal LYAUTEY.

[8] Serge CAPLAIN, Penser son ennemi, in Études de l'IFRI, Focus stratégique.

[9] Emmanuel DUBOIS, Le doute, vertu fondamentale du militaire, in Cahier de la pensée mili-Terre.

[10] Laurent HENNINGER, lecture at the École de Guerre-Terre, September 2018.

[11] Erwin SCHRÖDINGER is one of the pillars of quantum physics; he imagined a thought experiment to illustrate it: a cat is enclosed in an opaque box with a device that has a 50% chance of killing it. In classical physics the cat is either alive or dead; in quantum physics, as long as no one checks the inside of the box, the cat is both dead and alive at the same time.

Title: IFKHA MISTABRA or the doctrine of contradiction
Author(s): Chef d'escadron Jean-Baptiste FARGEREL