
The fog of war. How to anticipate the inflection point in nonlinear events?

Anonymous

Nassim Taleb (2007), in the chapter entitled "How to Learn from the Turkey," works on the problem of induction, "the mother of all life's problems": How can we logically go from specific cases to general conclusions? How do we know what we know? How do we know that what we have observed in given objects and events is enough to allow us to understand their other properties? "All knowledge that has been reached through observation has certain traps built into it." To illustrate this, Taleb takes up the everyday life of Russell's turkey: the turkey is fed every day, so each morning the turkey strengthens its belief that, as a general rule, it will be fed every day. The evening before Thanksgiving, something unexpected happens to the turkey. What can the turkey learn about what will happen tomorrow from the events of yesterday?
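The turkey's inductive trap can be put in numbers. A minimal sketch, assuming a simple frequency estimate (Laplace's rule of succession) as the turkey's "model" — the specific estimator and the 1,000-day horizon are illustrative assumptions, not Taleb's own calculation:

```python
# Russell's turkey as a naive inductive forecaster:
# after n consecutive days of being fed, it estimates the probability
# of "fed tomorrow" by Laplace's rule of succession, (s + 1) / (n + 2).

def confidence_after(days_fed: int) -> float:
    """Estimated probability of being fed tomorrow after days_fed
    uninterrupted feedings (Laplace's rule of succession)."""
    return (days_fed + 1) / (days_fed + 2)

if __name__ == "__main__":
    for day in (1, 10, 100, 1000):
        print(f"day {day:4}: P(fed tomorrow) ~ {confidence_after(day):.4f}")
    # The turkey's confidence is at its maximum on the eve of Thanksgiving --
    # exactly when the model is about to fail.
```

The point of the sketch is that the estimate rises monotonically with every uneventful day, so the inductive model is most confident at the precise moment it collapses.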

In everyday life, we tend to think in Cartesian terms: we deal with things as if the world were a simple, clearly and distinctly knowable system, and we extrapolate the past into the future, making inductions that promise a future replicating what has already happened. That is, we devise a model; once the system that guarantees our security has been devised, we respond to it with the inflexibility of a bureaucrat. After being fed by the farmer every morning of the year, the risk of being eaten is not part of our horizon of thought. Then the unexpected happens. And we go from eating to being eaten.

How to prepare for the contingencies of the system? How to alleviate the insufficiency of knowledge in situations where small incremental changes trigger outsized effects? What Taleb points out in his chapter is that, in complex systems, induction fails with inordinate spectacularity; that is, he points to the exponential increase in bad decisions that inductive thinking produces in a system governed by abrupt changes, unpredictability, improbable results and extremes. The problem is how to project in a system where the future does not follow linearly from the past, how to forecast on a horizon where inflection points make the line lose its sanity.

Taleb (himself a renowned stock-market advisor), to illustrate such complex systems, cites the stock market, terrorist acts, and failures in the electrical distribution network; without a doubt, the realities of virtuality are the complex system by which we are permanently traversed. He also mentions "the unpredictability of war." All are cases in which knowledge does not grow from the accumulation of information; moreover, the maxim that the greater the amount of information, the lower the capacity for comprehension seems to apply. Being informed does not guarantee being in a position to recognize asymmetries; often, on the contrary, being over-saturated with information trains one to find only symmetries, disabling the reading of unexpected events.

Industrialization, which increased firepower and the operational capabilities of armies, in addition to multiplying their logistical capabilities, contributed to the paradigm shift in the conception of war. The Napoleonic experience generalized the idea that it was entire nations (and no longer monarchs and their small armies) that fought one another. The emergence of vast battlefields and huge armies forces one to think through the complexities of the war system. The case of war may be the paradigmatic system for managing the unpredictable, and it will help us think outside of linear logics. An analysis of some concepts from the theory of war can reveal the impasses to which the inflexibility of a model guaranteeing guidelines of action leads.

Clausewitz, at the beginning of the 19th century, theorized war in Europe against the pretension of a "science of war." When it comes to "systematizing" the material data of war, a "science" of war can proceed in only two ways: either reducing superiority in war to simple numerical data (basing war on statistics), or geometrizing one of the factors in play (the analytical privilege of one of its parts). Either way, Clausewitz concludes, the result of such a procedure will be worthless. In other words, when dealing with the material data of nonlinear events, one-sided perspectives are useless. The key is to think war; in complex systems, the key is to think outside the model. Clausewitz maintains that reflection on war had until then failed because it tried to establish a model in a domain where events escape any finished model. The model war (the ideal) never reaches the real war (or the virtual one, we could add in our current situation). This distance between the ideal and the real (or virtual) is the point of Clausewitz's thought: to think war is to think how war betrays its model. To think any nonlinear event is to force thought out of its mold. The world of action is structured, in its logical forms, on laws; and the laws reveal themselves inapplicable to the variability of an unpredictable inflection. That is to say, any formalization that implies repetition becomes a danger: the routine of being fed every day at the same time can lead to diligently obeying the master's hand, but in no case does it lead to thinking about the dangerous horizon that lies ahead. Thinking the unpredictable forces thought to move away from universals and to exercise itself on the particularity that inflects the system. In war, Clausewitz points out (much more than in anything else), things happen differently from what was foreseen; they look different from close up than they did from afar. Because war is not an inert matter but an event that "lives and reacts"; that is, an event analogous to an organism: a complex system, where a single one of its parts can make the whole collapse.

Technology allows action at a distance, in a situation in which a specific decision can produce effects at distant sites, and simultaneously, throughout the network. Until the Napoleonic wars, "the general or army leader could take in at a glance the whole situation of his troops and those of the enemy on the battlefield, simply by setting up his command post on some nearby elevation and counting on the help of a spyglass" (Pertusio, 2005: 117). After the Napoleonic wars, however, the situation must be understood as the complexity of a system: "In the First World War, General Joffre and his successors directed operations from Chantilly; the German General Staff from Luxembourg and Spa. In World War II, Hitler led the Battle of Russia from Vinnitza; Eisenhower controlled the Battle of Normandy from England" (Masson, 1990: 221). In war, most of the time one acts from a distance and blind, without the clarity and distinction of a simple system; constantly facing the unknown, the unforeseen, the uncertain; knowing that adversaries (and one's own side) can use the same networks to "blur the image of the other that they want to analyze" (Frasch, 2005: 64). What happens in war is similar to what happens in the stock market, in the processes developed in virtual networks, in companies, and in any other complex system.

This situation places the commander in charge of the decision in what the analysis of war recognizes under the metaphor of the fog of war. Faced with the impossibility of a "Cartesian" vision, the commander must be convinced in advance that decisive resolutions must be taken under uncertainty. Two crucial instances come into play here: risk and timing.

In principle, whoever is in charge of directing nonlinear processes must be certain of the impossibility of controlling the totality of the situation, and even of the nonexistence of the situation as a totality, starting from two premises: the condition of limited knowledge of the situation, and the recognition that the whole is neither prior to the parts nor the result of their sum (because one part of the situation has the power to build or destroy what is understood as the whole). These characteristics tend to show the situation as a series of dispersed, independent and dissociated mosaics, to be assembled with unavailable information, tracing an image that does not exist beforehand. That is, hazy vision is an inherent condition of decision in complex systems, and it corresponds to a cognitive blur that understands that the image will never have the clarity of a finished figure, nor can it be recovered by joining its pieces, but must be created, exercising a thought that deliberates on the contingent. Anticipating the future in nonlinear systems requires capacities closer to the virtue of prudence than to analytical knowledge; accuracy of estimation, the capacity for judgment and the vivacity to see through the mental fog are not matters that can be calculated analytically, respond to a pre-drawn ideal, or be resolved by accumulating information. It is necessary to dare to think outside of known models and incorporated beliefs, since all knowledge that has been reached through observation has certain traps built into it. Not to suspect as much is foolishness.

References.

  • Clausewitz, C. von (1976). On War. Princeton, NJ: Princeton University Press.
  • Frasch, C. A. (2005). The Decision in Chaos. Buenos Aires: Institute of Naval Publications, Technological Institute of Buenos Aires.
  • Masson, P. (1990). Of the Sea and Its Strategy. Buenos Aires: Institute of Naval Publications.
  • Pertusio, R. (2005). Operational Strategy. Buenos Aires: Institute of Naval Publications.
  • Taleb, N. N. (2007). The Black Swan. Barcelona: Paidós Ibérica.

Perhaps the complete unpredictability of virtual networks is the cause of the difficulty of standardizing them.
