Yesterday I accidentally published this alongside my scenes post. Sorry about that. More non-DeLanda posts next week.
The last post on this topic attempted to define Deleuze’s three ontological dimensions by following DeLanda’s examples of the logic behind them. The three ontological levels are:
- Apparent actual things with extensive properties (e.g. “metric” measurements)
- Morphogenetic processes with intensive properties (e.g. temperature, pressure, other variables with critical thresholds that can change apparent properties). The intensive science is complexity theory.
- Virtual structures of the morphogenetic processes (singularities defining the tendencies of multiplicities, down in the level of the obscure/continuous/indiscernible). The virtual philosophy is Deleuze’s treatment of this ontological level.
Deleuze’s is a process ontology, not at all anthropocentric, and not tied to the mind- for Deleuze, reality is out there. And although it can be pragmatically acted upon, the virtual might be even more complex and nonlinear than simple inquiry reveals, because we tend to linearize and focus on close-to-equilibrium phenomena. The virtual itself is also not an eternal, transcendental thing- it is immanent, a part of the world, affected by events. More on all of this below.
Some of these ideas were alluded to in A Thousand Years of Nonlinear History, another book by De Landa that I “read” last year (but generally didn’t understand all that well). It is about how intensive processes actualize various institutional forms. At some point I will have to return to that particular mountain.
This post is on the second chapter of ISVP, The Actualization of the Virtual in Space.
DeLanda (this is how I’ll write his name from now on) characterizes intensive thinking in terms of populations and rates of change, which replace “the two main features of essentialist thinking: fixed types and ideal norms”.
A species is not a higher category than an individual. Species clearly live longer and take up more space, but a species is an entity of the same kind as an individual organism (besides, the “individual organism” is itself a collection, an assemblage of foreign organisms and distinct systems).
Evolutionary biologist Ernst Mayr:
The ultimate conclusions of the population thinker and the typologist are precisely the opposite. For the typologist the type (eidos) is real and the variation an illusion, while for the populationist, the type (the average) is an abstraction and only the variation is real. No two ways of looking at nature could be more different.
To a “typologist”, homogeneity is natural (adherence to some pre-existing form) and heterogeneity is an “accident of history”. The population thinker, by contrast, sees differences as the engine of the evolutionary process- forms of organisms evolve within collectivities (or reproductive communities). So a population of the same species may be split geographically, and one group may have less access to (say) sunlight and grow smaller, while the other moves into a sunnier area and grows larger: which of them is the more perfect form of the species? It’s a bad question, a broken idea. One genotype (like a multiplicity) may express many phenotypes, within limits, due to other pressures. The “relations between rates of change” can replace the idea of “degrees of perfection.”
We’re given the definition of the deme, a level of abstraction broader than the individual organism and narrower than the species, to demonstrate that the deme is a dynamical system that can be modeled in the language of attractors and bifurcations.
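To make that dynamical-systems framing concrete, here is a minimal sketch (my own toy model, not from the book) of a deme as a population with logistic growth plus a constant mortality/harvest term; all parameter names are illustrative. The system has a stable equilibrium (an attractor) that simply vanishes past a critical threshold- a saddle-node bifurcation:

```python
# Toy deme dynamics (illustrative, not DeLanda's):
#   dN/dt = r*N*(1 - N/K) - h
# logistic growth (rate r, carrying capacity K) with constant mortality h.
import math

def equilibria(r, K, h):
    """Fixed points of dN/dt = r*N*(1 - N/K) - h, i.e. roots of
    -(r/K)*N^2 + r*N - h = 0. Returns [] past the bifurcation."""
    disc = r * r - 4 * (r / K) * h
    if disc < 0:
        return []              # no fixed points: the attractor has vanished
    root = math.sqrt(disc)
    a = 2 * r / K
    return sorted([(r - root) / a, (r + root) / a])  # [repellor, attractor]

r, K = 1.0, 100.0
h_crit = r * K / 4                  # saddle-node bifurcation at h = rK/4
print(equilibria(r, K, h=10.0))     # below h_crit: repellor and attractor
print(equilibria(r, K, h=30.0))     # above h_crit = 25: no equilibria left
```

Below the threshold the population tends toward the attractor; crossing the critical mortality rate, the attractor and repellor collide and disappear, and the deme crashes- a qualitative change driven by a quantitative parameter.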
Matters of Difference
DeLanda also returns to the egg metaphor, attempting to argue that the comparison he made in the first chapter (which I covered) is stronger than a metaphor. The egg undergoes a spatial structuration and qualitative differentiation. “But in what sense can eggs and organisms be said to form spaces?” Here space is meant in a nonmetric, topological sense. Eggs begin as spaces defined mostly by “chemical gradients and polarities”, with early embryos having “neighborhoods with fuzzy borders and ill-defined qualities”. This topological space will eventually develop a “rigidly metric anatomical structure as tissues, organs, and organ systems become better defined and relatively fixed in form.” Embryonic cells begin as pluripotent, capable of becoming many kinds of cells.
Multiplicities can form assemblages, or special configurations with other entities that are recognized as different at whatever scale. Multiplicities have an indefinite number of ‘capacities’ (or ‘affordances’, but I won’t use that word here) to interact with other multiplicities. Capacities are symmetric- things affect and can be affected without the need for homogenization. Recall my opener from the prequel post: symmetry was relative to some transformation, and the same is true of all capacities. A capacity is relational, and is not a property of any of the component elements of an assemblage. Seen this way, the properties of assemblages can be called intensive. Similarly, the properties of homogeneous components, linked together, can be called extensive.
The enlarged meaning of ‘intensive’ is related to the standard definition in the crucial role played by difference. Much as a thermodynamic intensive process is characterized by the productive role which differences play in the driving of fluxes, so in the enlarged sense a process is intensive if it relates difference to difference. Moreover, as the example of assembly processes based on adaptive components showed, the flexible links which these components afford one another allow not only the meshing of differences, but also endow the process with the capacity of divergent evolution, that is, the capacity to further differentiate differences.
Classical thermodynamics focuses on final equilibrium states. Deleuze (or DeLanda, rather) argues that far-from-equilibrium systems reveal the elusive virtual, which can be hidden by an objective delusion when its attractors are not all actualized, so that it appears to the observer as a simpler virtual system than it truly is (a “linearization” or a “concealment”).
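As a toy illustration of that concealment (again mine, not DeLanda’s): a bistable system has two attractors, but an observer who only samples near-equilibrium trajectories from one basin sees what looks like a simpler, single-attractor system- the other attractor stays unactualized and invisible:

```python
# Bistable toy system dx/dt = x - x**3, with attractors at x = -1 and x = +1.
# Sampling only one basin of attraction "conceals" the other attractor.
def settle(x0, dt=0.01, steps=5000):
    """Integrate dx/dt = x - x**3 by forward Euler until it settles."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x**3)
    return round(x, 6)

# Near-equilibrium observations from one basin: every trajectory lands on +1,
# so the system looks like it has a single equilibrium.
print([settle(x0) for x0 in (0.2, 0.5, 1.5)])
# A perturbation into the other basin actualizes the hidden attractor at -1.
print(settle(-0.2))
```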
“Another New Kind of Science”
Here is an essay by DeLanda on intensive thinking and genetic algorithms.
Back to the book- DeLanda explores how one would investigate biological sciences from the Deleuzian perspective.
[…] unlike the linear and equilibrium approach to science which concentrates on the final product, or at best on the process of actualization but always in the direction of the final product, philosophy should move in the opposite direction: from qualities and extensities, to the intensive processes which produce them, and from there to the virtual.
Let me give a concrete example of what it would mean to return to the interior of a body in the process of being constituted. Biological categories, particularly those above species, tend to be created by observing similarities (or technically, homologies) among the anatomical parts of fully formed organisms. To the extent that the process which generates these organisms is ignored these static classifications conceal the virtual. But the development of a nonlinear, non-equilibrium approach to embryology has revealed a different, more dynamic way of creating classifications. A good example is provided by a new approach to the study of the tetrapod limb, a structure which can take many divergent forms, ranging from the bird wing, to the single digit limb in the horse, to the human hand and its opposed thumb. It is very hard to define this structure in terms of the common properties of all the adult forms, that is, by concentrating on homologies at the level of the final product. But focusing instead on the embryological processes that produce this structure allows the creation of a more satisfactory classification. As one author puts it, this new classificatory approach ‘sees limb homology as emerging from a common process (asymmetric branching and segmenting), rather than as a precisely repeated archetypal pattern’.
Returning to the interior of the tetrapod limb as it is being constituted would mean to reveal how one and the same ‘virtual limb’ is unfolded through different intensive sequences, some blocking the occurrence of particular bifurcations (those leading to the branching out of digits, for example), some enabling a full series to occur, resulting in very different final products. This step in the method, however, can only constitute a beginning. The reason is that it still relies on the notion of similarity or homology, even if this now characterizes processes as opposed to products. A second step needs to be added to explain the source of these process homologies. Or to put this differently, once we have revealed the intensive process behind a product we still need to continue our ascent towards the virtual structures that can only be glimpsed in that process but which explain its regularities. Before engaging in a technical discussion of this second step I would like to sketch it in outline by returning to the metaphor which opened this chapter: a topological space which differentiates and divides its continuity as it becomes progressively more rigidly metric following a cascade of symmetry-breaking events.
Extensive structures would constitute the counterpart of the bottom level, while intensive processes would be the counterpart of the intermediate levels, each one representing a geometry which is not fully metric but which can, in fact, be metricized. The top level, an ideally continuous and relatively undifferentiated space, would be the counterpart of the virtual. I use terms like ‘top’ and ‘bottom’ here informally, with no suggestion that these spaces actually form a hierarchical structure. A better image here would be a nested set of spaces, with the cascade acting to unfold spaces which are embedded into one another. Another important qualification is that each one of the spaces that comprises this nested set is classified not by its extensities or its qualities, but by its affects, that is, by its invariants under a transformation (or group of transformations). In other words, what matters about each space is its way of being affected (or not affected) by specific operations, themselves characterized by their capacity to affect (to translate, rotate, project, bend, fold, stretch). Without this caveat, we could run the danger of circularity, since the extensive properties of the bottom level would be used to define the other levels as well.
According to DeLanda, this metaphor gives a direction for a theory of the virtual: a virtual continuum Deleuze calls the plane of consistency, a “space of spaces”, with each component space “having the capacity of progressive differentiation”. The remainder of the chapter is a rigorous philosophical transformation intended to detach “these concepts [of the virtual]” from their [frankly already pretty abstract] mathematical counterparts.
- By “extending each singularity into an infinite series, and defining these series without the use of metric or quantitative concepts, multiplicities can become capable of forming a heterogeneous continuum.”
- The “Quasi-cause” is an operator with a “pure capacity to affect” (not generate). More on this in the next chapter, apparently. “Roughly, the task which the quasi-causal operator must accomplish is to create among the infinite series springing from each singularity ‘resonances or echoes’, that is, the most ethereal or least corporeal of relations.” This concept is defined via abstract communication theory, on how communication channels open (the mere existence of a signal tells you something about the likelihood of an event occurring). “Information transmission occurs with every physical process.”
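The information-theoretic point being borrowed there is standard Shannon theory: the information a signal carries is the “surprisal” of the event it reports, −log₂ p. A quick sketch of that much (the application to the quasi-cause is DeLanda’s, not mine):

```python
# Surprisal: the information (in bits) carried by a signal reporting an
# event of probability p. Rare events carry more information than likely ones.
import math

def surprisal_bits(p):
    """Self-information -log2(p) of an event with probability p (0 < p <= 1)."""
    return -math.log2(p)

print(surprisal_bits(0.5))    # a fair coin flip: 1.0 bit
print(surprisal_bits(0.125))  # a rarer event: 3.0 bits
print(surprisal_bits(0.99))   # a near-certain event tells you almost nothing
```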
This part of the chapter is a little painful/obtuse. It’s laudable, though, that Deleuze devoted so much energy to stamping out every dim glow of essentialism he thought he saw.
Unlike the a priori grasp of essences in human thought postulated by those who believe in such entities, there would be an empiricism of the virtual. The concepts of virtual multiplicity, quasi-causal operator and plane of consistency would be, in this sense, concrete empirico-ideal notions, not abstract categories.
“In the vicinity of the bifurcation the capacity to transmit information is maximized.” This idea leads to the hypothesis that the “specialized hardware which living organisms use to process information may have required that evolutionary forces kept early organisms poised at the edge of a phase transition, or what amounts to the same thing, away from any stable attractor”. From a “pioneer in this field of research”, Christopher Langton:
Living systems are perhaps best characterized as systems that dynamically avoid attractors . . . Once such systems emerged near a critical transition, evolution seems to have discovered the natural information processing capacity inherent in these near-critical dynamics, and to have taken advantage of it to further the ability of such systems to maintain themselves on essentially open-ended transients . . . There is ample evidence in living cells to support an intimate connection between phase transitions and life. Many of the processes and structures found in living cells are being maintained at or near phase transitions. Examples include the lipid membrane, which is kept in the vicinity of a sol-gel transition; the cytoskeleton, in which the ends of microtubules are held at the point between growth and dissolution; and the naturation and de-naturation (zipping and unzipping) of the complementary strands of DNA.
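One way to see why being poised near a transition maximizes responsiveness (a toy of my own, far simpler than Langton’s cellular-automata work): for the overdamped linear system dx/dt = −a·x + signal, the steady-state response is signal/a, which diverges as the control parameter a approaches its critical value a = 0. A system held close to the bifurcation is maximally sensitive to, and so transmits the most about, incoming signals:

```python
# Susceptibility of dx/dt = -a*x + signal near its bifurcation at a = 0.
def steady_response(a, signal=1.0):
    """Steady state of dx/dt = -a*x + signal (valid for a > 0): signal/a."""
    return signal / a

# The same unit signal produces a larger and larger response as the
# control parameter a is tuned toward the critical point a = 0.
for a in (1.0, 0.5, 0.25, 0.125):
    print(a, steady_response(a))
```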
In short, this second chapter meant to detail how intensive processes might generate actual forms, and how these individuation processes are immanent in the physical world- how the actual produces the virtual (“concrete mechanisms of immanence”). To complete his replacement of the essence, the third chapter takes up the actualization of the virtual in time.