These are notes on Manuel De Landa’s Intensive Science and Virtual Philosophy, which is a refactoring of Gilles Deleuze’s ontology (specifically, it’s intended as an introduction to Deleuzian thought for analytical philosophers or scientifically minded readers unfamiliar with Deleuze and his very continental methodologies: this dude has two very different, very important concepts called “differentiation” and “differenciation”. That’s what we’re up against.).

Also, as a warning: I’m certainly doing more violence to De Landa’s work here than De Landa himself fears he may do in warping and repackaging Deleuze’s ontology for “outsiders”. I have no experience with Deleuze outside of this book, so this is a juvenile attempt to share what I’ve learned. I will happily edit based on feedback from more knowledgeable readers.

I have only read the chapter I’m summarizing, so far. It seems reasonably self-contained. I liked some of the ideas although some implications are above my pay-grade. This first chapter (of four) is called The Mathematics of the Virtual. I haven’t even begun the second chapter yet so the sequel posts may not come quickly.

Transformation I: Symmetry Breaking

There are no essences, only processes. Apparent “essential properties” (e.g. identities, similarities, oppositions) are results of processes. Species are not classified by their apparent qualities or their inner essences, but by their common descent and the intensive process of evolution by natural selection.

Here’s a more precise example from the book:

Imagine a set consisting of rotations by 90 degrees: {0, 90, 180, 270}. A “group” is a set of entities (with special properties) and a rule of combination for those entities. “Closure” is one of those properties, wherein if any two entities in the set are combined, the result is also a member of the set. This is true of our set above (as long as we treat 360 as 0, i.e. addition modulo 360).
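A quick sketch in Python (my own illustration, not from the book) of the closure property for this rotation group:

```python
# The rotation group {0, 90, 180, 270}, combined by addition modulo 360
# (so that 360 wraps back around to 0).
rotations = {0, 90, 180, 270}

def combine(a, b):
    """Combine two rotations into one."""
    return (a + b) % 360

# Closure: combining any two members always yields another member.
assert all(combine(a, b) in rotations for a in rotations for b in rotations)

# 76 degrees is not in the set, and combining with it breaks closure:
assert combine(90, 76) not in rotations
```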

We can classify geometric figures by their invariants: if I perform any of the above group’s rotations on a colorless cube, an observer who never sees the transformation doesn’t know that a change occurred. We can say then that under this rotation transformation, the cube is invariant. The cube wouldn’t be invariant under a 76-degree rotation at all. A sphere would, though. We can say that a sphere has more symmetry than the cube relative to the rotation transformation.

Traditionally, we have classified shapes by their static properties, but just now we’ve demonstrated how we can classify figures by how they respond to specific processes. The property of symmetry is relative to the particular transformation(s) performed, not intrinsic to the figure itself.

The next step De Landa wants us to take is to “envision a process which converts one of the entities into the other by losing or gaining symmetry”.  By losing invariance to some transformations, a sphere can become a cube. We’ll call this process a symmetry-breaking transition. 

Phase transitions are examples of a form of symmetry-breaking in a physical process. There are critical points at which water changes state from solid to liquid to gas. In a pristine and uniform state, gas has “invariant properties under all translations, rotations, and reflections”. Solid crystals of ice, even ideally undisturbed and uniform, would not be invariant to as many transformations.

So, now we’ve very lightly sketched the idea of a process-centered understanding. Deleuze uses a lot of topological/dynamical systems terminology, which is confusing enough stuff without going totally continental with it. He replaces the concept of essence with the multiplicity. 


Multiplicities I

The tl;dr here: “A multiplicity is a nested set of vector fields related to each other by symmetry-breaking bifurcations, together with the distributions of attractors which define each of its embedded levels”. I had trouble making sense of this when I started reading, but I was eased into this definition.

A Deleuzian multiplicity takes as its first defining feature these two traits of a manifold: its variable number of dimensions and, more importantly, the absence of a supplementary (higher) dimension.


The resources in this case come from the theory of dynamical systems […] [I]n this theory manifolds are connected to material reality by their use as models of physical processes. When one attempts to model the dynamical behaviour of a particular physical object (say, the dynamical behaviour of a pendulum or a bicycle, to stick to relatively simple cases) the first step is to determine the number of relevant ways in which such an object can change (these are known as an object’s degrees of freedom), and then to relate those changes to one another using the differential calculus. A pendulum, for instance, can change only in its position and momentum, so it has two degrees of freedom. (A pendulum can, of course, be melted at high temperatures, or be exploded by dynamite. These are, indeed, other ways in which this object can change, they simply are not relevant ways from the point of view of dynamics.) A bicycle, if we consider all its moving parts (handlebars, front wheels, crank-chain-rear-wheel assembly and the two pedals) has ten degrees of freedom (each of the five parts can change in both position and momentum).

Next, one maps each degree of freedom into one of the dimensions of a manifold. A pendulum’s space of possibilities will need a two-dimensional plane, but the bicycle will involve a ten-dimensional space. After this mapping operation, the state of the object at any given instant of time becomes a single point in the manifold, which is now called a state space. In addition, we can capture in this model an object’s changes of state if we allow the representative point to move in this abstract space, one tick of the clock at a time, describing a curve or trajectory. A physicist can then study the changing behaviour of an object by studying the behaviour of these representative trajectories. It is important to notice that even though my example involves two objects, what their state space captures is not their static properties but the way these properties change, that is, it captures a process. As with any model, there is a trade-off here: we exchange the complexity of the object’s changes of state for the complexity of the modelling space. In other words, an object’s instantaneous state, no matter how complex, becomes a single point, a great simplification, but the space in which the object’s state is embedded becomes more complex (e.g. the three-dimensional space of the bicycle becomes a ten-dimensional state space).

Besides the great simplification achieved by modelling complex dynamical processes as trajectories in a space of possible states, there is the added advantage that mathematicians can bring new resources to bear to the study and solution of the physical problems involved. In particular, topological resources may be used to analyse certain features of these spaces, features which determine recurrent or typical behaviour common to many different models, and by extension, common to many physical processes.

So now, for example, you can explore the tendencies of any two models with N degrees of freedom, even if the models are based on two very different “actual” objects.
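Here is a minimal numerical sketch (my own toy code, not from the book) of the pendulum example: two degrees of freedom, so each instantaneous state becomes a single point in a two-dimensional state space, and a history of states traces a curve there.

```python
import math

def pendulum_trajectory(theta0, omega0, g=9.8, length=1.0, dt=0.001, steps=5000):
    """Trace a curve through the pendulum's 2-D state space (angle, angular velocity)."""
    theta, omega = theta0, omega0
    trajectory = [(theta, omega)]
    for _ in range(steps):
        # One tick of the clock moves the representative point.
        omega += -(g / length) * math.sin(theta) * dt
        theta += omega * dt
        trajectory.append((theta, omega))
    return trajectory

# One possible history of the pendulum = one trajectory in state space.
curve = pendulum_trajectory(theta0=0.5, omega0=0.0)
```

Each entry of `curve` is a single point standing in for the pendulum’s entire instantaneous state; the bicycle’s equivalent would live in a ten-dimensional space instead.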

A singularity is a particular topological feature that has a large influence on the behavior of a process’s trajectory through the space of possible states.

Singularities may influence behaviour by acting as attractors for the trajectories [of processes through state space]. What this means is that a large number of different trajectories, starting their evolution at very different places in the manifold, may end up in exactly the same final state (the attractor), as long as all of them begin somewhere within the ‘sphere of influence’ of the attractor (the basin of attraction). Given that, in this sense, different trajectories may be attracted to the same final state, singularities are said to represent the inherent or intrinsic long-term tendencies of a system, the states which the system will spontaneously tend to adopt in the long run as long as it is not constrained by other forces. Some singularities are topological points, so the final state they define as a destiny for the trajectories is a steady state. Beside these, Poincare also found that certain closed loops acted as attractors and called them ‘limit cycles’. The final state which trajectories attracted to a limit cycle (or periodic attractor) are bound to adopt is an oscillatory state. But whether we are dealing with steady-state, periodic or other attractors what matters is that they are recurrent topological features, which means that different sets of equations, representing quite different physical systems, may possess a similar distribution of attractors and hence, similar long-term behaviour.
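A toy illustration of a point attractor and its basin (my own sketch, not from the book): in the one-dimensional system dx/dt = -x, the singularity sits at x = 0, and trajectories beginning at very different places all end up in the same final state.

```python
def evolve(x0, dt=0.01, steps=2000):
    """Follow a trajectory of dx/dt = -x from a starting state x0."""
    x = x0
    for _ in range(steps):
        x += -x * dt  # the instantaneous tendency always points toward x = 0
    return x

# Very different starting points in the basin of attraction, same destiny:
finals = [evolve(x0) for x0 in (-50.0, -1.0, 0.3, 42.0)]
assert all(abs(x) < 1e-6 for x in finals)
```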

Mechanism independence (the divorce from material details) is what makes singularities the “perfect candidates to replace essences”, as long as we’re careful not to fall into the trap of making singularities a synonym for essences. A major difference is that while essences are clear and distinct, multiplicities are obscure: they are defined through progressive differentiation. An egg does not contain a distinct instruction set to create a chicken, but instead a complex of differing biochemical concentrations that react to each other and the environment to progressively become a chicken through a “complex cascade of symmetry-breaking phase transitions”.


Transformations II: Progressive Differentiation

One system tendency (again, a singularity) might break symmetry and be converted into another singularity. These transitions are called bifurcations “and may be studied by adding to a particular state space one or more ‘control knobs’ (technically, control parameters) which determine the strength of external shocks or perturbations to which a system being modelled may be subject. These control parameters tend to display critical values, thresholds of intensity at which a particular bifurcation takes place breaking the prior symmetry of the system.”

A state space structured by one point attractor, for example, may bifurcate into another with two such attractors, or a point attractor may bifurcate into a periodic one, losing some of its original symmetry. Much as attractors come in recurrent forms, so bifurcations may define recurrent sequences of such forms. There is a sequence, for instance, that begins with a point attractor which, at a critical value of a control parameter, becomes unstable and bifurcates into a periodic attractor. This cyclic singularity, in turn, can become unstable at another critical value and undergo a sequence of instabilities (several period-doubling bifurcations) which transform it into a chaotic attractor.
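The classic discrete-time illustration of this sequence is the logistic map x → rx(1 − x) (my own example; the passage above concerns continuous systems, but the same recurrent cascade of bifurcations shows up here as the control parameter r is turned up):

```python
def long_term_states(r, x0=0.2, transient=2000, sample=64):
    """Iterate the logistic map past its transient, then sample the attractor."""
    x = x0
    for _ in range(transient):      # let the trajectory settle onto its attractor
        x = r * x * (1 - x)
    seen = set()
    for _ in range(sample):         # record the states the system keeps revisiting
        x = r * x * (1 - x)
        seen.add(round(x, 6))
    return seen

assert len(long_term_states(2.8)) == 1    # point attractor: one steady state
assert len(long_term_states(3.2)) == 2    # periodic attractor: a two-state cycle
assert len(long_term_states(3.9)) > 10    # chaotic attractor: no simple period
```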

Physical process example: heating water. At low temperatures, you can observe simple, featureless thermal conduction. Past a critical temperature, the legible rolls of water appear: thermal convection. Past a still higher critical value for temperature: complicated patterns of turbulence. But this sequence depends on mechanisms we recognize; De Landa means to convince us that the same sequence exists in completely different processes.

From biologist Brian Goodwin:

The point of the description is not to suggest that morphogenetic patterns originate from the hydrodynamic properties of living organisms . . . What I want to emphasize is simply that many pattern-generating processes share with developing organisms the characteristic that spatial detail unfolds progressively simply as a result of the laws of the process. In the hydrodynamic example we see how an initially smooth fluid flow past a barrier goes through a symmetry-breaking event to give a spatially periodic pattern, followed by the elaboration of local nonlinear detail which develops out of the periodicity. Embryonic development follows a similar qualitative course: initially smooth primary axes, themselves the result of spatial bifurcation from a uniform state, bifurcate to spatially periodic patterns such as segments [in an insect body], within which finer detail develops… through a progressive expression of nonlinearities and successive bifurcations… The role of gene products in such an unfolding is to stabilize a particular morphogenetic pathway by facilitating a sequence of pattern transitions, resulting in a particular morphology

Unlike the clear, general essence, the multiplicity is typically “divergent”. Different realizations of the same multiplicity may not resemble each other when they are actualized. On top of that, multiplicities beget processes, not final forms, so two products of the same multiplicity may look very dissimilar, “like the spherical soap bubble and the cubic salt crystal which not only do not resemble one another, but bear no similarity to the topological point guiding their production” (both result from minimization processes of a kind: minimizing surface tension for the bubble and minimizing bonding energy for the salt crystal, and both are guided by a single point attractor representing a point of minimal energy, a single topological form. More on topology shortly.).

So, again, while essences are clear and distinct, multiplicities are not. They are obscure. And they are not distinct, either, but “meshed together into a continuum.”

This further blurs the identity of multiplicities, creating zones of indiscernibility where they blend into each other, forming a continuous immanent space very different from a reservoir of eternal [legible, rational] archetypes.

Here is where De Landa would throw Deleuze’s differentiation/differenciation wrench into the mix. DifferenTiation is the “progressive unfolding of a multiplicity through broken symmetries”. DifferenCiation is “the progressive specification of the continuous space formed by multiplicities as it gives rise to our world of discontinuous spatial structures.” I will partially unpack these below, after another foray into mathematics.

Unlike a transcendent heaven which exists as a separate dimension from reality, Deleuze asks us to imagine a continuum of multiplicities which differenciates itself into our familiar three-dimensional space as well as its spatially structured contents.

The concepts of length and area, for example, are metric concepts: they are fixed, measurable relations of distance between points in Euclidean space. Topological space is non-metric, and can be stretched and warped without changing the nature of relations between points. “Being nearby” in topological space might not mean “being within so-many inches” but might mean something like “requiring less than two other connected nodes to reach.”
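A tiny sketch of mine (with a made-up network) of that non-metric sense of “nearby”: in a graph, proximity is counted in connections rather than inches, so stretching or warping the drawing changes nothing.

```python
from collections import deque

# Hypothetical network: only the pattern of connections matters, not distances.
edges = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}

def hops(graph, start, goal):
    """Fewest connections needed to reach goal from start (breadth-first search)."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, dist = queue.popleft()
        if node == goal:
            return dist
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, dist + 1))
    return None

assert hops(edges, "a", "c") == 2   # "nearby": one intermediate node away
assert hops(edges, "a", "d") == 3   # farther, no matter how the graph is drawn
```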

Many geometries have been invented over time. Euclid and Lobachevsky, for instance, worked in metric spaces. Gauss and Riemann developed differential geometries with the concept of the manifold built in. There are others: projective geometry, affine geometry, topology.

“Felix Klein [19th century mathematician] realized that all geometries known to him could be categorized by their invariants under groups of transformations, and that the different groups were embedded one into the other. In modern terminology this is equivalent to saying that the different geometries were related to each other by relations of broken symmetry.”

(Keep in mind the rotating cubes we started with at the top of the post).

Differential geometry is more symmetrical still, and topology is even more symmetrical and has many, many fewer classes of objects. In topology, a circle, a square and a triangle are the same figure. A coffee cup and a donut can be the same. “Metaphorically, the hierarchy ‘topological-differential-projective-affine-Euclidean’ may be seen as representing an abstract scenario for the birth of real space. As if the metric space which we inhabit and that physicists study and measure was born from a nonmetric, topological continuum as the latter differentiated and acquired structure following a series of symmetry-breaking transitions.”

Although this morphogenetic view is a metaphor in that it is purely logical, De Landa argues that it can be ontologically understood as well.  He introduces new vocabulary:

Extensive properties are “metric”, intrinsically divisible properties, including length/area/volume but also amounts of energy or entropy.

Intensive properties cannot be divided without involving a change in “kind” (one that might affect extensive properties as well). Changing the temperature of water can induce symmetry-breaking if certain thresholds are passed, as we’ve touched on before. Pressure is also an intensive property.

We now have a language for concrete physical processes where an “undifferentiated intensive space (that is, a space defined by continuous intensive properties) progressively differentiates, eventually giving rise to extensive structures (discontinuous structures with definite metric properties)”. Physicists today posit that at extremely high, birth-of-the-universe temperatures, the four fundamental forces “lose their individuality and blend into one, highly symmetric, force.”

The hypothesis is that as the universe expanded and cooled, a series of phase transitions broke the original symmetry and allowed the four forces to differentiate from one another. If we consider that, in relativity theory, gravity is what gives space its metric properties (more exactly, a gravitational field constitutes the metric structure of a four-dimensional manifold), and if we add to this that gravity itself emerges as a distinct force at a specific critical point of an intensive property (temperature), the idea of an intensive space giving birth to extensive ones through progressive differentiation becomes more than a suggestive metaphor.


Multiplicities II

Given that this ontological difference is key to the idea of a Deleuzian multiplicity, I will need to explain how state spaces are constructed. First of all, it is important to distinguish the different operators involved in this construction. As I said above, given a relation between the changes in two (or more) degrees of freedom expressed as a rate of change, one operator, differentiation, gives us the instantaneous value for such a rate, such as an instantaneous velocity (also known as a velocity vector). The other operator, integration, performs the opposite but complementary task: from the instantaneous values it reconstructs a full trajectory or series of states.


These two operators are used in a particular order to generate the structure of state space. The modelling process begins with a choice of manifold to use as a state space. Then from experimental observations of a system’s changes in time, that is, from actual series of states as observed in the laboratory, we create some trajectories to begin populating this manifold. These trajectories, in turn, serve as the raw material for the next step: we repeatedly apply the differentiation operator to the trajectories, each application generating one velocity vector and in this way we generate a velocity vector field. Finally, using the integration operator, we generate from the vector field further trajectories which can function as predictions about future observations of the system’s states. The state space filled with trajectories is called the ‘phase portrait’ of the state space. Deleuze makes a sharp ontological distinction between the trajectories as they appear in the phase portrait of a system, on one hand, and the vector field, on the other. While a particular trajectory (or integral curve) models a succession of actual states of a system in the physical world, the vector field captures the inherent tendencies of many such trajectories, and hence of many actual systems, to behave in certain ways. As mentioned above, these tendencies are represented by singularities in the vector field, and as Deleuze notes, despite the fact that the precise nature of each singular point is well defined only in the phase portrait (by the form the trajectories take in its vicinity) the existence and distribution of these singularities is already completely given in the vector (or direction) field.


From another resource:

The vector field is the real source of the regularities or propensities in the population of possible histories (33). Unlike trajectories, a vector field is not composed of individuated states, but of instantaneous values for rates of change. Individually, these instantaneous rates have in fact no reality but collectively they do exhibit topological invariants (singularities). Ontologically, these invariants of a vector field are topological accidents, points in the field which happen to be stationary; Deleuze argues that these topological accidents should be given the ontological status of an event (a perfect storm? a scientific concept for this would be stochastic resonance). A key concept in the definition of a multiplicity is that of invariant, but invariances are always relative to some transformation. In other words, whenever we speak of the invariant properties of an entity we also need to describe an operator or group of operators capable of performing rotations, translations, projections, foldings, and a variety of other transformations on that entity. So the ontological content of the virtual must also be enriched with at least one operator.


The field of vectors vs. the integral curves are “essentially two distinct mathematical realities.”
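The two operators can be caricatured numerically (a rough sketch of my own, using the decaying system dx/dt = -x): differentiation turns a series of observed states into velocity vectors, and integration rebuilds trajectories from a vector field.

```python
import math

def differentiate(states, dt):
    """From a series of observed states, estimate the velocity vectors."""
    return [(b - a) / dt for a, b in zip(states, states[1:])]

def integrate(vector_field, x0, dt, steps):
    """From a rule giving the velocity at each state, rebuild a trajectory."""
    x, trajectory = x0, [x0]
    for _ in range(steps):
        x += vector_field(x) * dt
        trajectory.append(x)
    return trajectory

# "Laboratory observations": states of a decaying system sampled over time.
dt = 0.1
observed = [math.exp(-t * dt) for t in range(50)]
velocities = differentiate(observed, dt)          # pointwise velocity vectors

# Integrating the vector field x -> -x yields a predicted trajectory that
# approaches the singularity at x = 0 without ever reaching it.
predicted = integrate(lambda x: -x, x0=1.0, dt=dt, steps=200)
assert 0 < predicted[-1] < 0.01
```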

Trajectories approach attractors asymptotically (they never actually reach them). So attractors are never actualized, even though trajectories do represent the states of objects in the world. Although they are not actual, they are “real” and have definite effects on actual entities. For example, they generate asymptotic stability. “Small shocks might dislodge a trajectory from its attractor but as long as the shock is not too large to push it out of the basin of attraction, the trajectory will naturally return to the stable state defined by the attractor.”
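A last toy sketch of mine for that asymptotic stability: knock a settled trajectory off its attractor with a small shock, and so long as it stays inside the basin, it relaxes back to the same stable state.

```python
def relax(x, dt=0.01, steps=3000):
    """Let a trajectory of dx/dt = -x settle toward its point attractor at 0."""
    for _ in range(steps):
        x += -x * dt
    return x

settled = relax(1.0)                 # trajectory sitting near the attractor
shocked = settled + 0.5              # a small shock dislodges it...
assert abs(relax(shocked)) < 1e-6    # ...and it returns to the stable state
```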

Separately from trajectory stability, there is the stability of the distribution of the attractors themselves (call this structural stability). Typically these distributions are stable and thus recur in many different physical processes. If disturbed significantly enough, they may change or bifurcate into a different distribution. “Such a bifurcation event is defined as a continuous deformation of one vector field into another topologically inequivalent one through structural instability.”

Which brings us back to the technical definition of a multiplicity that I pasted before:

A multiplicity is a nested set of vector fields related to each other by symmetry-breaking bifurcations, together with the distributions of attractors which define each of its embedded levels. This definition separates out the part of the model which carries information about the actual world (trajectories as series of possible states) from that part which is, in principle, never actualized.


So it’s clear that “actual” is not synonymous with “real”. Deleuze (or in our case, De Landa) introduces the idea of the “virtual”, which is opposed to the “actual” (but is real). Deleuze writes, “The virtual must be defined as strictly a part of the real object – as though the object had one part of itself in the virtual into which it plunged as though into an objective dimension… the reality of the virtual consists of the differential elements and relations along with the singular points which correspond to them. The reality of the virtual is structure.”

This project needs to include, besides defining multiplicities as I did above, a description of how a population of multiplicities can form a virtual continuum, that is, it needs to include a theory of virtual space [Chapter 2]. Similarly, if the term ‘virtual multiplicity’ is not to be just a new label for old timeless essences, this project must include a theory of virtual time [Chapter 3], and specify the relations which this non-actual temporality has with actual history. Finally, the relationship between virtuality and the laws of physics needs to be discussed, ideally in such a way that general laws are replaced by universal multiplicities while preserving the objective content of physical knowledge. Getting rid of laws, as well as of essences and reified categories, can then justify the introduction of the virtual as a novel dimension of reality. In other words, while introducing virtuality may seem like an inflationary ontological move, apparently burdening a realist philosophy with a complete new set of entities, when seen as a replacement for laws and essences it actually becomes deflationary, leading to an ultimately leaner ontology.


I’m still digesting it. The next chapter may or may not be my next post.