Rhizomes: Cultural Studies in Emerging Knowledge: Issue 36 (2020)

Black Hole Materialism

Christopher Neil Gamble
University of Washington

Thomas Nail
University of Denver


Abstract: The Euro-Western tradition has long considered matter to be essentially non-relational, passive and mechanical. Matter, that is, is thought to consist of elementary particles that remain internally unchanged while moving inside of, or against, an equally unchanging or fixed background of space, time, or both. Consequently, matter’s behavior has been seen as obeying—either fully or probabilistically—preexisting and invariant natural laws.

In our paper, we first take a brief tour through three major traditions of Western materialism in order to demonstrate how this basic picture has remained remarkably stable up to the present. We then argue that recent physics research and quantum gravity theorizing about black holes provide an unprecedented opportunity to revolutionize our understanding of matter by recasting it as inherently relational, indeterminate, and generative. Our aim in doing so is to show that black hole physics has enormous interdisciplinary consequences for the history, philosophy, and science of materialism.


I. The History of Materialism

Classical Mechanics. The first major Euro-Western tradition of materialism was Greek atomism. As is well known, Leucippus, Democritus, and Epicurus all taught that all things—from the biggest stars to the smallest insects or specks of dirt—are formed by the collisions, compositions, and decompositions of tiny, discrete, and indivisible “atoms” careening perpetually through a vast spatial void. Eternal and unchanging, the atoms’ only differentiating attributes were their varying shapes and sizes, which enabled them to join together into countless combinations that resulted in the full scope and diversity of the perceptible world at large. For Leucippus and Democritus, these fundamental particles moved only along unique predetermined trajectories, whereas for Epicurus they occasionally swerved spontaneously onto others. In finding reality to have a fundamentally closed, immutable nature, however, both accounts maintained the very same mechanistic conception of matter and its relationship to void or space.

For the atoms, that immutability results in a rather profound irony. Ostensibly, those constituent elements produce all of perceptible reality. Nevertheless, the full range of possible atomic compounds—and hence, of resulting sensible objects—preexists any compound’s realization and so remains just as eternally fixed and unchanging as the atoms’ own pre-given shapes and sizes. Certain combinations invariably result in lead, for example, whereas others result just as invariably in iron. Accordingly, whether they were capable of swerving or not, the atoms exerted zero creative agency over the character of their own productions. Instead, they remained essentially non-generative, non-relational vessels that “create” merely by passively realizing preexisting possibilities.

A similar situation obtains in relation to the immutable (non-)nature of what the atomists called “void.” An infinite background emptiness that persists to a greater or lesser extent in (or as) the space between atoms, void also in fact plays an integral role in constituting the sensible world. For example, in explaining lead’s greater density relative to iron, Democritus argued that the atoms of the former fit more closely together, and thus permit less void between them, than do those of the latter. As this example illustrates, both metals reliably possess their respective defining properties only on condition that void (a) lacks any positive characteristics of its own (which could differentially interact with the atoms) and (b) remains utterly unaffected by the movements and combinations of the atoms that occur in or through it.

Taken together, the atomists described reality as a closed or bounded system whose productions could be exhaustively explained in terms of specific effects following necessarily and absolutely from particular causes. In doing so, they also positioned themselves as external, objective observers of that closed system, which remained unchanged by their observations of it. From that vantage, they could deduce and discover invariant, preexisting laws that would reveal reality’s underlying causal nature to them.

In short, the atomists’ materialist account of reality entailed a mechanistic conception of matter as inherently non-generative and non-relational, a background-dependent conception of space, and the immutability of both. The importance of this materialist account is difficult to overstate, especially to the history and ongoing practice of science. As we will see, however, this concept of matter appears increasingly obsolete as the prevailing cosmology changes.

Statistical Mechanics. The second major materialist tradition emerged in the nineteenth century. Treating matter as if it moved randomly, modernist descriptions relied heavily on probability theory and statistics to predict its behavior. However, matter’s seeming randomness was in fact due to practical limitations only. Fundamental particles (molecules, atoms, genes, isotopes, and so on) were simply too small and numerous for humans to observe all at once. For Laplace, Boltzmann, and others, then, matter continued to be just as fully determined as it was for the atomists (albeit without any Epicurean spontaneity). Moreover, in adopting Newtonian notions of a fixed background of empty absolute space and universal time, modern materialism also continued to see matter as ultimately non-relational, passive, and obedient to invariant natural laws.

Quantum Mechanics. The third major materialism was quantum mechanics. In its initial formulation by Niels Bohr, Erwin Schrödinger and Werner Heisenberg, and much to the disappointment of Albert Einstein, quantum mechanics abandons a deterministic understanding of matter and finds matter instead to be inherently probabilistic. Due to the “measurement problem,” as it has tended to be understood, there is a fundamental limit on the precision with which matter can be known or predicted. As Heisenberg formulated it in his famous uncertainty principle, for example, there is an inherent limit to how precisely it is possible to know both a particle’s position and its momentum simultaneously. Beyond that limit, determinism dissolves into probability distributions.
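In its standard textbook form, the uncertainty principle bounds the product of the uncertainties in a particle’s position and momentum by the reduced Planck constant:

\[
\Delta x \, \Delta p \;\geq\; \frac{\hbar}{2}
\]

Below this bound, the question of where a particle “really” is and how fast it is “really” moving has no single determinate answer; only probability distributions remain.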

As developed subsequently in quantum field theory, moreover, particles no longer move through an empty or smooth background but are understood to be the excitations of fields that constantly jitter like violent waves with the vacuum fluctuations of so-called “virtual particles.” While those vacuum fluctuations are too small to observe directly or individually, collectively they nevertheless exert empirically measurable effects on particles that can be observed.
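The standard heuristic for these fluctuations is the energy-time form of the uncertainty relation,

\[
\Delta E \, \Delta t \;\gtrsim\; \frac{\hbar}{2},
\]

on which pairs of virtual particles carrying energy ΔE can flicker in and out of existence for durations of roughly ℏ/(2ΔE): far too briefly to be detected individually, yet long enough to leave collective, measurable traces such as the Lamb shift discussed in the notes.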

This account certainly paints a far more lively and dynamic picture of matter’s behavior than what had prevailed previously. Nevertheless, the vacuum fluctuations of the particle-fields of quantum field theory occur only within a preexisting and fixed background spacetime. In other words, quantum field theory works only by ignoring the gravitational field. Moreover, if the measurement problem is understood as marking a purely epistemological limit, as it generally is, then despite the continual vacuum jittering, matter is still treated as if it cannot generate any novel trajectories for itself. The total set of possible trajectories, in other words, remains just as eternal and unchanging as in the atomists’ account. And thus, matter remains an essentially passive, non-relational substance confined to fixed mathematical and epistemological probability ranges.

Despite their differences, then, all three of these major kinds of materialism nonetheless treat matter as essentially passive and treat space and time as fixed, background givens.

II. Loop Quantum Gravity

By treating spacetime as a fixed background, quantum theories effectively leave gravity to the macro-realm of the theory of general relativity. The effort to unify quantum theory with general relativity in a single framework is called quantum gravity theory, which many consider the holy grail of contemporary physics, although it has yet to be experimentally confirmed.

Quantum gravity theory begins by extending quantum formalisms all the way down to the lowest quantifiable limit marked by Planck’s constant. At this scale, there is no background-dependent frame of spacetime or gravity by which to measure particles. There, the geometry of spacetime is measured instead by the yardstick of the Planck length (1.6 × 10⁻³⁵ meters). In quantum gravity theory, the Planck length thus operates as a kind of natural cut-off of all measurable reality. It thereby also acts as the foundational point of reference for applying the same probability-based mathematical framework of quantum theory, now to the behavior of spacetime itself as a quantum gravitational field. The radical consequence of this move for the theory and history of materialism is that it has finally released matter from the metaphysical cage of a fixed background spacetime.
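The figure cited above is not arbitrary: the Planck length is the unique combination of the constants governing quantum theory (ℏ), gravitation (G), and relativity (c) that yields a length,

\[
\ell_P \;=\; \sqrt{\frac{\hbar G}{c^{3}}} \;\approx\; 1.6 \times 10^{-35}\ \text{m},
\]

which is why it serves as the natural yardstick at the scale where quantum theory and gravitation can no longer be kept apart.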

The problem with combining general relativity and quantum mechanics, however, is that the combined picture tells us that physically measuring spacetime all the way down to the Planck-scale would require energies so high that the measurement itself could actually create a micro-black hole. At the Planck-level, then, some theorize that spacetime itself dissolves into an ocean of “virtual” black holes bubbling wildly in and out of (virtual) existence like a kind of “spacetime foam.” As the smallest fluctuations of spacetime, these micro-black holes provide a dramatic manifestation of the Planck-limit, a zone of utter ontological indeterminacy between “reality” and “non-reality,” or “being” and “nothingness.” Importantly, while virtual black holes share this ontological indeterminacy with the virtual particles of quantum field theory, the crucial difference is that the latter are fluctuations of a vacuum in (a smooth, background) spacetime, whereas the former are fluctuations of spacetime itself, thereby rendering even what is ostensibly the most fundamental or basic level of reality inherently indeterminate as well.
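A rough version of the standard heuristic behind this claim, elaborated more fully in note 11 below and stated here only to orders of magnitude, runs as follows. Localizing a photon within a region of size Δx forces its momentum, energy, and effective mass upward, and a horizon forms once that mass-energy fits within its own Schwarzschild radius:

\[
p \sim \frac{\hbar}{\Delta x}, \qquad
E \sim \frac{\hbar c}{\Delta x}, \qquad
m \sim \frac{E}{c^{2}} = \frac{\hbar}{c\,\Delta x},
\]

\[
\Delta x \;\lesssim\; \frac{2 G m}{c^{2}} \sim \frac{2 G \hbar}{c^{3}\,\Delta x}
\quad\Longrightarrow\quad
\Delta x \;\sim\; \sqrt{\frac{G \hbar}{c^{3}}} \;=\; \ell_P .
\]

The attempt to measure a Planck-sized region thus undermines itself: probing that small a patch of spacetime collapses it into a micro-black hole, which is why the Planck length functions as a cut-off on measurable reality.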

The two major prevailing quantum gravity theories, loop quantum gravity and string theory, both look for the emergent structure of spacetime at the limits of where it breaks down: black holes. Despite their important contributions, however, our argument is that both theories nevertheless end up preserving certain mechanistic assumptions about matter by ignoring the indeterminacy of black holes.

Loop quantum gravity, for example, is rooted in Carlo Rovelli’s deeply relational account of quantum mechanics in which all entities manifest only through particular interactions (or “observations”) that may or may not involve humans. This relational ontology comes to an abrupt end, however, right at the Planck-scale. Below the Planck-scale, Rovelli maintains, there is simply “nothing,” and thus spacetime fundamentally comprises discrete “atoms” or “grains” whose possible magnitudes come in discrete Planckian units. Even though everything, including spacetime, manifests relationally, for Rovelli, those manifestations still obey predetermined and unchanging probability distributions that are ultimately rooted in (Planck-based) possibilities—as numerous as those might be. The problem is that Rovelli treats the Planck-scale as if it were an absolute, preexisting basis for an unchanging mathematical formalism that humans simply discovered and not, as we argue it is, an active zone of generative, experimental indeterminacy. In our view, the more parsimonious account is that there simply is no such non-relational limit that is not always co-constituted by acts of observation (human or not). In a sense, Rovelli has tried to bury the measurement problem at the Planck-scale in an effort to rescue a mathematical account of matter’s behavior even there. In doing so, however, he must erase what could instead be affirmed as matter’s inherent quantum indeterminacy by limiting matter in advance to the possibility range of only Planck-sized spacetime changes.

Given its pivotal role in prevailing math and physics, the Planck-scale certainly seems to mark a formal lower limit to reality as scientists are currently able to measure and observe it. Formal limits, however, are not the same as experimental or empirical ones. And as Rovelli himself readily acknowledges, loop quantum gravity’s current theorizing does not precisely converge with physical observations and, thus, “something is missing.” We agree. But instead of simply continuing the quest to find an ever more precise or comprehensive means of explaining away that “something” by rendering it quantifiable, we would like to suggest a different response: to embrace that “something” as indexing a physically minute yet dramatic and vivid instance of matter’s inherent ontological indeterminacy and creativity.

Let’s look now at how string theory fares on the issue of black hole materialism.

III. String Theory

Following his shocking discovery that black holes radiate energy, Stephen Hawking advanced the even more shocking claim that black holes swallow and destroy information. Because the destruction of information would directly violate the law of information conservation, Hawking’s claim became known as the “black hole information paradox,” motivating numerous attempts to resolve it.
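The result can be summarized by the temperature Hawking derived for a black hole of mass M, written here in standard notation,

\[
T_H \;=\; \frac{\hbar c^{3}}{8 \pi G M k_B},
\]

which implies that black holes glow, shed mass, and radiate ever more fiercely as they shrink, so that whatever fell in eventually seems to disperse as thermal radiation carrying no obvious trace of the information it contained.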

In 1993, string theorist Leonard Susskind made a brilliant contribution to such efforts with his theory of “black hole complementarity.” Returning to quantum mechanical basics, Susskind argued that two mutually exclusive explanations for what happens to information heading towards a black hole are in fact equally valid, depending on the observational vantage. Specifically, if we watched someone fall into a black hole, we would see them slow down and stretch out at the horizon, eventually becoming scrambled into radiation, whereas the person falling in would cross the horizon without noticing anything unusual until being destroyed much later.
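The asymmetry between the two vantage points reflects gravitational time dilation: in the Schwarzschild geometry, a clock held at radius r runs slow relative to a distant observer’s clock by the factor

\[
\frac{d\tau}{dt} \;=\; \sqrt{1 - \frac{r_s}{r}}, \qquad r_s = \frac{2 G M}{c^{2}},
\]

which approaches zero at the horizon r = r_s. To the outside observer the infalling person therefore appears to freeze and redshift at the horizon, even though nothing locally marks the crossing for the person falling in.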

Black hole complementarity eventually led to the “holographic principle,” string theory’s now highly influential proposed solution to the information paradox. Holograms, in essence, are three-dimensional projections produced by light interacting with information that is stored on a two-dimensional film. Similarly, the holographic principle essentially posits that black holes project three-dimensional holograms of their own information, which they store on the two-dimensional surface area of their horizons. Which direction the hologram is projected, moreover—outside or inside of the horizon—simply depends on the location of the observer, who acts as the light-source reconstructing the information.
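The quantitative motivation usually given for this surface-storage picture is the Bekenstein-Hawking result that a black hole’s entropy, and with it the amount of information it can hold, scales with the area A of its horizon rather than with its volume, roughly one unit of entropy per Planck-sized patch:

\[
S_{BH} \;=\; \frac{k_B c^{3} A}{4 G \hbar} \;=\; \frac{k_B A}{4\,\ell_P^{2}} .
\]

It is this area law that suggests a two-dimensional surface can encode everything about a three-dimensional interior.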

While the holographic principle constitutes an ingenious solution to both the information paradox and the measurement problem, the math on which it rests again works only by bracketing out or erasing reality’s apparent indeterminacy (e.g., information’s location). We see no empirical basis for this bracketing out, however; it merely enables formal, mathematical prediction by uncritically upholding the ancient Western assumption of matter’s intrinsic passivity. Instead of assuming in advance that the universe is made up of a fixed and finite total amount of information, all of which must also have a determinate material location, we believe the theory of black hole complementarity could help us appreciate and affirm matter’s inherent generative indeterminacy.

We thus argue, by contrast, that the more compelling and parsimonious view is that the measurement problem simply cannot be overcome: indeterminacy and complementarity are themselves fundamental. If measuring Planckian spacetime results in micro-black holes, this means measurement is integral not only to the identity of and relationship between black holes and spacetime but to the Planck-limit as well. Moreover, as Rovelli and others have argued, quantum measurement applies to all interactions, whether involving humans or not. And if we ourselves are material beings through and through, then matter produces the Planck-scale in our presence just as it must produce non-Planckian scales in our absence. And thus, contrary to both string theory and loop quantum gravity, we think the Planck-limit (and its possible sea of micro-black holes) does not reveal a preexisting limit to reality but instead confronts us with matter’s fundamentally indeterminate, relational and generative nature.

IV. Black Hole (New) Materialism

One key implication of this elaboration is that there simply cannot be any absolute, preexisting limit to material reality that would remain unchanged or unaffected by our observations of it. Quantum field theory’s notion of vacuum fluctuations is a means of rethinking “void” as a spectacularly lively and creative place. In particular, we are inspired by the physicist Karen Barad’s new materialist account of these quantum fluctuations as a vivid dramatization of matter’s inherent and creative ontological indeterminacy. As she makes clear, moreover, her argument “is [not] limited to the domain of the small. On the contrary, the play of indeterminacies is ontologically prior to notions of scale and, more generally, space and time.”

We agree with Barad’s argument that matter’s indeterminacy precedes—and thus also pervades—all scales of reality. However, while Barad has developed and discussed this interpretation in relation to quantum mechanics and quantum field theory, to our knowledge she has yet to directly theorize indeterminacy at the level of quantum gravity or in relation to black holes. Part of the motivation of this paper is thus to develop a generative or performative new materialist account of spacetime and black holes.

To be very clear, we certainly have no wish to deny or in any way diminish the extremely impressive achievements of Western math and science in explaining and predicting matter’s behavior. Nevertheless, as every serious scientist will readily admit, we are not yet—and indeed never will be—able to fully explain and predict matter’s behavior. What we therefore wish to propose, in light of our discussion about black hole indeterminacy, is a different way of understanding the limits of our knowledge. Rather than continuing to see those limits as merely marking the current boundaries of our knowledge of inherently non-generative matter, we wish to suggest that there are empirically compelling as well as ethically, existentially, and politically urgent reasons for a very different understanding of the nature of matter as indeterminate.

In our view, even an omniscient being with infinite knowledge would not be able to exhaustively quantify matter because matter is inherently indeterminate, generative, and relational. Moreover, since we humans are just as fully material as anything else, any act of human observation of matter—including even ostensibly “formal” kinds of observations—must also play a role in constituting matter. And so, the belief that matter could ever be circumscribed as a closed or bounded system with a fixed range of possibilities that obeys invariant laws seems to us to amount to a dangerous delusion rooted in human hubris or fear. Such a view may well serve to maintain the idea that we are cosmically exceptional beings given our ostensible ability to stand apart from mechanical matter and learn how to exert an increasing degree of control over it. In the face of the rapidly escalating global ecological crisis that has in many ways resulted from efforts to realize such a fantasy, however, there seems to be abundant—and urgent—reason to rethink such a view.

V. Generative Materialism

It is our position that black holes provide the historical foundations for a whole new theory of materialism, in which matter is no longer passive, negative, or even simply probabilistic. This new performative or “pedetic” materialism is defined by three core features that we think have important consequences across the sciences and humanities. The full elaboration of these features requires a much longer paper, so in its place we would like simply to flag them here as the beginnings of a new black hole materialism.

  1. If virtual micro-black holes index the primordial creativity of all matter, spacetime, and quantum fluctuations, then all matter, from the Planck scale up to the macro level, must also be defined by a pedetic or generative indeterminacy. This means that natural laws, including the Planck limit and probabilities, are products of a more primary indeterminate process subject to iterative change and revision over time.
  2. Matter is not a continuous or discrete substance moving in spacetime; it produces spacetime. Black holes are neither passive matter nor empty voids but active processes: the transformative mutagens of the Cosmos. If we consider what is known about black holes without bracketing out the measurement problem and indeterminacy, it points to an interpretation in which matter is an endlessly fluctuating, creative process more fundamental than spacetime or the Planck scale, both of which must be emergent features of those quantum processes, not background laws governing them. At the bottom of a black hole we do not find the infinitely small singularity that general relativity predicted, but rather indeterminately moving energy, without any more fundamental explanation.
  3. Black hole indeterminacy points to an interpretation in which matter manifests itself relationally and immanently. Matter is not the passive or random effect of something else external to it.

Notes

  1. “Atom” derives from the Greek word atomos, meaning “uncuttable.”
  2. DK 68A135.
  3. Ibid.
  4. What Daniel W. Graham describes as the atomists’ “closed system of natural explanation,” Explaining the Cosmos: The Ionian Tradition of Scientific Philosophy (Princeton University Press, 2006), 15.
  5. Indeed, this tradition had an enormous influence on modern science, where it was later supplemented by deism and vitalism from the fifteenth to eighteenth centuries. 
  6. The “Lamb shift,” for example, refers to the tiny yet measurable effect of vacuum fluctuations on the energy levels of a hydrogen atom’s electron.
  7. This assumption is defensible on practical grounds just above the Planck-scale, where quantum gravitational effects are negligible. At the Planck-scale, however, spacetime appears to warp and jitter wildly, which would wreak havoc on the theory’s descriptions of the other quantum fields. For this reason, quantum field theory effectively ignores gravity and treats spacetime as a fixed background given instead.
  8. This is why it is generally discussed in terms of “uncertainty,” in contrast to what we are referring to as (ontological) indeterminacy, a distinction we adopt from Karen Barad’s Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Durham, NC: Duke University Press, 2007).
  9. “The cosmological constant happens to function in the theory as a natural IR cut-off, next to the UV cut-off provided by the Planck length.” Carlo Rovelli, Covariant Loop Quantum Gravity: An Elementary Introduction to Quantum Gravity and Spinfoam Theory (Cambridge: Cambridge University Press), chapter 6.
  10. There is ongoing debate over just how background-independence should be defined and which theories do or do not exhibit it. The kind of background-independence we describe in this section corresponds most closely to that of loop quantum gravity, although in its own way string theory arguably exhibits background-independence as well. For a more detailed discussion of background-independence in quantum gravity theories, see Steven Weinstein and Dean Rickles, “Quantum Gravity,” in Edward N. Zalta, ed., The Stanford Encyclopedia of Philosophy (Summer 2018 Edition), https://plato.stanford.edu/archives/sum2018/entries/quantum-gravity/.
  11. Cosmic black holes, as Einstein first predicted, form when massive stars become so dense near the end of their life cycle that they collapse. When combined with quantum mechanics, however, the same factors should lead to the formation of micro-black holes as well, which is precisely the predicted outcome of measuring a small enough region of spacetime. In essence, measuring such a region would require locating something in it, like a photon. Per Heisenberg’s uncertainty principle, the more precisely we determine the photon’s position—in an ever-smaller region—the greater its momentum and energy will become. Greater energy of course, per Einstein, means greater mass. The photon’s mass would thus become increasingly large and concentrated within a smaller and smaller spacetime region, eventually causing that region to collapse upon itself, thus forming a micro-black hole right around the Planck length.
  12. Stephen Hawking, “Virtual Black Holes,” Physical Review D 53 (March 1996): 3099. See also Gerard ’t Hooft, “Virtual Black Holes and Space–time Structure,” Foundations of Physics (2018): 1-16; Carlo Rovelli, Reality Is Not What It Seems: The Journey to Quantum Gravity (New York: Riverhead Books), 248.
  13. Carlo Rovelli, “Relational Quantum Mechanics,” International Journal of Theoretical Physics 35, no. 8 (1996): 1637-1678.
  14. “Below [the Planck scale], nothing more is accessible. More precisely, nothing exists there.” Rovelli, Reality, 152.
  15. For a discussion of these values or “eigenvalues” and how to derive them, see Rovelli, Reality, 165.
  16. In essence, which of the possible values spacetime actualizes in a given relation occurs according to the same probabilistic logic that defines quantum mechanics generally. For an accessible discussion of this view, see Rovelli, Reality, 175-196.
  17. For an accessible discussion of this view, see Rovelli, Reality, 175-196.
  18. Carlo Rovelli, “Loop Quantum Gravity,” in Bernard D’Espagnat and Hervé Zwirn, eds., The Quantum World: Philosophical Debates on Quantum Physics (Cham, Switzerland: Springer International Publishing, 2014), 288.
  19. This view was based on general relativity’s contention that anything crossing a black hole horizon would be unable to escape back out, because doing so would require exceeding the speed of light, and would instead simply accelerate towards the certain doom of the singularity lying at the black hole’s center.
  20. Leonard Susskind, The Black Hole War: My Battle with Stephen Hawking to Make the World Safe for Quantum Mechanics (Boston, Mass: Little, Brown, 2009).
  21. Leonard Susskind, Larus Thorlacius, and John Uglum, “The Stretched Horizon and Black Hole Complementarity,” Physical Review D. 48, no. 8 (1993): 3743–3761.
  22. Complementarity is a foundational notion of quantum mechanics, first proposed by Niels Bohr. In the famous double-slit experiment, for example, light appears as either a wave (extended in space) or a particle (localized in space), two inherently mutually exclusive notions, depending on the experimental arrangement or apparatus observing it.
  23. Holograms are three-dimensional projections produced by light interacting with information stored on a two-dimensional film. Similarly, the holographic principle essentially posits that black holes project three-dimensional holograms of their own information, which they store on the two-dimensional surface area of their horizons. Thus, which direction the hologram is projected—outside or inside of the horizon—simply depends on the location of the observer, who acts as the light-source reconstructing the information.
  24. To date, however, the holographic principle is only able to account for the very formal, hypothetical kind of black hole that occurs in an enclosed kind of spacetime called “Anti de Sitter Space” (ADS). “With one exception,” Susskind notes, ADS black holes “have all the features of ordinary black holes” (The Black Hole War, 406-407). That exception is that ADS black holes do not evaporate. Instead, they are contained inside a kind of hermetically sealed spacetime box that lets absolutely nothing out. In our view, then, by enclosing spacetime within it, this box renders black holes amenable to causal quantification much the way seeing nature as an enclosed system rendered it amenable to formal analysis for the atomists.
  25. This relational view might even provide a basis for what Rovelli calls the “open possibility” that “dark matter is composed of black holes,” Reality, 128.
  26. As quantum physics theorist and philosopher Karen Barad puts it in her wonderful ode to “nothingness,” “the play of indeterminacies is ontologically prior to notions of scale and, more generally, space and time.” Karen Barad, “What Is the Measure of Nothingness: Infinity, Virtuality, Justice,” Documenta, 13 (2012): 5-6n2 (emphasis added).
  27. In Baradian terms, an “intra-action” is distinguished from “inter-action.” In an inter-action, two entities determinately preexist their encounter. An intra-action, in contrast, involves the determinate, relational constitution of inherently indeterminate matter (e.g., the constitution of inherently indeterminate light as a wave or particle when part of a particular experimental apparatus).
  28. “What Is the Measure of Nothingness: Infinity, Virtuality, Justice,” Documenta, 13 (2012): 5-6n2 (emphasis added).
  29. In an earlier, succinct elaboration of this argument, moreover, Barad addresses the Planck scale directly: “Planck's constant marks something even more fundamental about nature. While it is sometimes said that the quantum is a measure of the graininess of nature, what is at issue is actually not a particular property of nature but the very nature of nature. The sense in which this discontinuity is an ‘essential’ one is not that nature has a fixed essence, but that nature's lack of a fixed essence is essential to what it is,” Meeting the Universe Halfway, 422n15 (emphasis added).
  30. The only two—passing—mentions of quantum gravity theories that we are aware of Barad making are in Meeting the Universe Halfway: 351 and 462n101.
  31. In many ways that we lack the space to do justice to here, the generative theory of materialism that we advance in this paper is deeply indebted to, inspired by and fundamentally consistent with the brilliant work of Karen Barad. Nevertheless, whereas Barad has developed and discussed her account in relation to quantum mechanics and quantum field theory, to our knowledge she has yet to directly address quantum gravity or black holes in a sustained way. As a result, her account does not engage the literature that would directly enable the development of a generative view of spacetime itself, which is our aim in this paper. See her path-breaking book, Karen Barad, Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning (Durham, NC: Duke University Press, 2007).
  32. For further development of these features see: Christopher N. Gamble, Joshua S. Hanan & Thomas Nail, “What is New Materialism?” Angelaki 24 (6): 111-134 (2019), doi:10.1080/0969725X.2019.1684704; Thomas Nail, Being and Motion (Oxford: Oxford University Press, 2018); Christopher N. Gamble & Joshua S. Hanan (2016) Figures of entanglement: special issue introduction, Review of Communication, 16:4, 265-280.

Cite this Essay

Gamble, Christopher N. and Thomas Nail. “Black Hole Materialism.” Rhizomes: Cultural Studies in Emerging Knowledge, no. 36, 2020, doi:10.20415/rhiz/036.e08.