Participatory Spirituality for the 21st Century
A ‘power law’ refers to a statistical relationship between two quantities in which a relative change in one produces a proportional relative change in the other. One property of this law is scale invariance, otherwise known as being ‘scale-free’: the same proportion repeats at every scale in a self-similar pattern. Mathematical fractals are an example of such a power law. Power laws have often been treated as universal, applied to any and all phenomena as supposed proof of the law's universality.
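The scale-invariance property can be checked directly: for f(x) = C·x^(−α), rescaling x by any factor k multiplies f by the constant k^(−α), regardless of x. A minimal sketch (the constants C, α and the test points are arbitrary illustrative choices):

```python
# Scale invariance of a power law: f(k*x) = k**(-alpha) * f(x) for every x.
# C, alpha, and the rescaling factors below are arbitrary illustrative choices.

def f(x, C=2.0, alpha=1.5):
    """A power-law function f(x) = C * x**(-alpha)."""
    return C * x ** (-alpha)

for k in (2.0, 10.0, 100.0):  # rescaling factors
    ratios = [f(k * x) / f(x) for x in (0.5, 1.0, 7.0, 300.0)]
    # The ratio is the same at every x: the law has no characteristic scale.
    assert all(abs(r - k ** (-1.5)) < 1e-9 for r in ratios)
    print(f"k={k}: f(kx)/f(x) = {ratios[0]:.6f} at every x")
```

An exponential decay, by contrast, fails this test: its ratio f(kx)/f(x) depends on x, which is what it means to have a characteristic scale.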
However, a recent study (Broido and Clauset, 2019) claims that “scale-free networks are rare.” They conducted an extensive review of one thousand social, biological, technological and information networks using state-of-the-art statistical methods and concluded exactly what their title states. On the contrary, “log-normal distributions fit the data as well or better than power laws,” and scale-free structure is “not an empirically universal pattern.” Hence it should not be used to model and analyze real-world structures.
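As a rough, self-contained sketch of the kind of model comparison at issue (Broido and Clauset's actual pipeline uses the Clauset–Shalizi–Newman machinery, with estimated xmin values and likelihood-ratio tests; this toy version fixes xmin = 1 and simply compares maximum log-likelihoods on synthetic, log-normally distributed data):

```python
import math, random

random.seed(42)
# Synthetic "degree-like" data drawn from a log-normal distribution.
data = [math.exp(random.gauss(1.0, 0.8)) for _ in range(5000)]
data = [x for x in data if x >= 1.0]  # fix xmin = 1 for the power-law model
n = len(data)

# Continuous power-law MLE with xmin = 1: alpha = 1 + n / sum(ln x),
# density p(x) = (alpha - 1) * x**(-alpha).
alpha = 1.0 + n / sum(math.log(x) for x in data)
ll_pl = n * math.log(alpha - 1.0) - alpha * sum(math.log(x) for x in data)

# Log-normal MLE: mu and sigma estimated from the logs of the data.
logs = [math.log(x) for x in data]
mu = sum(logs) / n
sigma = math.sqrt(sum((v - mu) ** 2 for v in logs) / n)
ll_ln = sum(-math.log(x * sigma * math.sqrt(2 * math.pi))
            - (math.log(x) - mu) ** 2 / (2 * sigma ** 2) for x in data)

print(f"power-law log-likelihood:  {ll_pl:.1f}")
print(f"log-normal log-likelihood: {ll_ln:.1f}")
# On heavy-tailed but not power-law data, the log-normal fits better.
assert ll_ln > ll_pl
```

Both distributions are heavy-tailed and can look alike on a log-log plot, which is part of why the power-law diagnosis was so often made informally; a likelihood comparison like this one is what distinguishes them.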
So why the fascination with trying to fit nearly all phenomena into the scale-free paradigm? Holme (2019) reviews the above article and the overall power-law issue and notes that “in the Platonic realm of simple mechanistic models, extrapolated to infinite system size, the concepts of emergence, universality and scale-freeness are well-defined and clear. However, in the real world, where systems are finite and many forces affect them, they become blurry.” Klarreich (2018) reviewed an earlier version of Broido's paper and noted that, per mathematician Steven Strogatz, in physics there is a “power law religion.”
So what is the root of this religion? Holme nailed it when he said the power law universally applies only “in the Platonic realm.” This is a long-held guiding myth that has remained strong in math. Lakoff and Nunez (2001) dispel this myth, noting that there is no proof of an a priori mathematics; it is purely a premised axiom with no empirical foundation. Like the conception of God, it is religious faith. We can only understand math with the mind and brain, so we must understand how that brain and mind perceive and conceive. Hence there is no one correct or universal math; there are equally valid but mutually inconsistent maths, depending on one's premised axioms (354-55). This is because math too is founded on embodied basic categories and metaphors, on which particular axioms are unconsciously based (and biased), and it can go in a multitude of valid inferential directions depending on which metaphor (or blend) is used in a particular context. They dispel the myth of a transcendent, Platonic math while validating a plurality of useful and accurate maths.
However, Lakoff & Nunez do not see the above as relativistic postmodernism (pomo), because of empirically demonstrated, convergent scientific evidence of the universal, embodied grounding of knowledge via image schemas and basic categories, extended through metaphor. They see both transcendent math and pomo as a priori investments. And they too affirm universal validity, but through empirical methodology, not a priori speculation.
Lakoff (1987) also points out the following:
"The classical theory of categories provides a link between objectivist metaphysics and set-theoretical models.... Objectivist metaphysics goes beyond the metaphysics of basic realism...[which] merely assumes that there is a reality of some sort.... It additionally assumes that reality is correctly and completely structured in a way that can be modeled by set-theoretic models" (159).
He argues that this arises from the correspondence-representation model, a model that has been legitimately questioned by postmetaphysical thinking.
See also the above on the idealistic assumptions of modeling, which came from a type of complexity theory that likewise assumed the universality of scale-free networks, even though most actual networks do not display this kind of mathematical distribution.
A prime example of the power law religion is found in the model of hierarchical complexity (MHC). Commons (2008) admits its Platonic roots when he says: “The ideal truth is the mathematical forms of Platonic ideal.” Granted, he qualifies this statement by noting the difference between the ideal and the real: we cannot know the ideal as pure form, only as it manifests in the real. He further notes that Aristotle elucidated the real with postulates of logic, yet these too come from a priori axioms without empirical grounding. Yes, the logical entailments of his logic follow mathematical rules, but the axioms are presupposed a priori and taken as given. The MHC, then, is a combination of the ideal “perfect form, as Plato would have described it,” with the representation of that form in the real domain. The duality of the ideal and the real is apparent.
Lakoff and Johnson (1999, Ch. 7) show that abstract set theory has no connection to embodiment:
“Spatial relations concepts (image schemas), which fit visual scenes, are not characterizable in terms of set-theoretical structures. Motor concepts (verbs of bodily movement), which fit the body's motor schemas, cannot be characterized by set-theoretical models. Set-theoretical models simply do not have the kind of structure needed to fit visual scenes or motor schemas, since all they have in them are abstract entities, sets of those entities, and sets of those sets. These models have no structure appropriate to embodied meaning-no motor schemas, no visual or imagistic mechanisms, and no metaphor."
However, Lakoff and Nunez note that math per se is not merely socially constructed:
"In recognizing all the ways that mathematics makes use of cognitive universals and universal aspects of experience, the theory of embodied mathematics explicitly rejects any possible claim that mathematics is arbitrarily shaped by history and culture alone. Indeed, the embodiment of mathematics accounts for real properties of mathematics that a radical cultural relativism would deny or ignore: conceptual stability, stability of inference, precision, consistency, generalizability, discoverability, calculability, and real utility in describing the world" (362).
Nonetheless, the MHC used a particular kind of set theory where sets cannot be members of themselves, though in other set theories they can:
“There are lots and lots of set theories, each defined by different axioms. You can construct a set theory in which the Continuum hypothesis is true and a set theory in which it is false. You can construct a set theory in which sets cannot be members of themselves and a set theory in which sets can be members of themselves. It is just a matter of which axioms you choose, and each collection of axioms defines a different subject matter. Yet each such subject matter is itself a viable and self-consistent form of mathematics. [...] There is no one true set theory" (WMCF, 355).
Commons, Ross, Miller (2010) note that Axiom 1 of the MHC is based on set theory and the orders are scale-free.
“Axiom 1 of the Model of Hierarchical Complexity (Commons, Goodheart, et al., 2008) posits that, consistent with Piaget, higher order actions are defined in terms of two or more lower-order actions. In terms of set theory, A = {a, b} where A is the higher order set, and a and b are lower order actions that are elements of that set A. Note that the element a cannot equal the set A. An element cannot equal a set formed out of that element.”
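The element/set distinction in Axiom 1 can be mimicked with Python's frozensets; the action labels below are illustrative, not drawn from the MHC literature:

```python
# Illustrating MHC Axiom 1 with frozensets: a higher-order action A is a set
# of lower-order actions, and no element of A can equal A itself.
a = "addition"           # a lower-order action (illustrative label)
b = "subtraction"        # another lower-order action
A = frozenset({a, b})    # the higher-order action coordinating a and b

assert a in A and b in A       # lower orders are elements of the higher order
assert a != A and b != A       # an element cannot equal the set formed from it

# Orders can be stacked: the next order is a set containing previous orders.
B = frozenset({A, frozenset({"multiplication"})})
assert A in B and A != B
print("Axiom 1 hierarchy holds: element != set at every level")
```

Python's frozensets, being immutable and built only from already-existing objects, cannot contain themselves, so they behave like a well-founded set theory; the alternative set theories Lakoff and Nunez mention, in which sets can be members of themselves, relax exactly that restriction.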
Sara Ross (2014) goes further, claiming that the MHC’s orders are scale-free and fractal.
"To possess 'universal, scale-free' properties means the MHC’s orders of hierarchical complexity are fractal. Fractal means the repetition of self-similar patterns at different scales. Behavioral scales from the micro-biological to large social systems evidence the orders of hierarchical complexity (see Commons & Ross, 2008). The fractal transition theory is proposed as a universal, scale-free general model as well.”
Here we are seeing the power law religion in action, given that the MHC’s orders are ideal and they develop from scale-free power laws. And yet as Broido and Clauset noted above, real networks rarely display scale-free power laws. They further noted that given the empirical data, different models would be required to explain these other networks, that the scale-free model, while perhaps applicable for a few real-world networks, was inadequate to the task.
One example of a real world network is the human brain connectome. Gastner and Odor (2016) note that the connectome is not scale-free, so why use scale-free models to measure it? And why then extend scale-free models to everything? As noted above, is it a power law religion of the ideal imposed on phenomena?
"The structural human connectome (i.e. the network of fiber connections in the brain) can be analyzed at ever finer spatial resolution thanks to advances in neuroimaging. Here we analyze several large data sets for the human brain network made available by the Open Connectome Project. We apply statistical model selection to characterize the degree distributions of graphs containing up to [...] nodes and edges. A three-parameter generalized Weibull (also known as a stretched exponential) distribution is a good fit to most of the observed degree distributions. For almost all networks, simple power laws cannot fit the data, but in some cases there is statistical support for power laws with an exponential cutoff."
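The difference between these fits shows up in the local log-log slope of the complementary CDF: a pure power law is a straight line on log-log axes, while a stretched exponential steepens without bound. A small sketch (the exponent, shape and scale parameters are arbitrary illustrative values, not fits to connectome data):

```python
import math

def ccdf_power(x, alpha=2.0):
    """P(X > x) for a pure power law with xmin = 1."""
    return x ** (1 - alpha)

def ccdf_weibull(x, k=0.5, lam=5.0):
    """P(X > x) for a stretched exponential (Weibull with shape k < 1)."""
    return math.exp(-((x / lam) ** k))

def loglog_slope(ccdf, x, h=1e-6):
    """Local slope of the CCDF on log-log axes at x."""
    return (math.log(ccdf(x * (1 + h))) - math.log(ccdf(x))) / math.log(1 + h)

for x in (2.0, 20.0, 200.0):
    print(f"x={x}: power-law slope={loglog_slope(ccdf_power, x):+.3f}, "
          f"Weibull slope={loglog_slope(ccdf_weibull, x):+.3f}")

# The power law keeps a constant slope across scales; the stretched
# exponential's slope keeps getting steeper, so no single power-law
# exponent can describe it.
assert abs(loglog_slope(ccdf_power, 2.0) - loglog_slope(ccdf_power, 200.0)) < 1e-3
assert loglog_slope(ccdf_weibull, 200.0) < loglog_slope(ccdf_weibull, 2.0) - 1.0
```

This is why visual straight-line fits on log-log plots were so misleading: over a narrow range of scales, the stretched exponential can look locally linear.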
A recent neuroimaging study (Smith et al., 2019) on brain connectome hierarchical complexity (HC) seems to support my notion that, like basic categories in cognitive science, HC arises from the middle out as 'bridges' rather than bottom-up or top-down. E.g.
"Dividing the connectomes into four tiers based on degree magnitudes indicates that the most complex nodes are neither those with the highest nor lowest degrees but are instead found in the middle tiers. […] The most complex tier (Tier 3) involves regions believed to bridge high-order cognitive (Tier 1) and low-order sensorimotor processing (Tier 2)."
"The results show that hub nodes (Tier 1(t)) and peripheral nodes (Tier 4(b)) are contributing less to the greater complexity exhibited in the human brain connectome than middle tiers. In fact, this is particularly true of hub nodes."
Also note that "this concerns wholly separate considerations of topology to the well-known paradigms of small-world and scale-free complex networks," making it one of those new models that responds to the empirical data rather than trying to fit the latter into a one-size-fits-all scale-free model.
This appears to be a matter of the guiding metaphors used in defining worldviews, which over time transcend and replace their forebears given new information in a different time and context (Berge, 2019). Sociological worldviews form a continuum in the broad categories of pre-modern, modern, postmodern and metamodern. The modern, scientific worldview is based on a mechanistic metaphor, with later iterations extending that metaphor to the computer. Both are premised on dualisms of various sorts, like the difference and separation of body and mind, or ideal and real; with computers, on and off, one pole in the duality being the source and the other its logical result. It's an either/or logic of the Aristotelian type noted above, itself based on a priori axioms.
The postmodern metaphor turned this dualism around, claiming that the ideal was fantasy, that only the concrete, real world had validity, the ideal just being so much hierarchical power relations over the real without any basis. However, the metamodern worldview syntegrates this dualism by acknowledging duality, but also everywhere in between the poles. In fact, it quits thinking in terms of poles altogether, e.g. Abramson (2014):
"But a still more intriguing question is whether antipodal analyses are any longer useful, or whether the time has come to speak of multiple dimensions of reality, actualities that are irresolvably contradictory and deliberately incalculable, and a state of affective response in which contemporary humans feel perpetually overwhelmed, but not critically degeneratively so. Whereas postmodern theories of hyperreality invariably metaphorized erasure of the line between fact and fiction as a gradual process of degeneration, collapse, and decomposition, metamodernism approaches contradiction, paradox, and ambiguity as reconstructive forces, and emphasizes not singularity qua collapse but multiplicity qua transcendence. […] The question to be asked of and into contemporary culture, then, is [...] indeed a transcendent metamodern condition in which the poles themselves have disappeared and we, collectively and individually, have found in the middle space between them an entirely new site of 'reconstructive deconstruction'" (7-8).
This is consistent with Lakoff and Johnson’s cognitive science, a reconstruction of an empirical plurality of mathematics, allowing for their “contradiction and paradox,” yet grounded in our universal embodiment in the spaces between metaphysical paradigms.
Chuck Pezeshki replied:
Power law religion arises out of the typical academic Authority-driven/Legalistic v-Meme, as a status assertion that one (that's the authority part) has found the single constant for the universe (that's the legalistic/absolutistic part.) Not surprised at all, esp. relating it all back to some version of Plato.
Geoffrey West goes nuts on this (he used to be the head of the Santa Fe Institute) in his book 'Scale'. Some useful stuff does come out of this obsession. The book is long, but his concepts of sublinear/superlinear re: energetic flows is good, and useful as far as understanding open vs. closed systems. Bottom line? Open systems can live on, but closed systems that restrict information flow (like companies) are, sooner or later, gonna die.
Obviously (well, at least to me) most real systems are what we call multi-fractal -- the 'fractal/self-similar' part constraining itself to whatever the dominant physics are at that level. A basic example? A drop of water is constrained by surface tension, hence that little bubble on your spoon, if it's just a drop. If it's a bunch of drops, otherwise known as a river, a different set of scaling physics overrides, and, well, the river flows and makes vortices and such. It's not that hard. Unless, of course, you want to be the dude/dudette getting your name on the magic #.
This all maps to Edward's worldview stuff -- different mechanisms producing different scaling. Like a good professor, I'll leave it on the board for those still reading this and NOT asleep to finish the calcs. :-) And for those particularly inclined, you can ponder the meta-linear nature of the transformation above for MHC -- chained operators do not nonlinear jumps make.
I did like the stuff about the connectome spreading out from the middle scales to larger and smaller. Cool!
I responded:
I did mention multi-fractals to the Yahoo Adult Development group. E.g., from this [1] article:
"Multifractals are more highly advanced mathematical structures: fractals of fractals. They arise from fractals 'interwoven' with each other in an appropriate manner and in appropriate proportions. Multifractals are not simply the sum of fractals and cannot be divided to return back to their original components, because the way they weave is fractal in nature. The result is that in order to see a structure similar to the original, different portions of a multifractal need to expand at different rates. A multifractal is therefore non-linear in nature."
And from this paper[2]:
"Self-organized criticality (SOC) purports to build multi-scaled structures out of local interactions. Evidence of scaling in various domains of biology may be more generally understood to reflect multiplicative interactions weaving together many disparate scales. The self-similarity of power-law scaling entails homogeneity: fluctuations distribute themselves similarly across many spatial and temporal scales. However, this apparent homogeneity can be misleading, especially as it spans more scales. Reducing biological processes to one power-law relationship neglects rich cascade dynamics. We review recent research into multifractality in executive-function cognitive tasks and propose that scaling reflects not criticality but instead interactions across multiple scales and among fluctuations of multiple sizes."
Some of the references I've provided show that cognitive functioning, and natural phenomena generally, operate via multifractal cascades, not linear, repeated, monofractal similarities. The latter are an imposition created by formal mathematics under the guise of ideal Platonic forms and/or ideal Aristotelian rules and categories. As much is admitted in this MHC paper (pp. 113-15) [3]. Even advanced maths [4] operate via such cascades and not via formal necessary and sufficient conditions that fit into tidy, reiterated sets. It seems to me that any model of complexity should be based on how dynamic systems actually operate rather than on trying to fit them into a formal ideal.
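The monofractal/multifractal distinction can be made concrete with the classic binomial multiplicative cascade. A sketch (the split ratio p and depth are arbitrary illustrative choices): a measure built by repeatedly splitting mass unevenly has a mass exponent τ(q) that is nonlinear in q, while the evenly split (homogeneous) case gives exactly the straight line τ(q) = q − 1, i.e., a single scaling relation.

```python
import math

def cascade(p=0.7, depth=10):
    """Binomial multiplicative cascade: each cell splits its mass p / (1-p)."""
    weights = [1.0]
    for _ in range(depth):
        weights = [w * f for w in weights for f in (p, 1.0 - p)]
    return weights  # 2**depth cells, each holding a fraction of the total mass

def tau(q, weights):
    """Mass exponent tau(q) from the partition function sum(mu_i ** q)."""
    eps = 1.0 / len(weights)           # cell size at this depth
    Z = sum(w ** q for w in weights)   # partition function
    return math.log(Z) / math.log(eps)

mf = cascade(p=0.7)    # uneven splitting: multifractal
mono = cascade(p=0.5)  # even splitting: monofractal

for q in (0.5, 2.0, 4.0):
    print(f"q={q}: tau_multi={tau(q, mf):.3f}  tau_mono={tau(q, mono):.3f}")

# Monofractal: tau(q) = q - 1 exactly (one scaling exponent fits all moments).
assert abs(tau(2.0, mono) - 1.0) < 1e-9
# Multifractal: tau(q) bends away from that line, so no single exponent works.
assert tau(2.0, mf) < 0.999
```

A single power-law exponent summarizes the monofractal case completely; the multifractal case needs a whole spectrum of exponents, which is the point of the cascade literature quoted above.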
The crux of the issue [5]:
"Hierarchical organization is a cornerstone of complexity, and multifractality constitutes its central quantifying concept. For model uniform cascades the corresponding singularity spectra are symmetrical, while those extracted from empirical data are often asymmetric."
[1] https://www.sciencedaily.com/rel.../2016/01/160121110913.htm
[2] https://www.frontiersin.org/.../10.../fphys.2012.00102/full
[3] https://www.dareassociation.org/.../GWOF_A_330277...
[4] https://en.wikipedia.org/wiki/Multifractal_system
[5] https://arxiv.org/pdf/1503.02405.pdf
I was also reminded of this Mascolo quote:
"Although it is possible to identify particular tasks and activities that operate within particular domains of thinking, feeling, or acting in everyday life, most tasks involve an integration of multiple task domains. […] Higher-order skills emerge from the constructive differentiation and inter-coordination of skill elements from diverse task domains. […] Viewed in this way, it becomes clear that development takes place in a multidirectional web of pathways (Fischer and Bidell, 2006) rather than a unidirectional ladder. […] Developing skills do not move in a ﬁxed order of steps in a single direction, but they develop in multiple directions along multiple strands that weave in and out of each other in ontogenesis, the developmental history of the person (or other organism)" (336-37).
Also now an Integral World article here.
This morning I was re-reading a paper I co-authored with Michel Bauwens, which reminded me of the above discussion. From footnote 2 of that article:
Bryant (2011b) discusses how Bhaskar sees the difference between the transcendent and transcendental. The former assumes a metaphysical foundation for knowledge as described above. Transcendental deduction bypasses such a framing by speculating on what virtual preconditions must be supposed for knowledge to be possible. The virtual by this definition is multiple and immanent without any need of a transcendent, metaphysical underpinning. Bryant (2008) explores this in depth in another book about Deleuze.
Nobuhara (1998) asserts that for Hartshorne relative (r) terms are the basis of absolute (a) terms, noting: "As the concrete includes and exceeds the abstract." The ever-changing relative domain includes within itself the abstract absolute. He defines the absolute as supremely relative, or surrelative.
Another way of approaching the asymmetrical relationship between the relative and the absolute is through basic categories and image schema as elucidated by Lakoff (1999). Recall that these prototypes are in the middle of classical categorical hierarchies, between the most general and the most particular. Basic categories are the most concrete way we have of relating to and operating within the environment. Thus both the more particular and more general categories are more abstract. And yet our usual way of thinking is that the more particular the category the more concrete or relative the object it represents is and vice versa.
Which is indeed related to the absolute being asymmetrically dependent on the relative, if by relative we mean those concrete image schema which are the basis of more abstract derivations. It's easy to confuse them because our 'common sense' associates the more concrete objects of the world with the most particular objects on our constructed hierarchies; the same for the most abstract and ephemeral of thoughts, which do not seem physical or material. And yet these hierarchies are not constructed that way, instead being from the middle up and down via image schema and basic categories.
Such things are unconscious and not readily apparent. So of course we can 'reason' from both the bottom-up and top-down in such hierarchies if we associate the relative with the most particular and the absolute with the most general or abstract. But we do so from the most concrete of image schema, the actual relative, while the top and bottom of the usual, classical hierarchy are the most abstract.
This chart/image helps with some of the ideas above. E.g., #16 noting multiple scales and levels interacting, reminding me of Mascolo's quote above. Although #15 claims that "complex systems are often nested hierarchies." Which of course depends on what we mean by nested hierarchies. If by that it means fractals with scale-invariance, then that per above is rare, not often. Also note that the chart is similar to Edwards' chart of the categories of lenses.
From this piece, which feeds my thesis:
"[...] a paradox known as the continuum hypothesis. Gödel showed that the statement cannot be proved either true or false using standard mathematical language. [...It] efficiently boils down to a question in the theory of sets [...Cantor] was not able to prove this continuum hypothesis, and nor were many mathematicians and logicians who followed him."
"Gödel [...] showed that the continuum hypothesis cannot be proved either true or false starting from the standard axioms — the statements taken to be true — of the theory of sets, which are commonly taken as the foundation for all of mathematics. Gödel and Cohen’s work on the continuum hypothesis implies that there can exist parallel mathematical universes that are both compatible with standard mathematics — one in which the continuum hypothesis is added to the standard axioms and therefore declared to be true, and another in which it is declared false."
Nonlinearity in Living Systems is the title of a recently published e-book by Frontiers in Applied Mathematics and Statistics. Here's an excerpt from the introductory editorial.
"The biological basis of physiological signals is incredibly complex. While many researches certainly appreciate molecular, cellular and systems approaches to unravel overall biological complexity, in the recent decades the interest for mathematical and computational characterization of structural and functional basis underlying biological phenomena gain wide popularity among scientists.[...] We witnessed wide range applications of nonlinear quantitative analysis that produced measures such as fractal dimension, power law scaling, Hurst exponent, Lyapunov exponent, approximate entropy, sample entropy, Lempel–Ziv complexity as well as other metric. [...] Also there is another more theoretical challenge of contemporary nonlinear signal measurements, especially including fractal-based methods. The question of choosing the right method and its possible adjustment in order for the results of the analysis to be as accurate as possible is the persistent problem.[...] We seek to bring together the recent practical and theoretical advances in the development and application of nonlinear methods or narrower fractal-based methods for characterizing the complex physiological systems at multiple levels of organization. [...] A comprehensive understanding of advantages and disadvantages of each method, especially between its mathematical assumptions and real-world applicability, can help to find out what is at stake regarding the above aims and to direct us toward more fruitful application of nonlinear measures and statistics in physiology and biology in general."
Excerpts from this article in the ebook, "Measures and metrics of biological signals."
"With the growing complexity of the applied mathematical concepts, we are approaching some serious issues of foundations of Mathematics. Before that, let us mention that the symbol ∞ does not represent infinity uniquely since Cantor's discoveries in 1873, when he showed that arithmetical and geometric infinity, i.e., natural numbers and real line are different infinite quantities. As a consequence, infinity has been scaled in terms of pairwise different cardinal numbers. However, the size of this scale is enormous; it cannot be coded by any set. This was the creation of Set theory, and the beginning of the studies of foundations of Mathematics, which is probably never ending."
"We learned that Mathematical theories, packed around their axioms can be at the same level of logical certainty, while obviously impossible mixed together since with colliding axioms.[...] Let us just say that AC (Axiom of Choice) is very much needed in the foundations of Mathematics, but there are alternatives. [...] Some of the functions close to the above-examined fractals are complex enough to open the fundamental issue. [...] On the other hand, we can stay on the flat Earth and deal only with short approximation of the phenomena, avoiding entering the zone of the complex Mathematics and its fundamental issues. Yet, as proved by Goedel, we cannot escape the hot issues even remaining only in Arithmetic, nor in any theory containing its copy (like Geometry)."
Excerpts from The Number Sense by Dehaene, pp. 242-45, are below. I'd say that whether or not one believes in Platonic math, both it and formal math are abstract, with a priori axioms divorced from concrete reality. The intuitionist or constructivist math he describes below, while accepting our innate categories of thought, is not the same as the image schemas and basic categories of cognitive linguistics (which in fact are not referenced). But it grounds a similar idea in the relation of math to our embodiment.
"Twentieth-century mathematicians have been profoundly divided over this fundamental issue concerning the nature of mathematical objects. For some, traditionally labeled 'Platonists,' mathematical reality exists in an abstract plane, and its objects are as real as those of everyday life. [...] For an epistemologist, a neurobiologist, or a neuropsychologist, the Platonist position seems hard to defend—as unacceptable, in fact, as Cartesian dualism is as a scientific theory of the brain."
"A second category of mathematicians, the 'formalists,' view the issue of the existence of mathematical objects as meaningless and void. For them, mathematics is only a game in which one manipulates symbols according to precise formal rules. Mathematical objects such as numbers have no relation to reality: They are defined merely as a set of symbols that satisfy certain axioms. [...] Though the formalist position may account for the recent evolution of pure mathematics, it does not provide an adequate explanation of its origins."
"A third category of mathematicians is thus that of the 'intuitionists' or 'constructivists,' who believe that mathematical objects are nothing but constructions of the human mind. In their view, mathematics does not exist in the outside world, but only in the brain of the mathematician who invents it. [...] Among the available theories on the nature of mathematics, intuitionism seems to me to provide the best account of the relations between arithmetic and the human brain. The discoveries of the last few years in the psychology of arithmetic have brought new arguments to support the intuitionist view. [...] These empirical results tend to confirm Poincare's postulate that number belongs to the 'natural objects of thought,' the innate categories according to which we apprehend the world. [...] Intuition about numbers is thus anchored deep in our brain. Number appears as one of the fundamental dimensions according to which our nervous system parses the external world."
Power laws are a burr in my butt today. From this article:
"Many self-similar systems are scale invariant only in discrete steps. A blood vessel tends to branch into two smaller vessels, a fluid vortex into two or three smaller vortices, and the Sierpinski triangle is self-similar only by powers of two. These systems preserve relative proportions upon rescaling from one step to the next, but not upon arbitrary rescaling. This property is termed discrete-scale invariance or discrete renormalizability. It is a weaker condition than the continuous scale invariance underlying the Pareto distribution. Whereas strict scale invariance implies a power law and vice versa, discrete-scale invariance allows log-periodic modulations in the frequencies of observations that deviate from a pure power law such as Eq.(1). Such modulations are indeed observed in bronchial tube diameter, vortex enstrophy, and financial asset prices."
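Discrete scale invariance is straightforward to demonstrate numerically: a log-periodically modulated power law preserves proportions only under rescaling by its preferred factor λ. A sketch (the exponent, modulation amplitude, and λ are arbitrary illustrative values):

```python
import math

ALPHA, EPS, LAM = 1.5, 0.2, 2.0   # exponent, modulation amplitude, preferred scale

def f(x):
    """Power law with a log-periodic modulation of period ln(LAM)."""
    return x ** -ALPHA * (1.0 + EPS * math.cos(2 * math.pi * math.log(x)
                                               / math.log(LAM)))

xs = (0.3, 1.0, 4.7, 52.0)
# Rescaling by LAM preserves proportions exactly (discrete scale invariance)...
for x in xs:
    assert abs(f(LAM * x) / f(x) - LAM ** -ALPHA) < 1e-9
# ...but rescaling by a generic factor does not: the ratio depends on x.
ratios = [f(3.0 * x) / f(x) for x in xs]
assert max(ratios) - min(ratios) > 1e-3
print("f(LAM*x)/f(x) is constant; f(3*x)/f(x) varies:",
      [round(r, 4) for r in ratios])
```

So a system can look "fractal" under a preferred rescaling factor while failing the continuous scale invariance that a pure power law requires, which is exactly the weaker condition the quote describes.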
From this article:
"A common graph mining task is community detection, which seeks an unsupervised decomposition of a network into groups based on statistical regularities in network connectivity. Although many such algorithms exist, community detection’s No Free Lunch theorem implies that no algorithm can be optimal across all inputs. [...] We find that (i) algorithms vary widely in the number and composition of communities they find, given the same input; (ii) algorithms can be clustered into distinct high-level groups based on similarities of their outputs on real-world networks; (iii) algorithmic differences induce wide variation in accuracy on link-based learning tasks; and, (iv) no algorithm is always the best at such tasks across all inputs."