A more mathematical approach?
- Brain Dumps

- Nov 8
- 6 min read
This is an interesting Brain Dump, one that required quite some time to produce and that, eventually, turned out to point in a different direction than the one initially envisioned. To be completely honest, this unexpected outcome carried some relief, because the direction initially hypothesized was a bit frightening.
I want to start by making a reference to "Science and Complexity", originally published by Warren Weaver in 1948.
The concepts described here on Brain Dumps certainly did not come about after I read that paper. The paper, however, and other related material, constituted a very important piece of the puzzle, since it helped put things into context and perspective; it provided a "home" and a "language" for most of the thoughts presented in this blog. I think I got to that paper after hearing a lot about Complexity Science, but I cannot recall the exact circumstances that led me to read it.
I had the first thoughts in the direction of this Brain Dump on January 2, 2024. Everything was already pretty much centered around the concept of Entropy, since I opened my notes by writing: "there is more information in complexity because the entropy is higher; beware of the distinction here between organised and unorganised complexity as stated in the paper from Dr. Warren Weaver, Science and Complexity".
My notes started with some assumptions that I can now either confirm or correct. These assumptions stemmed from my lack of knowledge at that time, not from a general lack of knowledge on the topics. Other people, way smarter than I am, already clarified these points long ago; for me, it has been purely a matter of research.
- Human systems are open (see also Confusion and Velocity)
- For a spontaneous process, ΔS_univ > 0 (ref)
- "The entropy of an isolated system during a process always increases" (ref)
Personal note: with those refined assumptions, I should probably revisit most of the past Brain Dumps, especially "Thoughts on Systems and Processes". On the other hand, the idea behind Brain Dumps is to move forward with new knowledge; the traversed path, including its errors, remains traceable by references, and the corrections, due to the better understanding that has emerged, will either converge into newer Brain Dumps or eventually put a "halt" on older Brain Dumps.
With the above assumptions, it seems I wanted to work on the following points (referred to later as working points):
1. Find the necessary conditions to keep the entropy constant (i.e., find a mechanism to “control” the amount of disorder)
2. Map the changes in the system's characteristics when entropy (and therefore novelty in information) increases
3. Determine the amount of energy needed to maintain the system in a harmonious-improving state
Driven by the above and, I admit, with the help of AI chatbots, I embarked on a research effort, taking baby steps in an attempt to understand how to move forward. From time to time, I collected the findings that I thought aligned with the main idea of this Brain Dump.
The first step I captured was trying to clarify the conceptual link between Entropy as seen in Information Theory (introduced by Claude Shannon in 1948) and in Thermodynamics (introduced by Rudolf Clausius in 1865). At this stage, the following points were barely known to me, but apparently provided the first hints about the link (the standard forms of the quantities involved are sketched right after these two points):
The statistical definition of Entropy, as proposed by Ludwig Boltzmann in 1877 and later by Josiah Willard Gibbs (in 1902), in which the macroscopic properties of a (thermodynamic) system are described in terms of large ensembles of microscopic states.
Landauer's principle, first proposed by Rolf Landauer in 1961, which provided a direct relationship between Information Theory and Thermodynamics by stating that erasing one bit of information requires a minimum amount of energy that is proportional to the operational temperature of the system.
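For future reference, these are the standard textbook forms of the quantities mentioned in the two points above (my own summary, not quoted from the original papers):

```latex
% Boltzmann (1877): entropy of a macrostate with W equally probable microstates
S = k_B \ln W

% Gibbs (1902): generalisation to an arbitrary distribution p_i over microstates
S = -k_B \sum_i p_i \ln p_i

% Shannon (1948): information entropy of a source with symbol probabilities p_i
H = -\sum_i p_i \log_2 p_i

% Landauer (1961): minimum energy dissipated when one bit is erased,
% at operating temperature T
E_{\min} = k_B T \ln 2
```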
It did not take long to realise that there is much confusion around the concept of Entropy, not only in my Brain. On the other hand, however, there seems to be some alignment around the "essence" or "object" indicated by this quantity. In two papers from 1957, E. T. Jaynes argued that the entropy in statistical mechanics is conceptually similar to the information entropy (ref). Following the same path, I stepped into a (yet again) different field, namely that of Quantum Computation and Quantum Information Theory. In a 1995 paper on Quantum Coding by Benjamin Schumacher (ref), I found that "in classical statistical mechanics, in fact, the statistical entropy is formally identical to the Shannon entropy. This has led to a considerable effort to give statistical mechanics an information-theoretic foundation". This provided further evidence of the conceptual link, but seemed to suggest that such a link "conceptually" lives in the Quantum realm.
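To make "formally identical" concrete, here is a tiny numerical sketch (my own toy illustration, not taken from Schumacher's paper): the Gibbs entropy of a distribution is just the Shannon entropy rescaled by k_B ln 2, and Landauer's bound converts a bit count into joules at a given temperature.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K


def shannon_entropy(probabilities):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)


def gibbs_entropy(probabilities):
    """Gibbs entropy in J/K of the same distribution, read as microstate probabilities."""
    return -K_B * sum(p * math.log(p) for p in probabilities if p > 0)


def landauer_limit(temperature_kelvin, bits=1):
    """Minimum energy in joules to erase `bits` of information at temperature T."""
    return bits * K_B * temperature_kelvin * math.log(2)


p = [0.5, 0.25, 0.25]        # an arbitrary toy distribution
h = shannon_entropy(p)        # 1.5 bits
s = gibbs_entropy(p)          # the same quantity, rescaled by k_B ln 2
assert math.isclose(s, K_B * math.log(2) * h)

print(f"Shannon entropy: {h} bits")
print(f"Gibbs entropy:   {s:.3e} J/K")
print(f"Landauer limit at 300 K: {landauer_limit(300):.3e} J per bit")  # ~2.87e-21 J
```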
Quantum Theory is a field I have barely been involved with. This is not a problem in itself, if the necessary conditions to move in this direction exist. For the time being, I only have a couple of supporting points in this regard, both from physicist David Deutsch:
Things took a slightly different direction when I read about the Substrate-Independence Theory. With particular focus on working point 3 above and the considerations from Landauer's principle that "information is physical" and that there exists a "coupling between the concept of information entropy to energy expenditure", the Substrate-Independence Theory provided an even more recent (and possibly more appropriate) insight, in that "redundancy and mutual information highlight a couple of examples showing why interactions (exchanges of relevant information) are a better measure of energy in social systems than measuring 'energy' in computing power and kilowatt usage". This new direction still carried some elements of "Information" but required abandoning concepts that had become familiar to me.
Information Theory and, to some extent, Systems Thinking are fields I have been involved with for longer. Telecommunications Engineering, the background I come from, builds entirely on Shannon's work; Systems Thinking is not something I directly studied, but I later found out it is a field that heavily influenced me. These disciplines, Information Theory and Systems Thinking, share a liminal region with Complexity Science, with the latter seemingly originating mostly from Physics and the three-body problem (ref).
The Substrate-Independence Theory made a reference to something I already touched on in "Thoughts on Systems and Processes" and "The Natural Flow of Energy", namely Constructor Theory. This theory, developed by David Deutsch and Chiara Marletto, "draws together ideas from diverse areas, including thermodynamics, statistical mechanics, information theory, and quantum computation" (ref). To some extent, I also understood that Constructor Theory brings Epistemology into Physics.
When thinking about Brain Dumps, there seems to be a particular idea I am always very eager to attack: the idea that there exists a way of achieving a perfectly "efficient" human system, whether at the individual or the team / organisational level. As of August 2025, I had always sensed that this idea was based on a "mechanical" view of human systems; trying to find out where this thought came from turned out to be very fertile ground for a process of continuous research. I am sure some other people share this perspective, at least in part. For example, Dr. Abby Innes, although from an economic perspective, notes that machine models of the economy "operate with closed system reasoning about the possibility of a completely efficient system" (ref).
After these observations, however, I am wondering whether a more mathematical approach is worth pursuing. I still sense, after all, that there exists a different and more "natural" language to abstract and codify what Brain Dumps is about. Mathematics is essential for Natural Science, something Complexity Science builds heavily on. There does not seem to be space for Formal Science here because “in Rudolf Carnap's logical-positivist conception of the epistemology of science, theories belonging to formal sciences are understood to contain no synthetic statements“ (ref), and I am inclined to think that the meaning of all Brain Dumps is related to the world.
What I am left with at this point, reaching the conclusion of a Brain Dump that definitely stayed in my brain for far too long, is a list of topics and nomenclature for references, trying to consider Information as a fundamental force (ref) and remembering that information is physical and can have causal power, in which case it is called knowledge (ref):
- Human systems are open systems
- Dynamic persistence, not so much of the constituents as of the functions
- Kolmogorov complexity, although not everything is computational in nature (a toy illustration follows this list)
- Functional information as a metric or measure
Any attempt to move forward is probably better done in a Constructor Theoretic way, since there seems to be some conceptual space to explore things in a rigorous way while leaving any "entropic" consideration aside (ref).
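As a closing illustration of the Kolmogorov complexity point (again, my own toy example, not something from the references): the quantity itself is uncomputable, but the length of a compressed representation gives a rough upper bound, and even that crude proxy separates regular from random-looking strings.

```python
import random
import string
import zlib


def compressed_length(data: str) -> int:
    """Length in bytes of the zlib-compressed string: a crude upper bound
    on its Kolmogorov complexity (the exact quantity is uncomputable)."""
    return len(zlib.compress(data.encode("utf-8"), level=9))


ordered = "ab" * 500  # 1000 characters of pure repetition: low complexity
random.seed(0)
disordered = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))

print(compressed_length(ordered))      # small: the regularity is fully captured
print(compressed_length(disordered))   # much larger: little structure to exploit
```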





