As I approach this 10th brain dump, I am slowly realising that this blog, and the brain dumps I drop in here, are becoming more like a diary; a diary of a journey with, at present, only a vague sense of direction. To my surprise, however, since the first "publicly" shared brain dump in April 2022, this journey is still fun and is getting more and more interesting.
Despite what a good friend of mine often says, namely that "in life, we have to force it!", I think that, just as with water and electrical current, more energy flows where there is less resistance. There is still value in "forcing it", and some energy certainly goes this way, but I sense that what prevails in the end is the natural flow. Since I am convinced that more energy flows this way, I also genuinely believe that the outcome and the impact of one's effort are orders of magnitude greater when one "gently" abandons oneself to this approach. I say "gently" lightly, but deep down I know that a lot of personal work and experience is needed to go peacefully this way. As a very good person once told me: "If you enjoy the climb, you might even get to realise that there is no peak".
This brain dump comes after I noticed that, in my posts, I often refer to the concept of 'entropy' when trying to describe certain experiences. To some extent, I admit that this does not fully surprise me: firstly because of my background in Telecommunications Engineering and, secondly, because I often try to look at things from a communication perspective, along with a systemic view.
What I think emerged in some of my posts is a nuanced intention to describe certain experiences by making analogies with the concept of 'entropy', tapping into both thermodynamics and information theory. I understand, however, that the concept of 'entropy' in these two fields refers to quite different things. In the first case, it relates to the dispersal of energy, or heat, within a physical system; in the second case, it measures the average amount of information (or uncertainty) produced by a source of messages. By googling around a bit, and getting some support from a friendly ChatGPT, I also understood that there seems to be a conceptual link between the two fields. To some extent, this link is provided by Statistical Mechanics.
As a very high-level connection, I read the following about Information Theory:
“Entropy in information theory is directly analogous to the entropy in statistical thermodynamics. The analogy results when the values of the random variable designate energies of microstates, so Gibbs formula for the entropy is formally identical to Shannon's formula.”
Similarly, in Statistical Mechanics, there is the postulate that:
“The entropy as defined by Gibbs entropy formula matches with the entropy as defined in classical thermodynamics.”
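For concreteness, here is the formal identity these quotes point to, in my own summary rather than anyone's quote. The Gibbs entropy of Statistical Mechanics and the Shannon entropy of Information Theory are, up to a constant and the base of the logarithm, the same formula:

\[ S = -k_B \sum_i p_i \ln p_i \quad \text{(Gibbs)} \qquad\qquad H = -\sum_i p_i \log_2 p_i \quad \text{(Shannon)} \]

where the \(p_i\) are the probabilities of the microstates of a physical system (for Gibbs) or of the symbols emitted by a source (for Shannon), and \(k_B\) is the Boltzmann constant. Reading each \(p_i\) as the probability of a microstate with a given energy is exactly the move that makes the two expressions formally identical.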
This is where it becomes interesting for me, since I seem to be directed towards Constructor Theory from two angles:
Trying to find a conceptual link for 'entropy' across different fields and learning that Constructor Theory taps into those very areas, namely thermodynamics, statistical mechanics, information theory, and quantum computation;
Encountering notions from several areas I am also very interested in (Agile, Complexity Science, Systems Thinking), in particular while reading this article (which I am still in the process of reading as I write this brain dump).
Since I started this brain dump from the concept of 'entropy', I am closing it with a note on a video I watched, in which Dr. Chiara Marletto, one of the proponents of Constructor Theory at the University of Oxford, answers an audience question about what the new theory has to say on Information Theory as described by Shannon. According to Dr. Marletto, Constructor Theory plays a more fundamental role than Shannon's theory of Information because it aims to answer the following question: "Which properties of the physical laws allow for a physical system to contain information?" This seems to be of fundamental importance because some physicists believe that the set of such properties, those that allow for the existence of physical objects containing information, constitutes a much more robust set than the one given by the currently known dynamical laws of motion (note: plus initial conditions).
As I reach the end of this brain dump, I can only note that all I wrote here seems to "drive" my thinking in a specific direction. It is unclear to me, however, whether this means that the concepts I have expressed here need a more formal description. Perhaps one that is not necessarily expressed in mathematical terms, but that would still need some sort of algebra or symbols? I cannot imagine at the moment that a natural language (English in this case) is the best way to describe the concepts I often refer to. Time will tell.
Picture created by the author using GenAI