Before getting to the juice of this brain dump, I feel like I need to summarise, and probably revisit, a couple of thoughts I shared here and there in other posts. Also, I feel compelled to share a consideration that is probably worth keeping as a reference until the end of this brain dump, and potentially longer, until proven otherwise.
In the post “Confusion and velocity” there is a reference to the fact that “human systems are open”. For the sake of my brain dumps, I tend to consider systems as follows:
Open: they can exchange both energy and mass across their boundaries;
Closed: they can exchange energy but not mass across their boundaries;
Isolated: they do not exchange either energy or mass across their boundaries.
In that post, I hinted that “entropy - or state of disorder - remains stable only in a closed (isolated?) system”. I think I can clear that question mark now: my understanding is that entropy can remain constant only in a theoretical scenario, where neither energy nor mass is exchanged (i.e. an isolated system) and the processes occurring inside that system are perfectly reversible.
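To pin that down with the textbook formulation (this is standard thermodynamics, not something from the earlier posts), the Clausius inequality makes the point compactly:

```latex
% Clausius inequality, for any infinitesimal process:
dS \ge \frac{\delta Q}{T}
% In an isolated system no heat crosses the boundary, so \delta Q = 0 and
dS \ge 0
% with equality (dS = 0, constant entropy) holding only when every
% process inside the system is reversible.
```

In other words, a constant-entropy system is doubly idealised: it must be isolated and all its internal processes must be reversible.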
“Human systems are open”, but I would argue that they can sometimes exhibit the behaviour of a closed system, at least from an entropic point of view: for as long as that specific system exists, its total entropy always increases. I would also argue that, although only theoretically, human systems can sometimes be isolated, again from an entropic point of view: for as long as that system exists, its entropy remains constant. As I hinted in the post “System dynamics and the strive for harmonious balance”, such a system would probably only lead to boredom and lack of purpose, since its energy is used solely to maintain the status quo. While writing this, I realised that there is much more to it, so I might come back to this concept in the future.
The consideration to be shared instead is the following. There seems to be some overlap between Complexity Theory and Systems Theory because Systems Theory approaches, Cybernetics being one example, are sometimes applied to open systems [1]. However, this might work only if the level of complexity in the system is relatively low.
After this long premise, the main part of this brain dump is the following. I became aware of a dynamic that gives me some hints about why sometimes my brain goes “too fast” and gets overwhelmed (cognitive overload) to the point that I need to dump thoughts. The dynamic is the following:
I am preparing mentally to enter a state where I can simply “enjoy the moment” for a bit and relax;
Right before entering this state, I am asked something that makes me think about what needs to come next;
My brain has to re-activate some cognitive workload to think of an answer. Most of the energy is probably spent on transitioning back into thinking mode.
If this dynamic is triggered too often and too fast, I quickly need to put in place some inhibitory mechanism and, even when I successfully manage to do so, all of this usually physically hurts afterward.
Triggered by this “epiphany”, I started to think about feedback loops and about what cybernetics and system dynamics say on the “length” (time) of a feedback loop, and on the impact on the system (or the environment, for that matter) when that length is too short. This also touches on the frequency of the brain dumps on this blog as opposed to other platforms, where “brain dumps” might happen more frequently. In the post “Confusion and velocity” I hinted briefly at the “speed” (frequency) of the brain dumps and how it might be looked at together with their “reach”, since both these dimensions affect the “length” (time) and “intensity” (energy) of the feedback.
Focusing only on the time of the feedback loops (alternatively, the frequency of the brain dumps), the following general considerations are relevant:
Short feedback loops (i.e. faster responses) might be beneficial in rapidly changing environments, but loops that are too short can lead to instability if adequate time is not left for the system to adjust;
Shorter loops can also help create a regulatory mechanism, essential for maintaining a desired state. The effectiveness of such mechanisms, however, also depends on the accuracy and relevance of the feedback;
In terms of learning, and perhaps relevant for Complex Adaptive Systems (CAS), shorter feedback loops can enable faster learning and adaptation but only if the system can process, and act on, the received information without becoming overwhelmed;
Longer loops might mean more “stages” inside the loop: this means higher complexity and therefore higher risks;
Longer loops can help make the system more stable but at the cost of its responsiveness.
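The instability point can be made concrete with a toy simulation (my own illustration, not a model of anything in the referenced posts; the gain and delay values are arbitrary). A regulator repeatedly corrects an error toward zero, but its feedback is a few ticks stale: when the loop fires faster than fresh information arrives, each correction is based on an outdated reading and the corrections compound into a growing oscillation.

```python
# Toy sketch: a feedback loop whose corrections are based on a delayed
# measurement.  With no delay the error decays; with the same gain but a
# stale reading, the loop over-corrects and the error oscillates and grows.

def run_loop(gain: float, delay: int, steps: int = 60) -> list[float]:
    """Simulate x[t+1] = x[t] - gain * x[t - delay], starting from x = 1."""
    history = [1.0] * (delay + 1)  # pad so the first delayed read is valid
    for _ in range(steps):
        stale_reading = history[-1 - delay]  # feedback is `delay` ticks old
        history.append(history[-1] - gain * stale_reading)
    return history

fresh = run_loop(gain=0.5, delay=0)  # feedback reflects the current state
stale = run_loop(gain=0.5, delay=3)  # same gain, but the reading is 3 ticks old

print(f"no delay : |x| after 60 steps = {abs(fresh[-1]):.2e}")
print(f"delay = 3: |x| after 60 steps = {abs(stale[-1]):.2e}")
```

The gain is identical in both runs; only the age of the information changes. That matches the bullet points above: it is not speed by itself that destabilises, but reacting faster than the system can supply (and absorb) accurate feedback.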
In the context of brain dumps, finding a good "length" (time) for a feedback loop involves balancing the response time with the capacity to process and react to new information. This trade-off is greatly influenced by the environment but a key “metric” to be considered is that “the system should not become overwhelmed”. As hinted again in the post “Confusion and velocity”, a necessary condition for this is to have relatively high awareness in the system. I think this is an extremely important consideration:
The higher the awareness, the greater the balance.
I have spent enough energy on this brain dump and I know it could be expanded further; I hope, nonetheless, that something good comes out of it.
References
[1] John Turner, Nigel Thurlow, Brian Rivera. “The Flow System: The Evolution of Agile and Lean Thinking in an Age of Complexity”. 2020.
Picture created by the author using GenAI