

  • Thoughts on Systems and Processes

    Some time ago, I heard something that I thought was absolutely brilliant and gave me some serious food for thought. The thinking was something along these lines: a process designed for decision making should not be more complex than the technical system for which such decisions are to be made. Somehow, this resonated deeply, but I knew it was not easy to explain. I was already conscious of the Problem of Induction , therefore I did not attempt to generalise it. I was convinced, however, that it was still interesting to try to understand the concept better, even if "just" in the context in which it was mentioned. Back then, I hinted that a good starting point, perhaps, would have been the conceptual difference between a process and a system . Little did I know how challenging this path would be...

    One important remark before moving forward. It might be tempting to call upon Ashby's Law of Requisite Variety , but we would risk forgetting that, in terms of applications, Control Theory and Cybernetics are likely valid only in Engineering and Mechanics, as the field of Anthro-complexity is showing (i.e., humans are not machines).

    There is another concept I feel like bringing into this brain dump at this point ( ref ): many people in Systems Thinking say systems are defined by boundaries. In Complexity Science, not all systems have boundaries, and this means that the Second Law of Thermodynamics does not apply. This did not come as a surprise to me after all, even though I admit that after hearing it, I thought I needed to review pretty much everything I have been writing in my brain dumps. Not that I have a problem with changing things if they were not correctly expressed or if, given new evidence, they don't make sense in my mind anymore. On the other hand, doing this would have required a new brain dump, and here we are. In " System Dynamics and the Strive for Harmonious Balance ", I referred to the Second Law of Thermodynamics expressed in terms of spontaneous processes, knowing possibly very little about its meaning. This brain dump is probably an attempt to "codify" what I know at this point, conscious that there can very well be fundamental gaps in my knowledge, and that filling these does not necessarily require new concepts.

    What follows is a summary of some concepts I already expressed in " Impact of feedback loops on cognitive overload ", extended for the purpose of this post:

    | System's Nature | Characteristics | Processes | Thermodynamic Entropy (1) | Information Entropy (2) |
    | --- | --- | --- | --- | --- |
    | Closed | It can exchange energy but not mass across its boundaries with the environment | Spontaneous | Always increases | Differential Entropy? Increases? |
    | Isolated | It can exchange neither energy nor mass across its boundaries with the environment | Perfectly reversible | Constant | Differential Entropy? Constant? |
    | Open | It can exchange both energy and mass across its boundaries with the environment (3) | Non-linear, path dependency, irreversible (hysteresis), retro-causality | Gibbs Entropy? | Differential Entropy? |

    Notes

    (1) Thermodynamic entropy is a direct description of the physical energy or heat in the system.
    (2) Information entropy describes the system's average level of information, uncertainty or novelty.
    (3) What is meant by "open systems do not have boundaries"? Is the boundary absent, in which case one could claim that there are really no separate systems, or is it rather, as described here, that mass and energy can be exchanged across the boundaries?
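    Since "Differential Entropy?" recurs so often in the table above, it may help to note its standard definition, the continuous analogue of Shannon's entropy (a reference note of mine, not taken from the sources cited here):

    ```latex
    % Differential entropy of a continuous random variable with density f:
    h(X) = -\int f(x) \ln f(x) \, dx
    % Unlike the discrete case, h(X) can be negative; for a Gaussian,
    % for instance, h = \tfrac{1}{2} \ln(2\pi e \sigma^2).
    ```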
    Concerning entropies, I have a couple of considerations lying around in my notes that I think are worth sharing:

    - "In classical statistical mechanics, in fact, the statistical entropy is formally identical to the Shannon entropy" ( ref )
    - "Entropy in information theory is directly analogous to the entropy in statistical thermodynamics. The analogy results when the values of the random variable designate energies of microstates, so Gibbs' formula for the entropy is formally identical to Shannon's formula" ( ref )
    - Postulate: "The entropy as defined by the Gibbs entropy formula matches with the entropy as defined in classical thermodynamics" ( ref )

    Therefore: classical thermodynamics entropy = (postulate) Gibbs entropy = (if the values of the random variable designate the energies of micro-states) Shannon's entropy (see also ref ).

    It is also worth keeping in mind something I already wrote in " The natural flow of energy ": Constructor Theory plays a more fundamental role than Shannon's theory of Information because it aims to answer the following question: "Which properties of the physical laws allow for a physical system to contain information?" Or, in different words, what is the set of those properties that allow for the existence of physical objects that contain information?

    Complementing the table above, what follows is another step in the mapping effort of what I currently think I know:

    | System's Behaviour or Evolution | Characteristics | Processes | Energy | Thermodynamic Entropy | Information Entropy |
    | --- | --- | --- | --- | --- | --- |
    | Static / Stable (3) | Unchanging or changing in an uninteresting / predictable way | Completely deterministic, known a priori, causality fully known, control feedback loops, "A stable System is one where the process is in statistical control" ( ref ) | Energy is spent solely in maintaining a state of equilibrium / status quo. The Principle of Least Action can help find the stationary equilibrium points. What about the others (i.e., unstable points of equilibrium, kept stable by energy injection into the system)? | Depends on the System's Nature | 0 (no novelty, all is known / predicted) |
    | Growing / Improving | Development, Kaizen | Dynamic, adaptive, positive feedback loops | Energy gradients (Estuarine mapping?), micro-nudges | Depends on the System's Nature | Differential Entropy? |
    | Degrading | No coherent heterogeneity, reduced innovative capacity | Dynamic, negative feedback loops | Energy not spent, or spent solely on the fast (i.e., little available time) functional equalisation of the system | Depends on the System's Nature | Differential Entropy? |
    | Sustainable | Capable of enduring for a specified, arbitrarily long time (1) | Scale invariant (?), non-ergodic (sustainable over which period?) (2) | Energy gradients (Estuarine mapping?), micro-nudges, evolution as an energy minimisation process | Depends on the System's Nature | Differential Entropy? |

    Notes

    (1) Sustainability does not necessarily provide information on the trajectory: i.e., a static system could be seen as sustainable, a growing system too (if not upper bounded), and a degrading system as well (if not lower bounded).
    (2) What is the definition of "Sustainable" exactly? One way I see it is that it is "that which can be kept in existence".
    (3) It is interesting to notice that Plato considered a non-degenerating system one that is in an 'arrested state' (see Karl Popper, The Open Society and Its Enemies , chapter 3, page 18), which I would categorise as static here.
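    To make the "formal identity" mentioned in the entropy considerations above concrete, here are the two formulas side by side, in their standard textbook forms (my own recap, not taken verbatim from the references):

    ```latex
    % Shannon entropy (Information Theory), in bits:
    H = -\sum_i p_i \log_2 p_i
    % Gibbs entropy (Statistical Mechanics), k_B being the Boltzmann constant:
    S = -k_B \sum_i p_i \ln p_i
    % When the p_i designate the probabilities of energy microstates,
    % S = (k_B \ln 2) \cdot H : the same functional form, up to a constant
    % factor and the choice of logarithm base.
    ```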
    Returning to note (3) of the second table: I did not consider mapping in my taxonomy an effort that is in an "arrested" state, since that would be of little interest and likely practically impossible. It is curious that Plato did not see this as being in contrast with " everything is in flux ", if Heraclitus' principles influenced him (see " Theory of Forms or Ideas "). It is already challenging enough to get the first table right (I know, at the time of writing, it might not be correct), let alone the second one as well.

    The sketches below somehow help me abstract these concepts. I am not sure what the charts explain as of now: the evolution of the system's state over time? When looking at all these charts, however, it seems I have to concede two points I already know:

    - In complex systems, there is constant novelty / emergence and, therefore, predicting the future state is not possible.
    - Because of the above, it is better to monitor the current state and intervene only if necessary.

    I purposely left the sketches for the Sustainable behaviour out, for reasons related to the notes of the second table.

    One final remark. In this brain dump, I am not merely trying to categorise what I know or adopt a reductionist approach to explanations. I am trying to represent and describe a concept (or multiple concepts?) for which I am seemingly missing the correct "language" but for which I am certain there is a proper and, perhaps, more formal and fundamental explanation.

    Picture created by the author using GenAI

  • A more mathematical approach?

    This is an interesting Brain Dump. One that required quite some time to produce and, eventually, turned out to indicate a different direction than the one initially envisioned. To be completely honest, this unexpected outcome carried some relief, because the direction initially hypothesized was a bit frightening.

    I want to start by making a reference to " Science and complexity ", originally published by Warren Weaver in 1948. The concepts described here on Brain Dumps certainly did not come about after I read that paper. The paper, however, and other related material, constituted a very important piece of the puzzle, since it helped put things into context and perspective; it provided a " home " and a " language " for most of the thoughts presented in this blog. I think I got to that paper after hearing a lot about Complexity Science, but I cannot recall the exact circumstances that led me to read it.

    I had the first thoughts in the direction of this Brain Dump on January 2, 2024. All was already pretty much centered around the concept of Entropy, since I opened my notes by writing: "there is more information in complexity because the entropy is higher; beware of the distinction here between organised and unorganised complexity as stated in the paper from Dr. Warren Weaver, Science and Complexity".

    My notes started with some assumptions that I can now either confirm or correct. These assumptions stemmed from my lack of knowledge at that time, not from a general lack of knowledge on the topics. Other people, way smarter than I am, already clarified these points long ago; for me, it has been purely a matter of research:

    - Human systems are open (see also Confusion and Velocity )
    - For a spontaneous process, ΔS_univ > 0 ( ref )
    - "The entropy of an isolated system during a process always increases" ( ref )

    Personal note: with those refined assumptions, I should probably revisit most of the past Brain Dumps, especially " Thoughts on Systems and Processes ". On the other hand, the idea behind Brain Dumps is to move forward with new knowledge; the traversed path, including its errors, remains traceable through references, and the corrections, due to the better understanding that has emerged, will either converge into newer Brain Dumps or eventually put a "halt" on older Brain Dumps.

    With the above assumptions, it seems I wanted to work on the following points (referred to later as working points):

    1. Find the necessary conditions to keep the entropy constant (i.e., find a mechanism to "control" the amount of disorder)
    2. Map the changes in the system's characteristics when entropy (and therefore novelty in information) increases
    3. Determine the amount of energy needed to maintain the system in a harmonious-improving state

    Driven by the above and, I admit, with the help of AI chatbots, I embarked on a research effort, making baby steps in an attempt to understand how to move forward. From time to time, I collected the things I found that I thought aligned with the main idea of this Brain Dump. The first step I captured was trying to clarify the conceptual link between Entropy as seen in Information Theory (introduced by Claude Shannon in 1948) and in Thermodynamics (introduced by Rudolf Clausius in 1865).
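    As a toy illustration of working points 1 and 2 above, here is a minimal sketch of what "measuring disorder" could mean in practice; tracking the Shannon entropy of observed event frequencies is my own assumption for this example, not a method taken from any of the sources cited here:

    ```python
    import math
    from collections import Counter

    def shannon_entropy(events):
        """Shannon entropy (in bits) of the empirical distribution of events."""
        counts = Counter(events)
        total = sum(counts.values())
        return sum((c / total) * math.log2(total / c) for c in counts.values())

    # A system emitting only one kind of event carries no novelty: 0 bits.
    print(shannon_entropy(["ok"] * 8))                     # 0.0
    # A more varied stream of events has higher entropy, i.e. more novelty.
    print(shannon_entropy(["ok", "ok", "retry", "fail"]))  # 1.5
    ```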
    At this stage, the following points were barely known to me, but they apparently provided the first hints about the link:

    - The statistical definition of Entropy, as proposed by Ludwig Boltzmann in 1877 and later by Josiah Willard Gibbs (in 1902?), where macroscopic properties of a (thermodynamic) system are described in terms of large ensembles of microscopic states.
    - Landauer's principle, first proposed by Rolf Landauer in 1961, which provided a direct relationship between Information Theory and Thermodynamics by stating that erasing one bit of information requires a minimum amount of energy that is proportional to the operational temperature of the system.

    It did not take long to realise that there is much confusion around the concept of Entropy, not only in my Brain. On the other hand, however, there seems to be some alignment around the "essence" or "object" indicated by this quantity. In two papers from 1957, E. T. Jaynes argued that the entropy in statistical mechanics is conceptually similar to the information entropy ( ref ). Following the same path, I stepped into a (yet again) different field, namely that of Quantum Computation and Quantum Information Theory. In a 1995 paper on Quantum Coding by Benjamin Schumacher ( ref ), I found that "in classical statistical mechanics, in fact, the statistical entropy is formally identical to the Shannon entropy. This has led to a considerable effort to give statistical mechanics an information-theoretic foundation". This provided further evidence of the conceptual link but seemed to suggest that such a link was "conceptually" living in the Quantum realm.

    Quantum Theory is a field I have barely been involved with. This is not a problem in itself if there are necessary conditions to move in this direction. For the time being, I only have a couple of supporting points in this regard, both from the physicist David Deutsch:

    - The first one is that "by its inherent mathematics, quantum theory applies to things on all scales" ( ref )
    - The second one is about Bryce DeWitt's proof of the totalitarian property of quantum theory ( ref ).

    Things took a slightly different direction when I read about the Substrate-Independence Theory . With particular focus on working point 3 above and the considerations from Landauer's principle that "information is physical" and that there exists a "coupling between the concept of information entropy to energy expenditure", the Substrate-Independence Theory provided an even more recent (and possibly more appropriate) insight, in that "redundancy and mutual information highlight a couple of examples showing why interactions (exchanges of relevant information) are a better measure of energy in social systems than measuring 'energy' in computing power and kilowatt usage".

    This new direction still carried some elements of "Information" but somehow required abandoning concepts that were familiar to me. Information Theory and, to some extent, Systems Thinking are fields I have been involved with longer. Telecommunications Engineering, the background I come from, builds entirely on Shannon's work; Systems Thinking is not something I directly studied, but I found out later that it is a field that heavily influenced me. These disciplines, Information Theory and Systems Thinking, share a liminal region with Complexity Science, with the latter seemingly originating mostly from Physics and the 3-body problem ( ref ).
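    Back to Landauer's principle for a moment: to put a rough number on it, the minimum energy to erase one bit is k_B · T · ln 2. At an assumed room temperature of T = 300 K (my choice for the example, since the principle only says "proportional to the operational temperature"), this gives:

    ```latex
    E_{\min} = k_B T \ln 2
             \approx (1.38 \times 10^{-23}\,\mathrm{J/K}) \times (300\,\mathrm{K}) \times 0.693
             \approx 2.9 \times 10^{-21}\,\mathrm{J}\ \text{per erased bit}
    ```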
    The Substrate-Independence Theory made a reference to something I already touched on in " Thoughts on Systems and Processes " and " The Natural Flow of Energy ", namely Constructor Theory. This theory, developed by David Deutsch and Chiara Marletto, "draws together ideas from diverse areas, including thermodynamics, statistical mechanics, information theory, and quantum computation" ( ref ). To some extent, I also understood that Constructor Theory brings Epistemology into Physics.

    When thinking about Brain Dumps, there seems to be a particular idea I am always very eager to attack, and that is the idea that there exists a way of achieving a perfectly "efficient" human system, whether at the individual or the team / organisational level. As of August 2025, I still sense that this idea is based on a "mechanical" view of human systems; trying to find out where this thought came from has turned out to be very fertile ground for a process of continuous research. I am sure some other people share this perspective, at least in part. For example, Dr. Abby Innes, although from an economic perspective, shares that machine models of the economy "operate with closed system reasoning about the possibility of a completely efficient system" ( ref ).

    After these observations, however, I am wondering whether a more mathematical approach is worth pursuing. I still sense, after all, that there exists a different and more "natural" language to abstract and codify what Brain Dumps is about. Mathematics is essential for Natural Science, something Complexity Science builds heavily on. There does not seem to be space for Formal Science here because "in Rudolf Carnap's logical-positivist conception of the epistemology of science, theories belonging to formal sciences are understood to contain no synthetic statements" ( ref ), and I am inclined to think that the meaning of all Brain Dumps is related to the world.

    What I am left with at this point, reaching the conclusion of a Brain Dump that definitely stayed in my brain for far too long, is a list of topics and nomenclature for references. Trying to consider Information as a fundamental force ( ref ) and remembering that we know information is physical and can have causal power, in which case it is called knowledge ( ref ):

    - Human systems are open systems
    - Dynamic persistence, not so much of the constituents as of the functions
    - Kolmogorov complexity, although not everything is computational in nature
    - Functional information as a metric or measure

    Any attempt to move forward is probably better done in a Constructor Theoretic way, since there seems to be some conceptual space to explore things in a rigorous way while leaving any "entropic" consideration aside ( ref ).

    Picture created by the author using GenAI

  • Modern philosophy perspective on the evolution of thought

    Yesterday I was out having dinner with some friends. Every time I am with people whom I trust, I always take the opportunity to share the good and the bad things that happen in my life. Somehow, sharing the bad things or, better, the things that left scars in my life (aka the bigger lessons), causes people to either judge me or give me recommendations. I always expected that opening up would generate empathy and a mutual willingness to open up on the other side, but this is not the case: I have such naive expectations.

    The "life quote" I received yesterday was " homo faber fortunae suae ", wrongly attributed by one of my friends to Marcus Aurelius, the Roman emperor ( Wikipedia attributes it to Appius Claudius Caecus). Leaving aside that this implies that you, and your choices, are the cause of your problems, it made me realize how dangerous it might be to blindly refer today to old philosophical beliefs, when philosophy, just like any other aspect of our lives, has evolved.

    I am not a philosopher - although I feel this text could fall into a "modern philosophy" bucket - nor can I cite people who researched and published in this field. The only thing that sparked in my mind was that such a "life quote", in the way it was conveyed, would completely disregard any knowledge advancement that has been made, just as one example, in systems thinking . Unless you are fully isolated from other people - in which case, the environment would still influence you - how can you ignore that your decisions are yours alone but are heavily affected by the context you live in? I am on a path of inner search, but I just wanted to leave this brain dump here.

    Picture created by the author using GenAI

  • System dynamics and the strive for harmonious balance

    A closed system left to itself will not maintain the status quo. This has been something hard for me to understand. It's hard to accept that even when a "harmonious" state is reached, it cannot last without any regulatory mechanisms. Well, except for static systems, which are too simple for the context where I'm applying this thinking. When it comes to relationships, of any sort, static systems are probably neither practical nor desirable, as this would result in energy spent just to maintain a status quo that leads nowhere, if not to boredom and lack of purpose. On the other hand, not investing any energy would cause the system dynamics to degrade into a chaotic state; this is something also described by entropy.

    In physics, the total entropy of an isolated system - simply put, how much disorder there is - either increases or remains constant in any spontaneous process; it never decreases. A different way of expressing the second law of thermodynamics is that the entropy of an isolated system tends towards a maximum.

    If we accept the above, there has to be some degree of energy spent by all parts of a system, even if only to maintain the status quo. Ideally, however, it would be great if we could all agree that a Kaizen mindset (i.e. continuous improvement) is beneficial, as this would set the system on a path of increased quality. I want to keep this general for the moment, leaving aside discussions on what quality means and how it is measured.

    For the sake of this discussion, if we reduce the system in the context of relationships to a system comprised of two persons, we could argue that the amount of energy should be equally spent by each person. This is an interesting idea, but if we accept that in such a context a static system is not desirable, it's easy to see that this 50/50 rule is, in theory, a valid option but is, in my opinion, an enormous oversimplification. Based on my experience, this threshold is dynamic, and rightfully so. There are several reasons why this is the case. I have my idea, and this is something I'm also working on myself, as to what a good regulatory mechanism that maintains the system in an improving-harmonious state should be. This topic, however, requires a new post and, even then, I'm sure it is probably not going to be enough.

    Picture created by the author using GenAI

  • Relevance of system decomposition overhead

    I have often thought about the " divide et impera " (Latin for "divide and rule") strategy. Originally, I think this concept was used to describe how ancient empires ruled, and I remember it being associated with forms of government during the Roman Empire. When I try to generalize this concept, or at least try to look at it under a systemic lens, it is interesting for me to notice that this way of thinking seems, more often than not, still very much present in today's general way of thinking; and I understand all the reasons why this is probably so. One note: I read the " Divide and rule " Wikipedia page and the connotation given there is anything but positive, since it is associated with forms of authoritarianism.

    Although it might seem obvious that complex systems cannot be managed efficiently and effectively in a centralized way, it is important to invest some energy in understanding how the system is being broken down into different parts in the first place. I would love to see scientific research - if you know of any, feel free to drop a comment - that indicates how much division and depth, given a certain context, is "good enough" to decompose a system coherently and efficiently, and what kind of methodology can help find such a threshold. In my understanding, if this "dimension" (i.e. the complexity of the system or process that implements the solution) is not addressed properly, the risk of over-structuring or under-structuring the system is such that, in the best scenario, the solution cannot scale up or down when needed and, in the worst scenario, the system becomes so inefficient that it no longer serves the purpose for which it was conceived in the first place, that is, implementing the desired solution.

    What I think is being neglected, while setting up systems and processes for a solution, is that dividing a system into sub-systems carries an intrinsic overhead, given by the dependencies between all the parts. I would even push myself to claim that the cost of this overhead, which I think has at its core a communication cost, is probably the main factor contributing to the overall system's effectiveness. I guess what I am after is how much of this sort of System Decomposition Overhead (SDO) is acceptable, given a specific context, for such a system to be as good as possible (a naive way to picture this overhead is sketched at the end of this post).

    Another concept that I think is relevant is the adaptability of the system itself, not necessarily for scaling purposes, but even "just" to allow iterative re-assessments of the solution. Even if we assume that the context does not change, as time goes by what evolves is the understanding of the problem and of the solution itself for which the system was put in place. In my view, this is enough of a reason to justify the existence of an adaptive process meant solely to control and improve the system.

    I don't know if I'm going anywhere with this post. It feels that generalizing the process - or what I think I am trying to generalize - would require tapping into disciplines in which I have no expertise. On the other hand, perhaps, this level of generalization ends up exactly in a discipline that is so abstract that it almost feels like a "cult" (philosophy? First-principles thinking?), at least until a more formal description is available. I guess I just felt I needed to do such a brain dump.

    Picture created by the author using GenAI
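    A naive way to picture the System Decomposition Overhead mentioned above: if every pair of sub-systems can, in the worst case, depend on each other, the number of potential communication channels grows quadratically with the number of parts. This is the classic n(n-1)/2 argument, used here purely as an illustration, not as a formal definition of SDO:

    ```python
    def potential_channels(parts: int) -> int:
        """Worst-case number of pairwise dependencies between sub-systems."""
        return parts * (parts - 1) // 2

    # Doubling the decomposition roughly quadruples the coordination overhead.
    for n in (4, 8, 16):
        print(f"{n:>2} parts -> {potential_channels(n):>3} potential channels")
    ```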

  • The importance of flow

    I was in Munich a few weeks ago to attend a running event. The event was not just a run: many obstacles, of different types, were placed along the path and participants had to overcome them. I loved the spirit of the day which was, at least to me and by the suggestions of the organizers, a non-competitive one. I still think competition is important in some contexts, but this event aimed at making participants reach the finish line as a "team". For some obstacles, it was nice to see how people from different groups would help each other out; the only way to get past such obstacles.

    After the run, during a conversation with a friend, the topic of "flow" briefly came up and I was surprised to see that he was not familiar with the concept. To me, flow is pretty much everywhere I go and I always try my best to achieve it. This period of my life has been pretty rough, given the pandemic and all, and it has hindered my ability to reach a state of flow. I am sure, however, that this mindset will come back soon. Anyway, when I asked my friend whether he had ever experienced a state of flow, he was not sure about it. Most certainly, "flow" has different meanings to different people.

    The first time I experienced such a state, and was aware of it, was during swimming. I would usually go swimming two or three times a week and most of the time, after about half an hour, I would enter this state where body and mind were perfectly synchronized. I could feel the motion of my body through the water, the breathing synchronized with the movement, and my mind in a peaceful and blissful state. In this state, I was completely aware of what was going on around me, but somehow my mind was locked in the present and not distracted by any big or small thought that would come up. I am sure others experience such a state in different situations - meditation comes to mind - but to me, a state of flow always involves some kind of dynamism and motion.

    In my professional experience, I have also found several references to the importance of flow. The Kanban methodology for managing work, as one example, seems to be a way of improving flow in a system. When thinking in terms of systems, as another example, I was fascinated when I found out how Little's Law can help model a flow-related concept that, at least to me, seemed pretty counterintuitive at first (a small worked example follows this post).

    If there is one thing I have learned about reaching a state of flow, it is that it is damn hard to get to that point. It takes a big personal effort to get there and, even more so, to maintain it. On the other hand, I can see the benefits of such a state. Considering how difficult it is to reach such a state individually, it is not hard for me to see why reaching a state of flow in more complex systems is close to impossible. I also perhaps start to grasp why, in complex systems, any kind of modeling, if necessary, can only follow, and not anticipate, a more effective strategy based on defining values and principles first.

    Picture created by the author using GenAI
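    A small worked example for the Little's Law mention above. The law states that, in a stable system, the average number of items in the system (L) equals their average arrival rate (λ) multiplied by the average time each item spends in the system (W). The numbers below are made up purely for illustration:

    ```python
    # Little's Law: L = lambda * W (long-run averages, stable system).
    arrival_rate = 5.0      # items entering the system per day (made-up value)
    time_in_system = 4.0    # average days an item spends in the system (made-up)

    work_in_progress = arrival_rate * time_in_system
    print(work_in_progress)  # 20.0 -> items, on average, in the system
    ```

    Rearranged as W = L / λ, the law also shows what may be the counterintuitive part: with throughput fixed, the only way to shorten the time items spend in the system is to reduce the work in progress.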

  • Processing of sensory information

    A few days ago I had this question popping up in my head: is it possible that awareness of external situations (i.e. stimuli) happens faster than awareness of internal ones?

    A bit of context, without details, because I don't want to make this too personal. I often struggle to react with anger as a healthy means to establish boundaries. I am not talking about rage. Rather, I am talking about a healthy amount of anger that should keep something or somebody that is pushing too much out of the boundaries that I need to establish for my well-being: this is my choice and a basic need. The reason I am giving myself for not being able to do so is that my awareness of another person's fear or sadness comes before the awareness I have of my own feelings. As naive as this might sound, this mechanism makes my empathy kick in before I realize that I don't have to necessarily let everything affect me. In hindsight, I think not allowing everything to touch us deeply can be healthy. In such situations, however, I often find myself apologizing for having opened up instead of simply saying - internal dialog here - "No, I'm not taking this".

    Another way to describe this could be the following. Often, I struggle to perceive my own emotions in response to something. Put differently, the awareness of my own emotions comes at a later moment, compared to that of someone else's emotions. Is it possible, therefore, that there is some sort of "information" that travels faster than communication? I am not thinking here about verbal and non-verbal communication, that's easy. Any kind of message our brain processes still comes from sensory information. Even traditional non-verbal communication (for example body posture or facial expressions) is processed by our eyes. A little digression: I never thought of it this way, but during face-to-face communication, since light travels faster than sound, there is a (very small) delay between the information our eyes and our ears have to process.

    I don't know how to frame this - well, this is a sentence I have used already a couple of times in other posts. I guess what I am wondering is something in the direction of: is it possible that when two human beings interact and are deeply connected, they can exchange some sort of information without using any conventional form of communication? Even assuming no processing of sensory information is possible? Has anyone ever talked about "emotional waves"? The above sounds a lot like sci-fi, but I find it interesting nonetheless to explore. While I was researching the topic, I found the following article from the Queen Mary University of London that seems like a fun experiment: " Scientists propose new way to detect emotions using wireless signals ".

    Edit. Coming back to this post after some time, I read the question again: is it possible that awareness of external situations (i.e. stimuli) happens faster than awareness of internal ones? and more thoughts hit me. If awareness is to some extent driven by perception (i.e. sensory information), I cannot help but notice that external situations trigger, ideally, all our senses. In contrast, internal situations leave, at least, visual cues out. Also, on the question: is it possible, therefore, that there is some sort of "information" that travels faster than communication? Quantum mechanics is sometimes said to make a case for information traveling faster than light when particles are entangled.
    The following article could be interesting: " The Relationship Between Quantum Mechanics and Human Consciousness ". In this context, however, it seems relevant to remember the No-Communication Theorem : entanglement cannot actually be used to transmit information. I am not entirely sure whether this is something that is going to be superseded in future reviews, but it seems to me that the reasoning made in this post about sensory perception is in line with (some parts of?) a Cartesian philosophical framework, although I am relatively certain that its body-mind dualism does not fit here.

    Picture created by the author using GenAI

  • Epistemology and uncertainty in physical properties

    I was on LinkedIn today and saw this post from Cassie. When I read 'epistemology' I got excited! There is this big hype now about AI and related tools, and once in a while I also play a bit with them. The following is an excerpt of a longer "conversation" with ChatGPT (GPT-3). Unfortunately, after my last message, the tool was not able to give me an answer, so after a couple of attempts I gave up. Have fun reading the conversation!

    Brain: If we do accept that there is inherent uncertainty in physical properties, we should accept that there isn't an objective truth, as there is always uncertainty also on data, if we base our judgment on data.

    GPT-3: The presence of uncertainty in physical properties and the limitations of our ability to measure them do not necessarily imply that there is no objective truth. While it is true that there may be uncertainty in the data we collect, this does not necessarily mean that there is no objective reality or that all knowledge is inherently uncertain. In fact, the scientific method is specifically designed to account for and minimize uncertainty in our measurements and observations. Scientists strive to minimize the sources of error and uncertainty in their experiments and observations, using rigorous methods and statistical analyses to quantify the uncertainty in their results. Moreover, even if there is some uncertainty in our measurements, it does not necessarily mean that there is no objective reality. It is possible to make objective statements about the likelihood of certain outcomes or the distribution of values, even if we cannot precisely determine the exact value of a physical property. It is also worth noting that the presence of uncertainty does not necessarily undermine the value or usefulness of knowledge. While it is true that we may never be able to achieve absolute certainty, we can still make reliable predictions and decisions based on the best available evidence and the most rigorous methods of inquiry. Ultimately, the goal of science and other forms of knowledge-seeking is not to achieve absolute certainty, but rather to build a reliable and useful body of knowledge that can inform our understanding of the world and guide our actions.

    Brain: If we do accept that perception is part of knowledge then it means that also self-awareness is relevant in the process of pursuing knowledge. Self-awareness is, however, very personal, not every individual reaches the same level of self-awareness. If one has then different awareness from others about an event, wouldn't this be an argument against the existence of an objective reality?

    GPT-3: Perception is indeed a crucial aspect of knowledge acquisition, and self-awareness can play an important role in shaping our perception of reality. However, the fact that individuals may have different levels of self-awareness or subjective experiences does not necessarily mean that objective reality does not exist. Objective reality refers to the existence of facts and events that are independent of our perception or interpretation of them. While our subjective experiences and awareness may influence how we perceive and interpret reality, the underlying facts and events still exist regardless of our perception of them. For example, two people may have different subjective experiences of the same event, but the objective reality of the event (such as the time, location, and physical properties of the event) remains the same. Similarly, scientific experiments can be designed to minimize subjective bias and provide objective evidence for certain phenomena. In conclusion, while self-awareness can influence our perception of reality and therefore our knowledge acquisition, it does not necessarily imply that objective reality does not exist. Objective reality can still exist independently of our subjective experiences and awareness of it.

    Brain: This is true as long as we are speaking of objective reality in terms of something that can be described with one or more physical properties. Reality, however, must have a meaning for someone, a meaning for people. If that's not the case the discussion would be pointless, as neither you nor I would be having this conversation. If reality has to hold a meaning for someone, it means it's based on the interaction. Since different people have different levels of self-awareness they will perceive an experience in different ways. If we accept that certain experiences cannot be described fully (yet) in terms of physical properties (think of feelings, emotions) we should also accept that there is no objective truth, at least as long as we are not able to describe all the physical properties involved in that experience. But even in that case, knowledge is a process, and one property of processes is that of emergence; there will always be something that emerges, new knowledge. Hence, knowledge is never complete. This implies that one can only pursue truth and that, possibly, there isn't a single truth, if more people interact with it at a given point in time.

    Picture created by the author using GenAI

  • Perception, awareness and consciousness

    A few days ago I was having lunch with some colleagues and the conversation took a fascinating turn. One person in particular was sharing some thoughts, mostly a one-way communication with the group. At times, only one of the other people was engaging. The conversation was already quite deep; people were sharing some personal experiences. I decided to step into the conversation by asking "What makes an individual unique? What makes an individual THAT particular individual?" I knew this was not an easy one, but I'm always curious to pick the collective "brain" of the group on such things. Although we, as human beings, share some experiences in our lives (the human condition?), I think it is important to understand what makes an individual unique. I threw the question into the group not expecting anything and I was surprised to see that no one entirely grasped the depth of such a question.

    The most compelling answer I received was that "choices" define an individual. I certainly agree to some extent. The choices we make, even the smallest everyday choices, define our lives and what our experiences will be; we actually "create" our lives through the choices we make. Still, I think that the choices we make are a consequence of what we experience or, better, perceive. The context in which we live (external stimuli in this case) greatly influences our choices. This kind of context, however, is an external factor, and we get influenced by it only to the level that we can perceive. See it this way: something might happen around me, but if I don't perceive it at all, and I am oblivious to it, would this influence any decision I take? I doubt it.

    Our perception dictates what we "allow in" from the surrounding context we are immersed in. This sensory data constitutes information that drives our thoughts - thoughts are generated and emerge - and this helps us make decisions. Some people might need more information from senses and perception (rational process?) before making a decision; others might make decisions with less sensory information (emotional process?). I am not sure there is such a hard dichotomy between the two processes and, to some extent, I think we all balance the two approaches when it comes to decision-making. Yet, I think the common denominator here is "senses and perception". Perception is tightly coupled with awareness - here we are again, back to awareness, why am I not surprised? - and awareness is a necessary condition for consciousness. As I'm writing this, I realize the last paragraph alone could be expanded even further; for the moment I just think that 'consciousness' is what makes an individual unique.

    Edit: for some reason that I cannot fully explain, this brain dump has been particularly significant in my journey with this website. That is why, in April 2024, I decided to ask an artist to paint on canvas the cover picture of this post, so that I could hang it on a wall - not everything needs to remain digital only. At the end of May 2025, the work was completed; a photograph of the result is attached here and, along with it, I am leaving this thought: what defines creativity? Is it a closed process, with a beginning and an end? If so, who owns the input? Or is it rather, as any part of an experience, a property of the interactions?

    Photograph of a canvas (acrylic on wood) painted by a human using as input the AI-generated image

    Picture created by the author using GenAI

  • Confusion and velocity

    I think it is a common misconception that chaos can only be a negative state. In my experience, most innovative ideas are born in such a state. Frequently, chaos is also associated with "creativity". What I think is relevant is how long one persists in a chaotic state. I assume there is always an intention - or at least a "natural" willingness - to navigate back to a less chaotic state. To some extent, I think this can also be described by entropy, at which I very quickly hinted in the post " System dynamics and the strive for harmonious balance ". The entropy - or state of disorder - can remain constant only in an isolated system, where neither energy nor mass is exchanged across its boundaries (" human systems are open "). This concept alone could be further expanded, but I will leave it here for the moment.

    Part of this discussion can certainly be framed within the Cynefin framework. In (one of) its latest versions, the framework explicitly differentiates a state of "confusion & aporia". If the persistence in a state of chaos is to be carefully evaluated, this is even more the case in a state of confusion. Although there might not be a framework (at least that I am aware of) for decision-making in a state of confusion, in my experience "brain dumping" is a valuable tool. This is true regardless of the situation. However, some considerations need to be made:

    - It is beneficial to already have a process in place for retrieving "brain dumps" if they were made in the past and need to be re-accessed or re-evaluated;
    - It is beneficial to decide at least on the "reach" of the "brain dump"; to be noted that the sooner it is shared, the earlier feedback loops are potentially triggered, and this can affect the speed of a change process;
    - If a complex system is already in a state of confusion, adding "brain dumps" might not drive the system out of this state unless enough "high" awareness is present in (at least some?) parts of the system, so that there is a clear understanding as to the boundaries of "brain dumps" that can be further made.

    On a closing note. I am Italian, and even trying to come up with a title for this brain dump has been an interesting experience, since it seems that in Italian there are no distinct words for speed (the scalar) and velocity (the vector). I think this is, yet another, example of why at some point language is not beneficial anymore (too much time would be invested in debating the language itself) and diagrams or images are better and necessary: a vector is represented differently from a scalar because it carries both magnitude and direction, not magnitude alone. Everyone is certainly aware of the saying " A picture is worth a thousand words ".

    Picture created by the author using GenAI

  • The natural flow of energy

    As I approach this 10th brain dump, I am slowly realising that this blog, and the brain dumps I drop in here, are becoming more like a diary; a diary of a journey with, at present, only a vague feeling of direction. To my surprise, however, since the first "publicly" shared brain dump in April 2022, this journey is still fun and is getting more and more interesting.

    Despite what a good friend of mine often says - and that is that " in life, we have to force it! " - I think that, just as happens for water and electrical current, more energy flows where there is less resistance. There is still value in "forcing it", and some energy certainly goes this way, but I sense that what prevails in the end is the natural flow. Since I am convinced that more energy flows this way, I also genuinely believe that the outcome and the impact of one's effort are orders of magnitude greater when "gently" abandoning oneself to this approach. I say "gently" lightly, but deep down I know that a lot of personal work and experience is needed to go peacefully this way. As a very good person once told me: " If you enjoy the climb, you might even get to realize that there is no peak ".

    This brain dump comes after I noticed that, in my posts, I often refer to the concept of 'entropy' when trying to describe certain experiences. To some extent, I admit that this does not fully surprise me; firstly because of my background in Telecommunications Engineering and, secondly, because I often try to look at things from a communication perspective, along with a systemic view. What I think emerged in some of my posts is a nuanced intention to describe certain experiences by making analogies with the concept of 'entropy', tapping into both thermodynamics and information theory. I understand, however, that the concept of 'entropy' in these two fields is related to completely different things. In the first case, it is a direct description of the physical energy or heat of a system, whereas in the second case, it describes the average level of information.

    By googling around a bit, and getting some support from a friendly ChatGPT, I also understood that there seems to be a conceptual link between the two fields. To some extent, this link is given by Statistical Mechanics. As a very high-level connection, I read that for Information Theory : "Entropy in information theory is directly analogous to the entropy in statistical thermodynamics. The analogy results when the values of the random variable designate energies of microstates, so Gibbs' formula for the entropy is formally identical to Shannon's formula." Similarly, in Statistical Mechanics , the postulate: "The entropy as defined by the Gibbs entropy formula matches with the entropy as defined in classical thermodynamics."

    This is the point where it becomes interesting for me, since it seems I am directed to Constructor Theory from two angles:

    - Trying to find a conceptual link for 'entropy' in different fields and learning that Constructor Theory taps into diverse areas, precisely: thermodynamics, statistical mechanics, information theory, and quantum computation;
    - Notions from several areas I am also very interested in - Agile, Complexity Science, Systems Thinking - and in particular when reading this article (which I am still in the process of reading as I write this brain dump).

    Since I started this brain dump with the concept of 'entropy', I am closing it with a note on a video I watched, where Dr. Chiara Marletto - one of the proponents of Constructor Theory at the University of Oxford - answers a question from the audience exactly about what the new theory has to say on Information Theory as described by Shannon. According to Dr. Marletto, Constructor Theory plays a more fundamental role than Shannon's theory of Information because it aims to answer the following question: " Which properties of the physical laws allow for a physical system to contain information? " This seems to be of fundamental importance because some physicists believe that the set of such properties - those that allow for the existence of physical objects that contain information - constitutes a much more robust set of properties than those given by the currently known dynamic laws of motion (note: plus initial conditions).

    As I reach the end of this brain dump, I can only note that all I wrote here seems to "drive" my thinking in a specific direction. It is unclear to me, however, whether this now means that the concepts I have expressed here need a more formal description. Perhaps one that is not necessarily expressed in mathematical terms, but would still need some sort of algebra or symbols? I cannot imagine at the moment that a natural language (English in this case) is the best to describe the concepts I often refer to. Time will tell.

    Picture created by the author using GenAI

  • The Search for Identity

    I have been pondering this brain dump for so long now that I might as well try to enable some flow by dumping what I have, whatever that is; I might come back to this in the future, perhaps. Also, in some way, this is the first brain dump where I try to expand on direct feedback I received a long time ago. This feedback has been sitting in my head for so long now that it is not clear to me whether the focus of this post is driven by that feedback or by the underlying concepts that I always wanted to write about.

    In the post " Perception, awareness and consciousness " I wrote that it was my understanding that 'consciousness' is what makes an individual unique. In that post, I also wrote that awareness is a necessary condition for consciousness. I am not entirely sure of that anymore, and the reason is "dissociation", the detachment from physical reality. I am not thinking here in terms of pathological dissociation, potentially triggered by PTSD. Many things in nature and life are on a continuum, so I am taking a more subtle stance on this, to "experiment" with some reasoning. To some extent, many of us might experience mild dissociation in daily experiences. One example: I am driving to work and, because I am very familiar with the route, I am in "auto-pilot" mode: my mind starts to wander and for a brief moment I get detached from what is happening around me, only to jump back in when necessary, and hopefully not too late. Am I aware? Probably not entirely, since it takes a fraction of a second to return to reality. I would claim, however, that I am conscious, and this would invalidate awareness as a necessary condition for consciousness.

    What happens during sleep? Eyes are closed, muscles are relaxed and brain activity is greatly altered. Awareness, which is tightly coupled with perception (i.e. sensory information), is reduced. A person can still be awakened by external stimuli, like a loud sound, although this takes longer than in the previous example (mild dissociation when driving). When sleep progresses into deeper phases, consciousness is practically suspended. Does this mean that a sleeping person stops being that particular individual? Most certainly not. It seems, therefore, that I need to revisit also my claim that 'consciousness' is what makes an individual unique.

    Since I am not a native English speaker, it is good for me to go back sometimes to the definitions of words. According to Google, which builds on top of Oxford Languages:

    Awareness - Knowledge or perception of a situation or fact.

    and

    Consciousness - The state of being aware of and responsive to one's surroundings.

    Just as an experiment, if I had to play around with the definitions, I would write them as:

    Awareness - Perception of physical reality.

    and

    Consciousness - The state of being aware of, and interacting with, physical reality.

    I am aware (pun intended) that I made quite some major changes to those definitions. Firstly, I left knowledge out of the definition of awareness because I think that, in some way, it is already covered. Empiricism claims that all knowledge comes from sensory information (i.e. perception). Rationalism claims that knowledge comes from thinking. In the same post I referenced at the beginning of this brain dump, I wrote that thoughts usually emerge. Can thoughts emerge without perception (i.e. senses)? To some extent, probably yes, but only if some memory is already present. And how was that memory created in the first place?
This is a question that can easily take another brain dump by itself. Secondly, in the definition of consciousness, I changed " being responsive " to " interacting ". Although I am not a native English speaker, to me the concept of "responsiveness" intrinsically implies somewhat of a "passive" behavior and I don't think that a conscious human being can ever be passive to physical reality. One always interacts with what we define as physical reality and, even if one isolates oneself, there is always some interaction with the environment: deciding not to communicate at all conveys a message as well. At this point, I might as well integrate the feedback I received some time ago, and that is that "consciousness could be seen as the rationalization of awareness". This is a very interesting idea and, if I had to take a reductionist approach, considering what I wrote in this last paragraph, I would say that: Consciousness is a state that emerges from perception. As I am looking for a way to close this brain dump, I cannot help but go back to the point that was made in the post “ Perception, awareness and consciousness ”: choices are what defines an individual. While I agree that this is true, I still think there is something more fundamental that defines an individual. For the moment, my experience brings me to think that "needs" and "desires" are deep values that define an individual (identity). I don't know what will follow from this post but I will close with a quote from the Welsh-born poet George Herbert – “ To him that will, ways are not wanting ”, commonly known in the form of the modern saying " Where there is a will there is a way ". Edit. Adding to this post: "One of the few places that you can really have a bit of a handle on what consciousness is, is what turns it off. And when you are thinking about general anesthetic it's very specific, these things turn consciousness off." [Roger Penrose, ref ]. Picture created by the author using GenAI
