"Information" in Big History

  • Monday, August 22, 2016 3:45 PM
    Message # 4204470

    It seems that many, if not most, in the IBHA believe that the permeating thread through Big History is the evolution of ever more complex structures and processes – and I concur. Eric Chaisson, in his book “Cosmic Evolution,” defines complexity in one of two operational ways: 1. “as a measure of the information needed to describe a system’s structure and function,” or 2. “as a measure of the rate of energy flowing through a system of given mass.” (p. 13) After nicely describing the apparent relationship between information and the 2nd law of thermodynamics (entropy), Dr. Chaisson later decides to discard information as a quantitative way to measure complexity. To quote: “The conceptual idea of information has been useful, qualitatively and heuristically, as an aid to appreciate the growth of order and structure in the Universe, but this term is too vague and subjective to use in quantifying (emphasis added) a specific, empirical metric describing a whole range of real-world systems.” (p. 132) He further declines to treat information as a third basic entity ranking “after matter and energy – for, to us, information basically is a form of energy whether flowing, stored, or unrealized.” (p. 133)

    I agree that, at this time, energy flow rates are probably the best quantifiable correlates we can calculate to obtain a rough measure of complexity. While energy flows are undoubtedly necessary to create and sustain complexity, I disagree that information is a form of energy. For example, it is difficult for me to understand how bear claw marks on a tree or notes in a hymnal – forms of information to other bears or to humans, at least – constitute “energy,” even if energy was required to create the marks. Also, as his book elucidates, physicists agree that “Maxwell’s demon” of thermodynamics fame was finally slain by the proof that the demon requires energy to erase information but not necessarily to record it. Finally, highly respected mathematicians and physicists as diverse as John Wheeler, John von Neumann, and Stephen Hawking all seemed to posit that information is a physical property unto itself; see, for example, Hawking’s famous concession that black holes preserve the information they “absorb” (e.g. a star’s structure) on their event horizons.

    Also, I doubt that energy flow rate is essentially what defines complexity; rather, it is a necessary but insufficient ingredient. For example, even a resting hummingbird uses energy at about the same rate per gram as the human brain (roughly 0.001 watts/gm), or a little more, but I doubt that anyone would contend that a hummingbird is as complex as the human brain, regardless of how you define complexity.

    Admittedly, there is no consensus on how to define complexity, even amongst those who make their living studying the phenomenon. I would agree with Dr. Chaisson, however, that one of the most intriguing definitions is “the amount of information needed to describe a system’s structure and processes.” Furthermore, while energy flow rate correlations to complexity are interesting, a system’s informational content, transference, storage, and processing are even more interesting. Hence, “information” is a topic worth discussing and understanding because it constitutes one of the essential threads that run through the entirety of Big History and seems to be inextricably entwined with complexity.

    Of course, even if we agree with Dr. Chaisson that information is vague, qualitative, and heuristic (and I do), we still must come to some conclusion as to what defines “information,” or its general nature, even if it is only an operational definition like “energy” = the capacity to do work (i.e. the fundamental nature of energy is not understood either!).

    My own proposed definition of “information” is: relationships between entities in space-time. I would very much like to hear what Big Historians opine.


    Moved from IBHA Discussions: Thursday, July 13, 2017 11:58 PM
  • Tuesday, August 23, 2016 6:05 AM
    Reply # 4205331 on 4204470
    Lowell Gustafson (Administrator)

    Thanks Ken.  The effort to define and explain complexity is at the heart of our new field.  Your point, that complexity is made possible by energy flows which permit increasingly complex relationships among units, understood as the exchange of information, merits attention.

  • Tuesday, August 23, 2016 6:37 PM
    Reply # 4206618 on 4204470
    Anonymous

    Let us for the moment separate ‘information’ from ‘complexity’, in hopes of later reuniting them.

    The discussions of ‘information theory’ to which I have paid most attention are those which draw on the Boltzmann/Gibbs/Shannon (BGS) formulae designed to distinguish random from non-random states.

    For one example, Bob Ulanowicz, in applying info theory to ecosystems in his book ‘Ascendancy’, used the idea of ‘information’ as a factor which caused, or was associated with, departures from randomness in the relationships between interacting elements in an ecosystem. He examined nutrient flows, and was able to use the BGS formulae to define degrees of organization in such systems. Bob did not develop a full description of complexity in such systems, and I did not entirely agree with every use of the word complexity in his discussion. But he did identify levels of ‘mutual information’ -- non-random flows between nodes -- and ranked such systems in terms of ‘average mutual information’.
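
    (To make that concrete, here is a minimal sketch of the kind of calculation involved -- not Ulanowicz’s actual data, just an invented three-compartment flow matrix -- showing how the Shannon/BGS formulae turn a flow pattern into an ‘average mutual information’ figure:)

        import math

        # Toy nutrient-flow matrix T[i][j]: flow from compartment i to compartment j.
        # The three compartments and the numbers are invented for illustration only.
        T = [[0.0, 8.0, 2.0],
             [1.0, 0.0, 6.0],
             [3.0, 0.0, 0.0]]

        total = sum(sum(row) for row in T)              # total system throughput
        p = [[t / total for t in row] for row in T]     # joint flow probabilities
        p_out = [sum(row) for row in p]                 # marginal: share leaving i
        p_in = [sum(col) for col in zip(*p)]            # marginal: share entering j

        # Average mutual information (Shannon/BGS form): how far the observed flow
        # pattern departs from a random mixing of the marginals.
        ami = sum(pij * math.log2(pij / (p_out[i] * p_in[j]))
                  for i, row in enumerate(p)
                  for j, pij in enumerate(row) if pij > 0)

        print(f"average mutual information: {ami:.3f} bits")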

    In his 1996 paper on ‘relational quantum mechanics’, Carlo Rovelli characterized relationships between quantum systems as the basic building block of order in the U. Those relationships were characterized in terms of correlations -- or departures from randomness -- in each system’s encounter with -- or ‘measurement’ of -- any other quantum system. And Rovelli used the BGS approach to define how information -- correlation -- would be involved in such quantum interactions.

    From there you get -- via some filling-in steps -- to the question of how many distinguishable relational states are created and maintained in a definable, distinguishable system -- like a rock, or a galaxy, or a planet, or a living thing.

    Here is where Chaisson enters in. He has charted the rates of energy flow through things like stars, galaxies, planets, and living things.  He can point out the obvious differences in the apparent number of distinguishable states in such systems, without precisely quantifying them, and connect that with the rates of energy flow. And thus he can talk about ‘complexity’ and its connection with free energy rate density, without precisely defining complexity.

    So if you follow the trajectory of Chaisson’s thoughts -- which are better than any others I have found and are extensively documented -- you are led to defining complexity as the quantity of distinguishable relational states per unit of time per unit of mass. And this will be proportional to the free energy flowing through the system per unit of time and mass.
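
    (In symbols -- my own shorthand, not Chaisson’s notation, with N_rel standing for the number of distinguishable relational states and Phi_m for Chaisson’s free energy rate density:)

        C \;\equiv\; \frac{N_{\mathrm{rel}}}{\Delta t \, m}
        \;\;\propto\;\;
        \Phi_m \;=\; \frac{E_{\mathrm{flow}}}{\Delta t \, m}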

    And, nicely, we are now linking relational theories of order construction in the U with non-equilibrium thermodynamics -- energy flows and structures. 

    So here we are. But how do we identify and count the number of distinguishable relational states in a system?

    Here is where my own thoughts tend to dwell at the moment. But I am not at all satisfied with the combination of intimations, intuitions, and fragments of coherence now extant in my thinking processes. And if I am not satisfied, I could not expect you to be.

    However, here are exploratory thoughts.  

    I am led to go from Bénard cells to jet engines to hierarchical levels of living creatures, to focus on some aspects of complexity -- indeed, tiered levels of complexity.

    As you know, in Bénard cells, with a layer of fluid and a heat differential, bottom to top in a gravitational well, the fluid organizes into hexagonal columns.  It creates order in dissipating the differential. Inside each column is a lot of random motion. But the columnar correlational structure makes a layer of complexity -- just two levels, randomness within columnar organization -- but high energy rate density.

    Take a jet engine. Whirling fans, nacelle, organized pattern of combustion and exhaust. High energy rate density, with a few containing layers of order.

    Now to a cell in an organism. After a few billion years of trial and error, there are a lot of differentiated structures within a membrane. The membrane imposes some order at its level, which sustains high probabilities for the highly energetic interactions within.

    Now have the cell level, over evolutionary time, get organized -- non-randomized -- into relatively simple multicellular creatures. Again, there are borders on the MC (multicellular) creature, containing internal organization that is at first relatively simple but, over evolutionary time, more complex -- patterned, non-random energetic processes, all within a containing level of order.

    Now have the MC creatures over time evolve into ‘social’ creatures, with interactions between them that are at first simple and later richer. Social structures become ‘complex’ -- they have distinguishable subsystems, and energy flows through them.

    So now we have tiered complexity. That is, we have tiered levels of distinguishable relational states at each level. We have differences in complexity -- amounts of distinguishable relational states, or systems -- at different levels. And we have high energy rate density in the system as a whole.

    So now how do we sum, in complexity terms, these tiers, or nestings,  of distinguishable relational sets? 

    Also, at the edge of the thought process, one speculates on the relationship of mass to these concentrations of tiered complexity -- tiered organized energy flows. 

    And of course you can take the tiering of energy density down to the atomic and molecular levels.

    Welcome to this thought process. I hope it stimulates some curiosity and some thoughts or suggestions.


    Jack 








  • Wednesday, August 24, 2016 1:12 PM
    Reply # 4208331 on 4204470

    Jack and I agree on the fundamentals here. First, as I will discuss below, we need to come to better agreement about what "information" and its variations are before positing it as the core constituent that gives rise to and defines "complexity." After all, Dr. Chaisson does make a good case with regard to its vagueness.

    Second, I agree with Jack and others I've heard that increasing complexity is due to the "nesting" of subsystems upon subsystems to create a greater complex system, like a Matryoshka doll, i.e. quarks beget hadrons, which beget nuclei, which beget atoms, which beget stars . . .   Melanie Mitchell's book, "Complexity: A Guided Tour," offers nine different analyses of what defines complexity.  The last one she describes is "degree of hierarchy," and it is essentially the same as what we are proposing.

    I would like to back up at this point, however, and see if we agree that "information" is equivalent to "relationships," which is essentially the same as Jack's "correlations."  This makes sense if we agree that Shannon et al. demonstrated that information is the flip side of entropy (disorder). Information, or order, seems contingent on there being relationships between entities. My justification for also stipulating information = relationships of entities in space-time is that it isn’t certain what happens when information is absorbed into a black hole – which arguably resides outside of known space-time. Other people who have studied information, including Benjamin Schumacher (John Wheeler’s last grad student and an expert in quantum information at Kenyon College) and Terrence Deacon (neuro-anthropologist at Berkeley), have also stated that they believe information is fundamentally concerned with relationships – so we have good company here.

    As a helpful exercise, it is good to see if the definition holds in different situations, especially at the extremes.  For example, before the Big Bang, we could assert that in terms of Shannon bits there was no information, or just a “0.”  At the Big Bang and during the intense radiation phase, we can add a “1” because now there is something, even if it was just an amorphous soup of radiation without discernible relationships (okay, there might have been other universes, but we’ll keep the model simple). As the universe expanded and cooled, relationships between entities began to form.  The very simplest relationship, which might not ever have existed, would be the appearance and location of the first hadron, which we can designate as being at (x,y,z,t), with appropriate conversions of each axis to 1’s and 0’s if you prefer, plus a bit for its existence. (Note: in a universe with no boundaries, we might need to posit the existence of a second particle so that a location is relative to something.) At the opposite end of the cosmic historical timeline, we have the possibility of “heat death.” If thermal equilibrium occurs, there would be no available energy differentials for new relationships or processes to occur. Instead, whatever cold structures remain, held together by the residual forces of nature, would be static, and even locational relationships would not change, i.e. “11010010101101110000000000111” would remain “11010010101101110000000000111.” (Note: I am not a cosmologist and I might have the details of “heat death” wrong.) Other mental exercises are possible as well to try to test the proposed definition.
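
    (To make the bit-counting concrete, here is a minimal sketch of the kind of encoding I have in mind: a particle’s position is quantized onto a finite grid and each axis is converted to 0’s and 1’s, plus one bit for its existence. The grid size and the coordinates below are arbitrary choices for illustration.)

        import math

        def encode_location(x, y, z, t, cells_per_axis=16):
            """Encode a quantized (x, y, z, t) location as a bit string,
            with one leading bit for the particle's existence."""
            bits_per_axis = math.ceil(math.log2(cells_per_axis))  # bits needed per axis
            body = "".join(format(c, f"0{bits_per_axis}b") for c in (x, y, z, t))
            return "1" + body                                      # leading '1' = "it exists"

        # A hypothetical first hadron sitting at grid cell (3, 7, 12, 1):
        message = encode_location(3, 7, 12, 1)
        print(message, "->", len(message), "bits")  # 1 existence bit + 4 axes x 4 bits = 17 bits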

    It’s also critically important to make the distinction between syntactical information and semantic information, a distinction also echoed by Terrence Deacon but missed or ignored by Dr. Chaisson. I will leave the discussion at this time with these thoughts: syntactical information is about relational forms and processes that can often be described and even predicted by science and mathematics. For example, particle physicists like Paul Dirac and Peter Higgs mathematically predicted the existence and properties of antimatter and the Higgs boson, respectively, before either was experimentally confirmed to exist.   In astronomy, Robert Oppenheimer predicted the existence of “black holes” from the mathematics of Einstein’s theory of general relativity.  Shannon was also concerned with syntactical information when asked to calculate how much data could be transmitted over a phone line. He determined that bits are the basic currency of syntactical information. Semantics was not relevant to his assigned task.
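
    (For reference, the standard Shannon formulas behind that “currency of bits” – nothing specific to Big History – are the entropy of a source and the capacity of a noisy channel of bandwidth B and signal-to-noise ratio S/N:)

        H(X) = -\sum_i p_i \log_2 p_i \quad \text{(bits per symbol)}

        C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{(bits per second)}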

    Ironically, we have the least math and physics to describe and predict that with which we are closest and most familiar: “life.”   Life begins to worry about semantic information, or information with “purpose” (e.g. a bacterium senses that the environment’s pH is too low for its persistence, so it moves away) and, at some point in the evolution of a nervous system, “meaning.”  It seems that semantic information can eventually emerge from syntactical information. How this can happen remains a deep mystery.


  • Thursday, August 25, 2016 10:29 PM
    Reply # 4211291 on 4204470
    Anonymous

    I think I can agree that 'information' = 'relationship', but I want to sit and marinate a bit on that. Both are bit-created. One cannot get relationships without quantum encounters, which, as Rovelli structured things, are done by information sampling, or encounter, in the familiar BGS format.

    As to syntactic vs semantic info, I doubt I would now use these two words, but I can see a distinction lurking behind the words, so to speak. 

    If we consider the intersubjective social construction of a consistent universe, in what we call scientific terms, I could equate that with 'syntactic'. 

    If we equate 'semantic' with the experience of any given quantum system with another, such that the other has 'meaning' in terms of effect on the chosen observer, or participant, so to speak, then I can see the distinction in these terms. 

    This fits into how systems, and 'wholes', work. S1 has components, as does S2. What S2 sees of S1 is the relational effect of the 'whole' S1, not the components. This is the way hierarchical order works.

    S2 'sees' of S1 only the relational encounter, in S2's terms. S2 does not necessarily register the inner workings of S1 or its place in the whole network of things.

    In complex verbal creatures: does S1 help me or hurt me; warm me or cool me? Is S1 sharp or soft, to me? Etc.

    If this is 'semantic', then I can see the distinction it seems Ken is making.

    Jack


     

  • Saturday, August 27, 2016 9:29 PM
    Reply # 4213947 on 4204470
    Anonymous

    A review of the variety of treatments of ‘information’ in Wikipedia leads me to understand better why some have felt the concept of ‘information’ is too ill-defined and multifarious to be worked with productively, but it also gives me more confidence that Ken and I are on the right track, so to speak, in equating, or relating, information to relationships. Credit to Ken for the boldness to make the suggestion.

    Of course, humans are notorious for confirmation bias, and mutually reaffirming fantasies. But some simple and I hope elegant logic points in this direction.

    Let us take randomness as a starting point. (I will shortly give a little word picture to illustrate what that ‘means’.)  In a completely random situation (could such a situation have what we see as ‘existence’?), one could not associate anything with anything, or for that matter nothing with nothing.

    But let us now introduce a correlation between elements. Now if you encounter element A, you have a better probability of encountering B. 

    This correlation is a relationship. ‘Order’ is made up of relationships. Order is made up of realized probabilities.

    So when we encounter A and B, we are ‘informed’. We can assign a probability other than randomness. 

    At the same time, when A and B are correlated, they are ‘informed’ by that act of correlation. When A encounters anything other than randomly, it encounters B, and vice versa.

    And so up to a crystalline lattice, and many other things. 

    This is why I like to keep focus on BGS math, or analysis: it is an elegant way of conceptualizing states of randomness, the mathematical possibilities of correlation or non-correlation, and the amount of correlation, or order, or probability assignment, in a given state set.
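
    (A bare numerical sketch of the ‘encountering A improves your odds on B’ idea -- the probabilities are invented; the point is only that correlation shows up as a reduction in BGS/Shannon entropy, i.e. as information:)

        import math

        def H(probs):
            """Shannon/BGS entropy in bits."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # With no correlation, B is a 50/50 toss: maximum uncertainty.
        p_B = [0.5, 0.5]

        # Introduce a correlation: having encountered A, B becomes far more likely.
        p_B_given_A = [0.1, 0.9]

        print("H(B), no correlation:     ", H(p_B), "bits")                 # 1.0
        print("H(B | A encountered):     ", round(H(p_B_given_A), 3))       # ~0.469
        print("information carried by A: ", round(H(p_B) - H(p_B_given_A), 3), "bits")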

    And by the way, that is why the ‘power law’ distributions are ubiquitous in the U: they arise from the characteristics of correlational processes.


    --------  


    A brief digression. The quantum physical acts humans engage in can make it easier to understand why subjective, or encounter-based, approaches to probability assignment often get related to info theory in a Bayesian-oriented discussion.

    Such a discussion may presume that an entity makes probability assignments, and alters them in an encounter based process. 

    So we can say that S1 can encounter S2 only probabilistically and often partially. So S1 can attempt a probability assignment as to S2, and to characteristics of S2, and the correlations of S2 with S….

    And so we have Bayesian probability assignments.
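
    (In code, that encounter-based probability assignment might look like the bare-bones Bayes update below; S1’s prior and the likelihoods are invented for illustration:)

        def bayes_update(prior, likelihood_if_true, likelihood_if_false):
            """S1 revises its probability assignment about S2 after one encounter."""
            evidence = likelihood_if_true * prior + likelihood_if_false * (1 - prior)
            return likelihood_if_true * prior / evidence

        # S1's prior that S2 has some property (say, 'is sharp'), before any encounter.
        p = 0.5

        # Each partial, probabilistic encounter nudges the assignment.
        for observation in [True, True, False, True]:
            if observation:
                p = bayes_update(p, likelihood_if_true=0.8, likelihood_if_false=0.3)
            else:
                p = bayes_update(p, likelihood_if_true=0.2, likelihood_if_false=0.7)
            print(round(p, 3))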

    -------- 


    Now to the little thought experiment/illustration.

    Suppose there is a totally random universe. (RU) 

    There cannot be an RU. There would be no universe as we understand the universe, as the ‘tangible’ universe is made up of correlations, and thus information. But let us make this supposition.

    We try to interject Oscarina the Observer into RU. (We have to interject Oscarina, because Oscarina is an intricate set of relational structures, and that cannot exist in RU. But Oscarina is willful, and insists on doing this. )

    What does Oscarina observe?  

    Nothing. The atoms are too small to be seen by a huge correlation of atoms, and they have no regularity among themselves in their various impingements on Oscarina.

    There are no suns, no planets, no galaxies, no ‘voids’. 

    There is no ‘information’ given Oscarina, other than perhaps temperature from the atoms encountering the correlated atoms which make up Oscarina’s integument. 

    And that, we might note, is an observation made by Oscarina as to Oscarina, in Oscarina’s terms, who cannot be there in the first place. 


    A little Saturday night fun. 


    Jack 


     


  • Sunday, August 28, 2016 3:01 PM
    Reply # 4214842 on 4204470

    I need to back up a little about the fate of information in the scenario where the universe possibly experiences heat death in the incredibly distant future. After some reading, it appears that some cosmologists believe that matter will eventually decompose into a scattering of photons and leptons. Hence, the informational content of the cosmos would be even more minimized than if cold planets and star remnants persisted. Again, there would be no energy differentials to make new structures or processes possible, and hence no creation of new information either.

    In regards to syntactic vs. semantic information, this is a topic that warrants a much longer dissertation than is possible here. I need to read Luciano Floridi's writings on this topic, as he is one of the major philosophers of information. My own current proposal is that syntactical information is about relational patterns and processes of both abiotic and biotic structures or systems.  Mathematics is the concise and precise language that we use to describe these patterns and processes, whether it is arithmetic, topology, or the math used to describe the chromodynamics of quarks. Syntactical information can also be "natural," as in the information embodied in a snowflake's or a star's structure. It can be "artificial," as when a bear makes claw marks on a tree, or when one or two lamps are lit in the church tower for Paul Revere.

    Semantic information, on the other hand, requires an "agent" of some sort (e.g. a bear, Paul Revere) to detect and process (apprehend) the syntactical information into a purpose or meaning. If you are familiar with John Searle's (philosopher of mind) "Chinese Room" argument, he makes that distinction as well, arguing that computers, even in principle, cannot achieve true artificial intelligence as currently conceived because they are fundamentally and operationally syntactical (they can only manipulate symbols).  At this time, science cannot explain how life or subjective experience (loaded with "meaning") occurs, and science is also bereft of any mathematics to describe the relationships that make life, agency, subjective experience, or semantics possible.

    If my reading of Floridi's thoughts on semantics changes my thoughts about semantic information substantially, I'll try to weigh in again.

  • Thursday, September 01, 2016 9:48 AM
    Reply # 4222023 on 4204470

    Apparently complexity begets complexity!! I’m doing my best to keep up!! It's hard!! But I largely have two points of connection to this discussion. The first lies in the opening offering: the endorsement of Eric Chaisson’s suggestion that the cosmos is progressing to greater degrees of complexity. The second involves those two so-far characterless charmers, S1 and S2, and their antics (well worthy of Dr. Seuss).


    As to the first point, there is no doubt that, looked at in one way, the cosmos is progressing towards a greater degree of complexity, but what is its most complex offering to date? I would venture to suggest that the human is the most complex offering of which we know. But the question arises: is progress towards complexity the fundamental narrative of the cosmos (if such a notion has any currency)? Or is it actually a consequence of another narrative? What else is there to see? What else is there to consider? It is no surprise that a scientist would see complexity. That is to a large extent what scientists are inclined to do, but what else is there to see? Is science the only valid modality by which we engage these questions?


    The fundamental questions that underscore such discussions in a big history context are: what is the cosmic narrative that gave rise to us? How will the cosmic narrative that gave rise to us progress from here (if it is to progress at all)? And thirdly, in what way or ways will we humans actively (or otherwise) participate in that progress? Is the process of our becoming aware of big history an essential prerequisite to our continued participation in its unfolding narrative? Is the concept of narrative applicable at all?


    In this context it is interesting that we can choose perspectives and choose disciplines within which to accommodate those perspectives. The very discussion that we are having here could be seen in terms of the cosmic narrative that gave rise to us in the first place (and, by extension, “it”).


    Jack, you talk about S1 looking at S2 and seeing just a unity of presentation (as opposed to the complex “manyness” of composition). This is very interesting and may point to another possible cosmic narrative of which the imperative towards increased complexity is part (if that is what we choose to see).  The fundamental narrative of science has involved, to a large extent, a process of trying to identify the entities of which things are composed and how they relate to one another. This has led to the search for fundamental entities which cannot be further dismantled: bigger and bigger experiments to find smaller and smaller things. It is interesting to think of the opposite of this: to consider an entity that is such that it will not constitute an operational component of something else. We may look to S1 and ask two things: firstly, what is S1 composed of, and secondly, what kind of a “thing” will S1 be a compositional element of? We know that we are composed of cells, and unless the cosmic narrative that has given rise to us humans has come to a conclusion in us, we may well be cell-like to future entities, if cosmic precedent is anything to go by. We can consider that this is already happening.  You suggest, Ken, that there is no mathematics to address the living reality. There is actually a very simple one.


    We casually use these terms, like “entity”, “thing”, “element”, “unit”, etc. We could categorise them under the tag “unit”, and we can say that one of the things that gives rise to a distinction in humanity (with respect to other living things) is a capacity to realise the “One”. This has an odd biblical ring to it which is not altogether out of place, but for now I literally mean the number “one”, the unit. This involves the ability to perceive boundaries, firstly in two dimensions, then to conceive of them in three dimensions, and to attribute a nomenclature accordingly. We cannot actually see the entire physical unity of anything. In any given situation we only have a concept of such, based on a conceptual assembly of experiences.

    We know that this capacity to perceive well defined unity has a limited applicability in the cosmos at large, but the cosmic narrative that gave rise to us has produced that capacity in us and may actually be progressing through that capacity.


    The essence of this system of nomenclature involves the attribution of collective terms: hydrogen atom, atom, molecule, cell, multi-celled organism, human, etc. These are the entities that stock the narrative that gave rise to us over a roughly 14-billion-year period of cosmic evolution. If we apply a collective term to all these entities, and the one I have coined is the term “priunit”, we notice a very simple but inescapable fact: as the cosmos progresses, the total number of priunits in it is decreasing. This imperative (the one that has given rise to us) must in time lead to a count of “one”. The cosmic narrative that has given rise to us is progressing, perhaps through our capacity to both realise and articulate it, to a state of “oneness”. (The correlation with some religious narratives might be interesting, if not contentious!) If we consider our old friend S1 retrospectively in terms of that narrative, we encounter complexity. If we consider him/her prospectively, we see that the cosmic narrative that gave rise to us is progressing to simplicity. Read more at www.priunit.ie.


  • Friday, September 02, 2016 4:35 AM
    Reply # 4223553 on 4204470
    Anonymous

    Welcome to the conversation, S3.

    Much of your strong set of intuitions can be considered in the light of hierarchy theory, or to use everyday terms, the way the universe builds tiers, or nests, of wholes out of elements.

    In a way, this does reduce the number of interacting units in the U, relative to the total number of interactions which would arguably exist were it not for tiered unifications. And in the process it concentrates energy flows, leading to high energy densities.

    But paradoxically to human thinking, this progressive unification creates interactions -- read complexity -- which would not exist but for the unification process. 

    Restriction, restraints, correlation produce order, the universe, and the Second Law -- which some see as running down the universe -- as well.

    And if there were a deity, would she/it not delight in all the perspective play and conundra visited on these little mite-creations?

    Jack

  • Monday, September 05, 2016 7:30 AM
    Reply # 4229020 on 4204470
    Anonymous

    Jack H, or S3, I did not mean to dismiss or demean your perceptions by referring to them as merely intuitions -- though much of 'science' is prompted and guided by intuitions.  I have looked into your website, and have spent more time on the questions you present in your post.

    I will not attempt to address all your copious observations and suggestions -- just a couple.

    At present, it would seem all the U's probabilistic realizations of complex states are done in its vast, decentralized set of local evolutions. So if we ask ourselves, with good reason, what unification we humans may identify with -- now, here, in our lovely island of complexity in the great cold space surrounding us, beyond tribe, nation, and even global humanity -- it would be the realization of the wonderful non-equilibrium thermodynamic ordering process on this solitary, unitary Earth.

    In a manuscript which I am in the process of assembling for a publisher (wish me luck, please), I suggest that, the better to avoid our own civilizational collapse, or withering, or stunting, we need to stop asking so narrowly what Earth will do for us today, and ask instead how we best fit into the realization of the potentials of Earthlife. If we make our global niche into the entrainment of life's energy and ordering potentials -- and, by the way, its information-embodiment potentials -- we may better serve our own survival and efflorescence.

    Enough for now. On to Holiday. 

    Jack P

