Information Dynamics Glossary: G>M

Glossary G>M: definitions, descriptions, and links for the wide variety of terms employed throughout the many articles devoted to Information Dynamics.

Gödel

In the early part of the 20th century, mathematicians sought to employ the rules of logic to prove that mathematics' underlying propositions were undeniably true. In essence, they wanted to prove that logic can determine truth. This notion was in line with the concept that absolute truth exists. Once an absolute truth was established, a logical structure could be constructed that would be correct in all situations.

Gödel, instead, turned the notion of logical certainty inside out. Employing the traditional tools of logic, Gödel proved that if a system is logically consistent, then it cannot be complete, and that if a system is complete, then it must contain paradox. In other words, a logically consistent system can only apply to part of the puzzle, not the entire picture. For a logical system to encompass the entire puzzle, it must contain some inconsistencies.

We applied this reasoning to the world of matter. Mechanics is able to accurately describe the atomic world and larger without any logical inconsistencies. However, to describe the subatomic world, the uncertainty of Probability must be included. More importantly, electrons and photons can be viewed as either a wave or a particle depending upon the perspective of the observer. These two truths are logically mutually exclusive. This is the paradox that completes the system.

Information Digestion Model

The conceptual model that was developed to make sense of the synergy between the mathematics of Information Dynamics and empirical data.

The core assumption of this model is that living matter has a material and an immaterial component. The immaterial component is based upon the digestion of information, hence the name of the model. The Living Algorithm provides the method of digesting information.

This unique perspective is based upon a simple insight. Just as the Body digests food, the Mind digests information. In both cases, the digestion process transforms environmental substance into a form that Life can employ to fulfill potentials. Just as the body's digestive system extracts food from organic material, the information digestion process extracts meaning from digital information. In brief, living systems convert both food into biological energy and information into meaningful knowledge.

Both Body and Mind require downtime to complete the digestion process. When the digestion process is completed, Body becomes hungry for food and Mind becomes curious for information. Just as Body obtains food through the mouth, Mind obtains information through Attention.

To understand why the synergy between the Math and the Data 'forced' us to make these assumptions, read Core Postulates.

Information Dynamics

Information Dynamics is a mathematical system based around a single procedure, the Living Algorithm. This algorithm digests data streams to reveal their rates of change, i.e. derivatives. As such, the process generates the dynamics of information, as the title suggests.

The author noticed multiple patterns of correspondence between the behavior of living systems and the mathematics of Information Dynamics. To make sense of these correspondences, he was 'forced' to make some assumptions – the Core Postulates. He then developed a theory around these assumptions. The theory of Information Dynamics is based around the notion that living systems employ the Living Algorithm to digest information in order to give it meaning. In this sense, Info Dynamics is both a mathematical system and a theoretical system regarding living behavior.

This unique perspective is based upon a simple insight. Just as the Body digests food, the Mind digests information. In both cases, the digestion process transforms environmental substance into a form that Life can employ to fulfill potentials. Just as the body's digestive system extracts food from organic material, the information digestion process extracts meaning from digital information. In brief, living systems convert both food into biological energy and information into meaningful knowledge.

A distinctive feature of the digestion process is choice. We choose what to eat and when to cease eating. Similarly, Mind is able to choose which information to digest, i.e. where to focus Attention, and when to cease digesting information, i.e. Sleep. In this fashion, this fresh perspective introduces both meaning and choice into the scientific dialogue. The Information Digestion Model provides a more complete articulation of this relationship.

Information digestion and information processing are worlds apart. Computers process information; living systems digest digital information to give it meaning. Put another way, processed information has no meaning until Mind digests it. (For more on this topic, read the glossary entry: info-processing vs. info-digesting.)

Info-Dynamics theory provides a plausible explanation for many classes of experimentally verified behavioral phenomena that have long mystified the scientific community. These include, but are not confined to, the harm of Sleep Deprivation, the Necessity of Sleep, the harm of Interruptions to a productive session, the attention span behind the 10-minute rule, Dement's 'Opponent Process' Model for the Biology of Sleep, and Posner's Attention Model. Neither the sophisticated matter-based theoretical structure of the hard scientists nor the complex information processing of the computer scientists has been able to provide any insight into these basic and common human behavioral phenomena.

For the complete list of Notebooks associated with Information Dynamics, check out the Volume List.

information digestion vs. processing

The Living Algorithm computes the ongoing rates of change, i.e. derivatives, of data streams. We refer to this process as information digestion. This self-referential, i.e. recursive, process generates a mathematical system termed Information Dynamics. The reflexive process connects the past to the present, thereby providing history and context to the information. In this way, the dynamics of information reveals the patterns that lie at the heart of living meaning.

Information digestion transforms digital information into a meaningful form that can be useful to living systems. Similarly, we digest organic substances to turn them into a form that nourishes the body. The innate forms, i.e. energy patterns, of information digestion provide insight into many common human phenomena that have long mystified the scientific community, for instance sleep.

In order to reveal the dynamic meaning behind data streams, the information digestion process transforms one-dimensional instants, i.e. the data, into two-dimensional moments, i.e. the cloud of derivatives. In so doing, the transformation process sacrifices precision for meaning. In contrast, information processing deliberately sacrifices meaning for precision. In parallel fashion, the Heisenberg Uncertainty Principle states that we can know either the position or the momentum of a subatomic particle, but not both.

Processed information must be digested by living systems to reveal meaning. Otherwise it has no dynamics, no history, and no context: an inert CD versus the musical experience it evokes. These characteristics are intentional. Instead of meaning, exact duplication, hence precision, is a prime function of information processing.

How is this extreme accuracy achieved?

In the late 1940s, Shannon had some major insights into information. His mathematical proofs demonstrated that internal redundancy can be employed to ensure the accuracy of any information flow, electronic or otherwise. This insight enabled engineers to transform our modern world into an electronic wonderland of international transmission.

In order to come up with his new way of understanding, Shannon stripped information to its bare bones. Instead of consisting of words, sentences and paragraphs, he viewed information ultimately as a string of austere and isolated 1s and 0s. By ignoring words and sentences in his investigation, Shannon consciously eliminated meaning from information to better comprehend its essence. By understanding this essence, information theory was able to ensure accuracy of transmission, its ultimate goal. Of course, this perspective is at the heart of our computerized digital world.
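Shannon's core idea, that added redundancy can detect transmission errors, can be illustrated with the simplest possible scheme: a single even-parity bit appended to a string of 1s and 0s. This is a minimal textbook sketch, not Shannon's own construction; the function names are illustrative.

```python
def add_parity(bits):
    """Append an even-parity bit so the total count of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(bits):
    """A received block passes the check iff its count of 1s is even."""
    return sum(bits) % 2 == 0

# Transmit a 4-bit message with one redundant bit
sent = add_parity([1, 0, 1, 1])   # the parity bit makes four 1s in total
corrupted = sent.copy()
corrupted[2] ^= 1                 # a single bit flipped in transit
```

One parity bit can detect any single-bit error but cannot locate it; Shannon's theory shows how richer redundancy (full error-correcting codes) can push error rates arbitrarily low.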

Computers are based on information processing. A string of 1s and 0s morphs into another string of 1s and 0s. All electronic communication consists solely of this digital information. Despite the seemingly miraculous applications of this perspective, it is unable to provide any insight into many common phenomena of the human world. In contrast, these same 'mysterious' phenomena are better understood under the assumption that humans digest information via the Living Algorithm. Check out the growing list of empirical phenomena that exhibit distinct patterns of correspondence with the mathematical behavior of Information Dynamics.

Interruption phenomenon

The behavioral phenomenon: interruptions have a negative impact upon a productive session that is disproportionate to their duration.

Scientific studies have demonstrated that interruptions to a productive session have a negative effect upon cognitive performance. This adverse impact is disproportionate to the size of the interruption. We also have cultural evidence of this phenomenon. Theater owners frequently lock the doors at the beginning of a performance to prevent interruptions to the audience's undivided attention. Coaches call time outs to interrupt a player's or team's momentum in the hopes of impairing the performance. This disproportionate negative impact of interruptions on performance is the individual empirical meaning of the Interruption Phenomenon.

Information Dynamics provides a mathematical model that behaves in similar fashion. Interrupting the Living Algorithm’s Pulse has a disproportionate negative impact upon the ideal dimensions of the mathematical pulse. This distinct pattern of correspondence between the Living Algorithm's mathematical behavior and the empirical evidence regarding interruptions to a productive session constitutes the general meaning of the Interruption phenomenon.

As such, the Interruption phenomenon symbolizes two levels simultaneously: the individual empirical event and the general linkage between the mathematical and the empirical. For a more detailed analysis of this phenomenon, check out The Pulse of Attention Interrupted.

The Information Digestion Model was developed to make sense of the correspondences between the mathematics of the Living Algorithm and many features of empirical reality, including the sleep necessity phenomenon. To understand why the Math/Data synergy 'forced' us to make certain assumptions, read Core Postulates.

Liminals

The Living Algorithm process reveals the ongoing rates of change/derivatives of any data stream. Liminals is the name for the higher derivatives as a group. Below is a visualization of the Liminals of the Triple Pulse data stream.

Triple Pulse (green) & the Liminals

The green curve is the data stream's 2nd derivative, the Directional. The other curves are visualizations of the data stream's 3rd, 4th, 5th, 6th and 7th derivatives, the Liminals.
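One hedged reading of how these higher derivatives arise: each level applies the same scaled-difference update as the basic Living Algorithm, and the change produced at one level becomes the input to the next. The cascade below is an illustrative sketch of that reading; the decay factor `D`, the feeding rule, and the function name are assumptions, not taken from the source.

```python
def update_cascade(derivs, data_point, D=2):
    """Update a chain of running derivatives from one new data point.

    derivs[0] tracks the data stream itself; each later level tracks
    the change produced at the level before it (an assumed scheme).
    """
    out = []
    signal = data_point
    for prev in derivs:
        new = prev + (signal - prev) / D  # same scaled-difference update
        out.append(new)
        signal = new - prev               # this level's change feeds the next
    return out

# One new data point of 1.0 ripples ever more faintly through the levels
levels = update_cascade([0.0, 0.0, 0.0], 1.0, D=2)
```

Successively higher levels respond ever more weakly to a single input, which is at least consistent with the Liminals hovering around the zero line in the visualization.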

Why the name? Definition of the word 'liminal': "Relating to or at the threshold, entrance or beginning." In contrast, subliminal means beneath or under the threshold, while superliminal means above this threshold.

According to Information Dynamics theory, the zero line in the graphs divides conscious awareness from the subconscious (the former above the line, the latter below). The positive Active Pulse, which is generated by conscious Attention, is above the line (superliminal). The Rest Pulse, generated when there is a lack of Attention (sleep), is below the line (subliminal). As the rest of the curves in the Creative Pulse family wander around this threshold between consciousness and the subliminal, we've named them Liminals. According to Info Dynamics theory, they are related to digesting information. If they are not allowed to zero out (fully digest the information), the undigested information can interfere with the potentials of the following cycles.

For a more complete analysis, check out the article Liminals & Unconscious Cognition.

Living Algorithm (a.k.a. the Cell Equation)

Basic Living Algorithm

General Living Algorithm

The Living Algorithm's digestive process reveals the rates of change (the derivatives) of any data stream. The basic Living Algorithm computes the Living Average (the data stream's 1st derivative). The general Living Algorithm (shown above) generates a system of dynamics that is the foundation of Information Dynamics.

Applying the Living Algorithm is a simple procedure.

• 1) Determine the difference between the most recent data point and the existing Data Stream Derivative.
• 2) Scale this difference proportionately.
• 3) Add or subtract the scaled difference from the existing Data Stream Derivative. We add if the data point is larger than the existing Data Stream Derivative; and we subtract if the data point is smaller than the Data Stream Derivative.
• 4) The result of this process is the new Data Stream Derivative.
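The four steps above can be sketched in code. This is a minimal sketch, assuming the "proportional scaling" of step 2 divides the difference by a fixed decay factor; the function name and the value `D = 10` are illustrative, not taken from the source.

```python
def living_average(prev_deriv, data_point, D=10):
    """One pass of the basic Living Algorithm (illustrative sketch).

    prev_deriv: the existing Data Stream Derivative (Living Average)
    data_point: the most recent data point in the stream
    D:          assumed decay factor used to scale the difference
    """
    # 1) Difference between the new data point and the existing derivative
    difference = data_point - prev_deriv
    # 2) Scale this difference proportionately
    scaled = difference / D
    # 3) & 4) Adding the signed, scaled difference covers both the add and
    #    subtract cases and yields the new Data Stream Derivative
    return prev_deriv + scaled

# Feeding a constant data stream pulls the Living Average toward that value
avg = 0.0
for x in [10, 10, 10, 10, 10]:
    avg = living_average(avg, x, D=2)
```

Note how the sign of the difference automatically handles step 3: when the data point is below the running derivative, the scaled difference is negative and is effectively subtracted.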

The mathematics of the Living Algorithm has many features in common with living systems.

• 1) Past Awareness. Both Life and the Living Algorithm are self-referential feedback systems. Living systems have an awareness (whether conscious or not) of the past experience of the organism and the capacity for behavior that reflects that awareness.
• 2) Context Awareness. They are both sensitive to the present situation – a context sensitivity that incorporates the power to monitor and adjust.
• 3) Regenerative. They both require ongoing data input to sustain the momentum of a pattern, as the potential impact of the data on the ongoing average decays over time.
• 4) Composite Memory. They both store memory as a synthesis of overlaying images, rather than exact replicas of events.
• 5) Fungible. They both must transform the precise details of data into averages in order to reveal patterns.
• 6) Recency Effect. While the potential impact of each data point decays over time, the most recent data points in the stream have the greatest potential impact on the Living Algorithm's ongoing derivatives. Similarly, the most recent events have the greatest potential impact on an organism's current perceptions.

Because both systems share these 6 characteristics, we argue that the Living Algorithm is a powerful model of living systems.

The Living Algorithm produces the derivatives of the data stream. These derivatives act as powerful descriptive predictors. As an indication of this ability, the 3 basic derivatives are called the Predictive Cloud.

Finally, the Living Algorithm is the ultimate in elegance and computational simplicity. Nothing more than basic arithmetic skills are required. Due to these three features (similarity with living systems, powerful predictors, and simplicity), we argue that the Living Algorithm could be one of Life's primary algorithms – a computational recipe for digesting data. In fact, this is a fundamental assumption of Information Dynamics. (Check out the article Living Algorithm for more details.)

mathematics of behavior

This term encompasses the entirety of the author's study of living behavior. The underlying theme of this line of research is that mathematical mechanisms are a significant factor determining human behavior. Further, it is possible to consciously tune into these naturally occurring rhythms to maximize life's potentials. Conversely, ignoring these rhythms has a negative impact, minimizing what is possible.

The study of the Mathematics of Behavior consists of at least 4 branches or projects: Data Stream Momentum, Root Beings, Energy Bundles, & the Creative Pulse. Information Dynamics only applies to the Data Stream Momentum and Creative Pulse Projects.

Mathematics of the Moment

Mathematics of the Moment is another name for the series of volumes written by the Author regarding the connections between Information Dynamics and human behavior. Triple Pulse Studies, the 1st volume, lays the foundation for the study by illustrating some striking correspondences between sleep-related behavior and the Living Algorithm's Triple Pulse. The Living Algorithm System, the 2nd volume, establishes the plausibility that Life employs the Living Algorithm to digest information. Data Stream Dynamics, the 3rd volume, develops the notion that Information Dynamics shares a common dynamical structure with classical mechanics. Further, this dynamical structure provides the causal mechanism behind the aforementioned correspondences. Even more intriguing, the Author develops the notion that the energy density of a data stream determines how mental energy is converted into physical energy. Attention & the Living Algorithm, the 4th volume, illustrates the mechanism that ultimately transforms the mental energy of Attention into the information energy of a data stream, which in turn translates into physical energy when the conditions are right. In essence, this volume establishes how Mind influences Body. Check out Information Dynamics: Volume List for links to the articles.

maven

A Yiddish word for someone who, through extensive information gathering, is impelled to assist others by pointing out good deals on goods and services. In The Tipping Point, Malcolm Gladwell uses the word to indicate a personality type that is essential to the spreading of social trends. Mavens are characterized as those who want to help others out by disseminating useful information that they have accumulated, with no reward in mind. In similar fashion, the information in this book is maven-like, in the sense that it is a good deal on personal energy use with negligible cost.

Dr. John Medina, author of brain rules

Dr. John Medina is a developmental molecular biologist. He holds joint affiliate faculty appointments at Seattle Pacific University, where he is the director of the Brain Center for Applied Learning Research, and at the University of Washington School of Medicine, in its Department of Bioengineering. He has won numerous prestigious awards for his teaching skills. Employing a cross-disciplinary perspective to critique the works of his colleagues is one of Dr. Medina’s specialties. His interdisciplinary approach places him in an ideal position to summarize the research coming from a number of sub-disciplines within the broad field of cognitive science.

In his book, brain rules, Dr. Medina summarizes mainstream scientific discoveries in the emerging field of cognitive science. He catalogues the findings from diverse disciplines – ranging from neural science to biology to psychology among others. Dr. Medina also makes useful behavioral suggestions regarding these important scientific discoveries. The evidence for this brain research is well established. Dr. Medina states: "The supporting research for each of my points must first be published in a peer-reviewed journal and then successfully replicated. Many of the studies have been replicated dozens of times." (brain rules, p. 5) The prestige of the author makes his book, brain rules, an ideal source to transmit the current state of cognitive science to the educated public. Further, it is evident that the findings and theories contained therein are widely accepted by the contemporary scientific community.

moment/instant

The Living Algorithm's digestion process computes the rates of change/derivatives of any data stream. The data stream consists of a string of discrete data points. We choose to characterize each individual data point as an instant. Because of their relative isolation and independence from other data points, we consider instants to be one-dimensional.

The Living Algorithm’s process determines the ongoing relationship between these instants. By revealing the rates of change/derivatives, the Living Algorithm creates a context that relates the current instant (data point) to the immediately preceding instants (data points). With each iteration, i.e. repetition of the relating process, the Living Algorithm determines what we define as a moment. The data stream derivatives determine the character of each moment.

While there is a moment for each instant, and moments are based upon instants (data points), a moment does not include the instant itself. Instead, moments indicate the ongoing dynamic relationship between instants. Because they indicate the dynamic relationships between instants over time, we consider moments to be two-dimensional.

Due to their independence and isolation, instants are inherently devoid of meaningful pattern. Due to the context provided by the Living Algorithm, moments are inherently filled with mathematical ‘meaning’. In essence, the Living Algorithm transforms ‘meaningless’ instants into ‘meaningful’ moments.

monograph

"A description or systematic exposition of one thing or class of things; a dissertation or treatise written in great detail." (EB Dictionary p. 823)