The prior article developed the notion that living systems must extract meaning from ongoing data streams in order to survive. Because of the quantitative nature of data streams, we suggest that this meaning is mathematical in nature. This mathematical meaning has a few crucial features. The mathematics must address the immediacy of living systems as well as provide predictive descriptors for each moment. So far, traditional mathematical systems have not been able to meet this challenge.

The esteemed Dr. Zadeh, the father of Fuzzy Logic, recognizes this deficiency and offers his own mathematical system as a solution. Let’s see how successful this approach is. To set the context, our exploration begins by revisiting a previously-cited quotation from Dr. Zadeh:

“(Due to) the fundamental inadequacy of the conventional mathematics – the mathematics of precisely-defined points, functions, sets, probability measures, etc. – for coping with the analysis of biological systems … we need a radically different kind of mathematics, the mathematics of fuzzy or cloudy quantities which are not describable in terms of probability distributions.” (Dr. Bart Kosko, Fuzzy Thinking p.145 quoting from Dr. Zadeh’s paper.)

Inspired by this insight, Dr. Zadeh went on to formulate the concept of fuzzy sets. His followers turned this notion into the mathematics of fuzzy logic. Engineers have successfully applied the insights of fuzzy logic to significant real-life problems, such as how to stop bullet trains smoothly. Cognitive scientists have also employed the insights from fuzzy logic to simulate the neural networks of the brain. In essence, fuzzy logic successfully introduces the both-and approach to data sets, a complement to the either-or approach of conventional mathematics.

Zadeh’s call for ‘a radically different kind of mathematics’ implies the need for some new kind of arcane operations or esoteric measures to cope with biological systems – perhaps a biological string theory, or a calculus of living systems, or the quantum mechanics of behavior, or even fuzzy logic. However, with the exception of neural networks, fuzzy logic has not proved to be the new mathematics that is necessary for ‘coping with the analysis of biological systems’.

It may be that the quest to discover ‘a radically different kind of mathematics’ to analyze biological systems should not be limited to the esoteric nature of high-level theoretical mathematics. We believe it would be useful to shift the mathematical focus to the inherent nature of the subject matter at hand – the data stream. Rather than pursue ever-more complex abstractions, we believe that the intelligent application of existing mathematical tools can yield powerful, practical insights into the nature of data streams. This intelligent application of existing mathematical tools must focus on the *immediate significance of moments in the data stream*.

We have argued previously that the dynamic nature of living systems is best characterized by data streams. We went on to detail the requirements of a *mathematics of data streams* that would address Life’s immediacy. Hunting for a new set of mathematical abstractions that would fulfill the necessary data stream requirements would be a difficult, if not hopeless, quest. Data Streams offer an extraordinary challenge because of the inherently changeable nature of an open living system. Living systems can be extremely sensitive to every interaction with an environment that includes not only the closed systems of inanimate matter, but also includes interactions with other open animate systems. The complex web of interactions between living systems and the myriad data streams of existence suggests the arcane approach of theoretical mathematics is highly impractical.

Rather than a higher and more complex level of abstraction, we are looking for pragmatic and useful tools that can help us think about living data streams. Our search is for a practical mathematics. We appreciate the power of high-level abstractions. However, we believe there is a place for the use of *existing mathematical tools*. The key is to apply these tools with a sensitivity to the unique nature of data streams. The intelligent application of these tools can reveal insights into the nature of data streams that are accessible to the informed reader. These practical insights can provide a pragmatic balance to the rarefied language of theoretical mathematics.

We begin with this question: What is the relationship between organism and data stream? In other words, what actually occurs when a biological system encounters a data stream? It is a compelling working hypothesis that organisms both ingest and digest the ongoing flow of information. If an organism is merely ingesting data, then the data stream has little, if any, relevance. Common sense tells us that processing data into some usable form must be a central purpose of an organism. The ability of organisms to monitor and adjust to the inherently changeable nature of data streams depends upon an effective digestive system. Organisms do more than merely ingest data – they digest data. The question now becomes: How does this digestive system function?

Data Streams best characterize the dynamic nature of living systems. Organisms are in a continual state of interaction with data streams. What could this process of interaction be, other than a perpetual state of ingesting and digesting the flow of information? The question is not how data streams behave, but rather what method an organism utilizes to digest the flow of information. The quest is to find a method that simulates a system that digests data according to the criteria we have established for a Data Stream Mathematics of Living Systems. This method will define an 'animate' system of information processing. The test for this system is simple: how well does it fulfill the prescribed requirements?

The requirements for the Data Stream Mathematics of Living Systems are straightforward, yet daunting. To be a successful candidate for the position, the mathematics of the 'animate' system must effectively address the immediacy of dynamic living systems. This includes: 1) weighting the elements of the data stream in proportion to their immediacy – a sort of sliding scale that weights the present moment more heavily; 2) providing descriptive measures that relate data points to each other in a manner that is sensitive to pattern recognition; and 3) providing suggestive predictors that serve a pragmatic anticipatory function. These are the requirements that a successful candidate must fulfill to be considered for the position. If the requirements are not fulfilled, the position will be left open.

We would like to recommend a candidate for this long-vacant, highly coveted and esteemed position. She's an excellent choice. She is a simple form of information processing. Her sole function is to digest data streams. Further, her method of information processing generates an 'animate' system, which fulfills the requirements of data stream mathematics. Her mathematics could be called the mathematics of the moment, in that she effectively addresses Life's Immediacy. This includes providing a suggestive interpretative mechanism that articulates pattern. The name of our candidate? You may have guessed it. Drum roll, please … the *Living Algorithm's* Info System. The following discussion provides evidence that this 'animate' system, the Living Algorithm System, meets these demanding job criteria and should be considered for the position. If her qualifications interest you, read on.

The conventional mathematics of Probability, whose specialty is fixed data sets, is unable to capture the immediacy of living systems. This inability is due to a preoccupation with the average features of the entire set, rather than the unique features of particular moments. As such, we must reject this applicant for the position. The dynamic nature of living systems requires a mathematics that encompasses the immediacy of Life's data streams.

To accomplish this feat, this new data stream mathematics must: 1) weight more recent data points more heavily; and 2) provide ongoing predictive descriptors. It is easy to explain how the innate nature of the Living Algorithm System fulfills these two preliminary requirements. The Living Algorithm's sole function is to digest Data Streams. Ongoing raw data enters this mathematical system of information processing. The Living Algorithm Family immediately: 1) weights the most recent data points in proportion to the current moment in the data stream; and 2) transforms this external input into ongoing predictive descriptors. Accordingly, each '*moment*' in the data stream has its own predictive descriptors. (For the mathematics behind this verbal description, check out *The Living Algorithm*.)
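The article does not spell out the update rule here, so as an illustration only, here is a minimal Python sketch of one common way to realize such recency weighting – a decaying average in which each new point is blended in with a fixed fraction of influence. The function name `living_average` and the decay factor `D` are our hypothetical choices, not the author's notation.

```python
def living_average(stream, D=10):
    """Hypothetical sketch: blend each new point in with weight 1/D,
    so the influence of older points fades geometrically over time."""
    avg = 0.0
    out = []
    for x in stream:
        avg += (x - avg) / D   # recent data weighted more heavily
        out.append(avg)
    return out

# After a jump in the stream, the running average moves toward the
# new level, gradually forgetting the old one.
print(living_average([4, 4, 4], D=2))
```

Note how each output value is an ongoing descriptor of its own moment in the stream, rather than a single summary of the whole set.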

The third criterion for the position requires a mathematics that provides suggestive predictors that serve a pragmatic anticipatory function. We believe that the Living Algorithm’s descriptive measures fulfill this difficult requirement. Justifying this claim is the quest of the remainder of this article.

The Living Algorithm's sole purpose is generating the rates of change (the derivatives) of any data stream. This method of information digestion entails turning precise data (instants) into an ongoing series of moments. These moments are characterized by their derivatives. These derivatives reveal the trajectories of each moment by describing the current moment in relation to the preceding moments. Each derivative has its own unique function:

- The *Living Average* (the 1st derivative) describes the relative position of each moment in the data stream in relation to prior moments. We choose the phrase Living Average to represent a proportional weighting of moments whose impact decreases over time.
- The *Deviation* (the 2nd derivative as a scalar, an undirected quantity) describes the relative range of each moment by articulating the expected limits of the variation of pattern in the data stream.
- The *Directional* (the 2nd derivative as a vector, a directed quantity) describes the relative tendency of each moment by articulating the expected direction of the momentum of a pattern.

As a trio, these descriptors characterize each individual moment in the data stream in relation to the preceding moments. In contrast, due to a preoccupation with the general features of fixed sets, Probability actually ignores the existence of these moments in the data stream. A way of visualizing this trio is shown in the following diagram.
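The trio of descriptors can be sketched in code. Since the article gives no formulas at this point, the following Python sketch is purely an assumption: it realizes the Living Average, Deviation, and Directional as decaying-average updates (the function name `digest`, the decay factor `D`, and the specific update rules are all hypothetical).

```python
def digest(stream, D=10):
    """Hypothetical sketch: digest a data stream into an ongoing trio of
    descriptors, assuming a decaying-average scheme where each new point
    carries weight 1/D."""
    avg = dev = direc = 0.0
    moments = []
    for x in stream:
        delta = x - avg
        avg += delta / D                 # Living Average: relative position
        dev += (abs(delta) - dev) / D    # Deviation: undirected range
        direc += (delta - direc) / D     # Directional: directed tendency
        moments.append((avg, dev, direc))
    return moments
```

Each entry in `moments` characterizes one moment of the stream in relation to its predecessors, in contrast to a single set-wide average.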

These three descriptors simultaneously provide a prediction that amounts to rough approximations about the next data point: 1) the expected position (the dot in the center), 2) the range of variation (the circle), and 3) the direction of momentum (the arrow). Accordingly, each of the Living Algorithm's ongoing derivatives is a descriptor that contains a significant predictive feature. A simple combination of these predictive averages creates a composite predictive *cloud*. We choose the term *cloud* to represent the approximation of the expected features of the next data point, which in summary, includes position, a range of probable values and recent tendencies of direction.
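One way to picture the composite *cloud* is a small function that assembles the three descriptors into a rough expectation for the next data point. The combination rule below is our illustrative assumption, not the author's stated method.

```python
def predictive_cloud(avg, dev, direc):
    """Hypothetical composite of the trio: expected position (the dot),
    probable range (the circle), and directional lean (the arrow)."""
    center = avg + direc          # lean the expected position by momentum
    return {"center": center,
            "range": (center - dev, center + dev),
            "direction": direc}

cloud = predictive_cloud(avg=5.0, dev=1.0, direc=0.5)
print(cloud)
```

The cloud is deliberately approximate: it bounds where the next point is likely to fall rather than pinning down a precise value.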

The Living Algorithm generates a trio of ongoing descriptors in response to the ongoing flow of information in the data stream. These descriptors create predictive clouds. These meaningful composite elements, these predictive clouds, may be the type of predictive tools that Dr. Zadeh suggested would be necessary ‘for coping with the analysis of biological systems’. Dr. Zadeh argued that these predictive tools would be, of necessity, 'fuzzy or cloudy quantities which are not describable in terms of probability distributions'. Dr. Zadeh pursues a solution that applies new mathematical abstractions to what he calls fuzzy sets. In contrast, we pursue an approach that applies existing mathematical tools to the notion of a living data stream.

These predictive statements inherit their cloudy nature from the constant state of evolution inherent in a data stream. The Living Algorithm System digests data to provide a predictive cloud, whose shape shifts with each new entry. Each new data point represents change; and the constant possibility of change requires an ongoing approximation of pattern that is central to the responsiveness of living systems. As with Life, these predictive clouds are context sensitive, constantly evolving via the dynamic input from a living data stream. The urgency of response typically required of living systems demands context sensitivity. These predictive clouds reflect the immediate nature of living systems, as they move through time. As such, the ongoing and suggestive nature of the Living Algorithm's predictive cloud is ideal for describing the changeable and immediate nature of living systems.

After accomplishing her last ordeal, the Living Algorithm has now fulfilled all of the previously stated job requirements. The Living Algorithm is the ideal candidate for the position of representing the personal and dynamic nature of living systems. Her mastery of data stream mathematics renders her approach a powerful simulation of an animate system.

Let us summarize our findings and then examine some of the intriguing implications. One essential fact about living systems is that they are in a constant state of digesting data streams. The responsiveness of an organism to a data stream requires some form of information digestion that serves these notable functions:

- relates the present moment to the immediately preceding moments,
- reveals patterns (or the lack thereof) that represent these related moments, and
- generates rough, yet practical, predictors about the immediate future.

The Living Algorithm’s unique process of digesting data streams generates the Predictive Cloud. The Predictive Cloud satisfies the three notable functions outlined above. Accordingly, the Living Algorithm System mirrors these essential qualities that living systems require and as such is a compelling model of biological information digestion.

Could the Living Algorithm be more than a model? Could the Living Algorithm model actually be the method by which living systems process data streams? Does this method of information processing exhibit any noteworthy patterns? If noteworthy patterns do emerge, what rules are capable of generating these forms? Is it reasonable to assume that patterns must conform to some manner of rule-governed behavior?

What form might these rules take? What language is being spoken? We sense that the ideal language must be mathematical. What other language is capable of fulfilling the unique requirements of this challenging job description? Is the mathematics of data streams a language of living systems?

In order to answer these intriguing questions, we must start by discovering if there are any patterns worth noting. If we discover such sequences, we can then begin to search for the rules that govern them. We can then ask: what sort of language can express the rules that are required to generate these noteworthy patterns? And finally: are humans, specifically, or living systems, in general, subject to the grammar of this language?

Let's take a stab at some preliminary answers. Does the Living Algorithm exhibit any noteworthy patterns? Both the Pulse of Attention and the Triple Pulse are innate patterns of the Living Algorithm’s method of information processing. These mathematical patterns obey some distinct rules, as discussed in *Triple Pulse Results*. Is there any evidence that living systems are subject in any way to these rules? In the prior article stream, we examined multiple examples of how the mathematical rules of the Triple Pulse sync up with multiple experimentally verified sleep-related phenomena. Humans, at least, seem to be subject to the rules of the Triple Pulse. Is all of this evidence coincidental? Are there other factors at work? Is there a confounding variable that is waiting to be discovered? Or do these correspondences indicate that we might employ the Living Algorithm to digest information?

If the Living Algorithm is really one of the ways in which living systems digest data streams, the Living Algorithm must have evolutionary potentials as well. Otherwise, why would this computational ability be passed on from generation to generation? For a discussion of these issues, check out the next article in the stream – *The Living Algorithm's Evolutionary Potentials*.

For an insight into the Author's creative process, check out the *Living Algorithm's special Significance – Evolution of Understanding*.