
4.3 Beginning Neural Networks

A. A Neural Network with a Life of its Own

Now What? The Ability to be Bored

All of us, prey and predators alike, can easily calculate and store Average Deviations in some type of neural network. For most animals this network serves only the predictive functions mentioned above. For Homo sapiens sapiens, however, the neural network that stores the Average Deviation takes on a life of its own. Because of this, Homo sapiens sapiens has the ability to be bored.

What it means for a neural network to have a life

More on this later; first, let us briefly establish what it means for a neural network to have a life of its own. In the simplified existence of a neural network, having a life means that the network seeks to preserve itself. To see what this means, consider what it does not mean. Many neural networks do not seek to preserve themselves. Memories do not seek to preserve themselves except in recollection: it is not painful not to remember something. A memory might arise in thought, dream, or conversation to preserve its neural network. Of course, one of the pains of the death of a partner is that your shared network of memories is lost. While it is not painful not to remember something, it can be painful to lose the connection with one's memories.

The Death of Neural Networks is Painful for Humans

The ability to experience pain at the death of a neural network is uniquely human. This meta-perspective includes the pain of memory loss and the distinctly human need for recollection. Words, in some ways, could have been developed to avoid the pain of memory loss: we need to recollect an experience so that the neural network we created to commemorate it won't be lost. Additionally, we humans create monuments, works of art, plaques, and annual awards ceremonies to commemorate exceptional events, presumably so that the neural networks created to store the memory won't just fade away, dying a natural death. Extending this line of thinking a little further: the reason humans hold onto life so tightly, rather than fading away easily, is that the neural networks created over a lifetime do not want to die, and so send out messages of preservation. Taking the next step: those who have left evidence of the memories of their neural networks etched in stone may die an easier death. The creator might be able to let go of this existence more easily than one who has maintained but not preserved.

B. Neural Networks in Action

A Return to the Point: The Theory

Enough philosophizing; back to the point. A human experiences the pain of boredom when the neural networks it has created to compute the Average Deviation begin shrinking. Below is an animation of the growth of a two-dimensional neural network based upon decay. The first frame shows the first experience, plotted against a vertical parameter and a horizontal parameter; this diagram could represent the first experience of any type of phenomenon. The second experience is overlaid upon the first, while the first experience is decayed. The third experience is overlaid upon this frame, and the previous content is again decayed, or scaled down. Each successive frame is overlaid upon the conglomerate of experiences that went before, while that conglomerate is faded or decayed.

The Visual Growth of a Neural Network:

[Animation frames 1 through 12: successive overlays of decayed experiences; images not shown]

Decaying Influences

Note how the center of the conglomerate shifts through the successive frames toward the lower right. Note also that even at the end there is evidence of the first experience, although it is now quite faded. This fading is a visualization of the scaling down that occurs with the decay of time.
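The overlay-and-decay process just described can be sketched in a few lines of code. This is a minimal illustration, not the author's implementation: the decay factor of 0.8 and the drifting sample experiences are assumptions chosen to reproduce the behavior of the frames (the conglomerate's center shifts toward the later experiences while the first experience fades but never fully vanishes).

```python
# Sketch of the decaying overlay of two-dimensional experiences.
# DECAY and the sample points are illustrative assumptions.

DECAY = 0.8  # fraction of each old weight kept when a new frame is overlaid


def overlay(experiences, decay=DECAY):
    """Return (weight, point) pairs after overlaying all experiences."""
    conglomerate = []
    for point in experiences:
        # First fade everything already in the conglomerate...
        conglomerate = [(w * decay, p) for (w, p) in conglomerate]
        # ...then overlay the new experience at full strength.
        conglomerate.append((1.0, point))
    return conglomerate


# Twelve experiences drifting toward the lower right, as in the frames above.
frames = [(x, -x) for x in range(12)]
result = overlay(frames)

first_weight = result[0][0]  # 0.8 ** 11: quite faded, but still present
```

Because each old weight is only multiplied by the decay factor, no experience ever reaches exactly zero; it simply becomes insignificant, which matches the faded trace of the first experience in the final frame.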

Another illustrated Neural Network based upon points

The above visualizations illustrate the assimilation and decay of a multi-dimensional series of events. The visualizations below illustrate the same phenomenon from a two-dimensional perspective: they show the development of a Data Stream of Data Points, each of which has only a vertical and a horizontal component.

The first frame shows the first experience, X0. The second frame shows the second experience, X1, and the line that represents the change, ΔX1, the difference between the two experiences. The third frame shows the average, X̄1, of the first two experiences. Notice that the two Data Points, X0 and X1, have been completely absorbed into the average by the third frame. The visualization below shows what happens when another Data Point, X2, is added to the Data Stream.

A new difference, ΔX2, and a new Average, X̄2, are generated. The Difference is between this new Data Point and the old Average. Notice again that the individual Data Points have been absorbed into the Average. Notice also that their potential impact, represented by their size in the frame, becomes smaller because of decay.
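The first few steps can be written out numerically. This is a hedged sketch: a plain running mean stands in for the text's Average, each Difference is taken between the new Data Point and the old Average as just described, and the sample points are invented for illustration.

```python
# Sketch of the first frames: each new Data Point X_n yields a
# Difference dX_n = X_n - (previous Average) and a new Average that
# absorbs the point. The sample data is invented.

def absorb(points):
    """Return the streams of Averages and Differences for the points."""
    averages, differences = [], []
    avg = None
    for n, x in enumerate(points):
        if avg is None:
            differences.append(0.0)      # X0 has no prior average
            avg = x
        else:
            differences.append(x - avg)  # dX_n: new point minus old average
            avg = avg + (x - avg) / (n + 1)  # running mean absorbs the point
        averages.append(avg)
    return averages, differences


avgs, diffs = absorb([2.0, 4.0, 6.0])
# avgs  -> [2.0, 3.0, 4.0]
# diffs -> [0.0, 2.0, 3.0]
```

Note how each Data Point, once absorbed, no longer appears on its own; only the Average and the Difference it generated remain.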

More Data Points added to Stream with new Differences and Averages

Similarly, when another Data Point, X3, is added to the Data Stream, another change, ΔX3, and another average, X̄3, are naturally generated.

The individual Data Points are absorbed within the collective measure, while their individual impact decays. This process naturally continues as a succession of raw Data Points, XN, is added to the Data Stream. With each new Data Point comes another difference, ΔXN, and another average, X̄N. The average represents the accumulation of all the individual Data Points. Represented below is what happens when a few more of these experiences, i.e. Data Points, are added to the Stream.

The above visualizations demonstrate what happens when Data Points X4, X5, X6, and X7 are added to the Data Stream. Simultaneous with these Data Points come the Data Stream of Differences, ΔX4, ΔX5, ΔX6, and ΔX7, and the Data Stream of Averages, X̄4, X̄5, X̄6, and X̄7. Note that the individual Data Points fade into insignificance compared to the Averages, which represent the accumulation of the experiences represented by the Data Points. This average represents the velocity of our Data Stream, a very important central tendency. {See our Data Stream Momentum Notebook for more information about Data Stream velocity.}

Data Points Disappear while Average Keeps Growing

Below is a diagram of the 12th Data Point.

Note that the individual point-experiences disappear, while the average based upon them keeps growing. As it grows, it becomes more and more stable, i.e. resistant to change. If the average is allowed to keep growing, it eventually becomes so stable that it is effectively static, stationary. If, however, we introduce D, the Decay Factor, then the average reaches a certain size and stability but does not keep growing: the new Data Point is added in and the old average is scaled down proportionately. This was the topic of the Decaying Averages Notebook.
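The contrast between an ever-growing average and a decaying one can be shown numerically. This sketch assumes the Decaying Average takes the common fixed-factor form Avg_new = Avg_old + (X − Avg_old) / D; the stream values and the choice D = 5 are invented for illustration, not taken from the Decaying Averages Notebook.

```python
# Sketch of the Decay Factor D: the old average is scaled down
# proportionately and the new Data Point is added in, so the average
# stays responsive instead of freezing. D = 5 and the data are assumed.

def decaying_average(points, D=5.0):
    avg = points[0]
    for x in points[1:]:
        avg += (x - avg) / D  # scale down the old, add in the new
    return avg


# Twenty nights at 10, then twenty nights at 20.
stream = [10.0] * 20 + [20.0] * 20

recent_weighted = decaying_average(stream, D=5.0)
# Tracks the recent level (close to 20) rather than freezing near the
# long-run mean of 15, which is what an ever-growing average would do.
```

A larger D makes the average more stable but slower to respond; a smaller D makes it nimble but jumpy. That trade-off is the practical meaning of the "size and stability" the average settles into.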

Higher Derivatives, Deviations and Directionals

Just as our individual Data Points generate derivative Data Streams, so do these Data Streams generate their own derivative Data Streams. While the Average Data Stream represents the accumulation of the Data Points, the Difference Data Stream generates its own average Data Stream with its own Decaying Average and its own Difference. See the diagram below.

Difference Data Stream generates both Deviations and Directionals

If the Differences are treated like a scalar, i.e. an absolute magnitude, then the Average of these Differences represents the Average Deviation. If these Differences are treated like vectors, i.e. with a direction and magnitude, then the Average of the Differences is called an Average Directional. {We explored both of these measures of central tendencies in our last Notebook, Decaying Averages.}
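The scalar/vector distinction above is easy to make concrete. In this sketch, plain means are used in place of the text's Decaying Averages, and the sample Differences are invented: treating them as absolute magnitudes yields the Average Deviation, while keeping their signs (direction) yields the Average Directional.

```python
# Sketch: the same Differences, averaged as scalars vs. as vectors.
# Sample values are invented for illustration.

diffs = [3.0, -1.0, 2.0, -4.0]

# Scalar treatment: absolute magnitudes -> Average Deviation.
average_deviation = sum(abs(d) for d in diffs) / len(diffs)    # 2.5

# Vector treatment: signed values -> Average Directional.
average_directional = sum(diffs) / len(diffs)                  # 0.0
```

The example shows why the two measures differ: a stream that swings equally in both directions has a sizable Average Deviation (it is volatile) but an Average Directional of zero (it is going nowhere in particular).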

Even Higher Derivatives, Acceleration and Thrust

While the Difference Stream generates its own Average it also generates its own Difference Stream. These Difference Streams represent second level Deviations and Directionals. Similarly these Difference Streams generate their own Averages and Differences with their own Averages and Difference Streams ad infinitum. See the diagram below for the next level of averages and differences.
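The ad-infinitum construction above can be sketched as a simple recursion. This is an illustration only: plain successive differences stand in for the text's decayed Difference Streams, the depth of three levels is arbitrary, and the toy data (the perfect squares) is chosen so each level is easy to verify by eye.

```python
# Sketch: each stream generates its own Difference Stream, which
# generates its own, and so on. Depth and sample data are assumed.

def derivative_streams(stream, levels=3):
    """Return the successive difference streams (1st, 2nd, ... level)."""
    out = []
    current = stream
    for _ in range(levels):
        current = [b - a for a, b in zip(current, current[1:])]
        out.append(current)
    return out


d1, d2, d3 = derivative_streams([1, 4, 9, 16, 25])
# d1 -> [3, 5, 7, 9]   first differences (velocity-like)
# d2 -> [2, 2, 2]      second differences (acceleration-like, constant here)
# d3 -> [0, 0]         third differences (thrust-like, zero here)
```

Each level is one element shorter than the last, which is one practical reason the higher derivatives need a reasonably long Data Stream before they say anything useful.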

While we will disregard the individual Differences, just as we ignored the individual Data Points, we will examine these higher Derivatives in future Notebooks, appropriately called Data Stream Derivatives. Further we will explore the implications of these measures in a much later Notebook in Book 3, entitled Energy and Entropy.

C. Greater & Lesser, Ranges & Realms

Ranges & Realms

Let us set up some further definitions. We have the Range and we have the Realm. Most people live in the Realm, but only the most daring wander in the Range. We now have another differentiation: the difference between the Range of Possibility and the Range of Actuality. The difference between the Range of Possibility and the Range of Actuality is the Area of the Lesser Range. See the diagram below.

[Diagram of the Ranges and Realms not shown]

The Realm of Probability is broken into the Greater and Lesser Realms. The Greater Realm of Probability includes any Data that falls within one Average Deviation of the Decaying Average. The total Realm of Probability includes any Data that falls within two Average Deviations of the Decaying Average. The outer limits of the Range of Actuality are the highest and lowest values that have ever occurred. The outer limits of the Range of Possibility are the highest and lowest values that can be imagined; it is sometimes called the Range of Imagination rather than the Range of Possibility.

An Example:

As an example: on average I get 6½ hours of sleep a night, the Decaying Average. I usually sleep between 6 and 7½ hours a night, the Greater Realm. But sometimes I sleep as much as 9 or as little as 4 hours, the Lesser Realm. One time I stayed up all night and got 0 hours of sleep; the next day I was so tired that I slept 12 hours, the Lesser Range, or the Range of Actuality. I can imagine sleeping 24 hours in a day, the Greater Range, or the Range of Imagination. It is impossible to sleep more than 24 hours in a day; that lies outside the Range of Possibility.
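The sleep example can be turned into a small classifier. The boundary numbers below come directly from the example above (6 to 7½ for the Greater Realm, 4 to 9 for the Lesser Realm, 0 to 12 for the Range of Actuality, 24 as the limit of Possibility); the function itself is my own illustrative framing.

```python
# Sketch: classifying a night's sleep into the Realms and Ranges,
# using the boundaries stated in the sleep example.

def classify(hours):
    if hours < 0 or hours > 24:
        return "outside the Range of Possibility"
    if 6 <= hours <= 7.5:
        return "Greater Realm"            # within one Average Deviation
    if 4 <= hours <= 9:
        return "Lesser Realm"             # within two Average Deviations
    if 0 <= hours <= 12:
        return "Range of Actuality"       # between the extremes ever observed
    return "Range of Imagination"         # conceivable, but never occurred
```

For instance, a 6½-hour night lands in the Greater Realm, an 11-hour night only in the Range of Actuality, and an imagined 18-hour sleep in the Range of Imagination.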

