Chapter 1
Process Compartments as a Foundation for Ecological
Control Theory
(last edited: May 23, 2006)
Theory of process compartments
The three transforms work like this
Bi-level emergence, mathematical and physical models
Process Compartment Hypothesis (PCH)
Stability, the hidden illusion
The new mathematics, and the new science
In Prueitt (1995), a theory of process organization was proposed as the foundation for a computationally grounded class of generic mechanisms, algorithms, and data encodings. This theory attempted to take into account the creation and annihilation of natural processes localized into coherent phenomena by the cooperative behavior of complex systems. Its formal foundation lies in pure mathematics and in the thermodynamics of "weakly" coupled oscillatory systems having strong internal regulation and relatively weak intersystem influences. The initial application of this foundation was in quantum neurodynamics and in cognitive neuroscience (following the conferences hosted by Pribram in 1992, 1993, and 1995, and Pribram's books of 1971 and 1991).
A simplification of the foundation will occur as we introduce structural ontology (Adi and Prueitt 2004). This simplification is linked to data encoding, using the key-less hash table [1], and to the real-time formation of structured information about the process being modeled. The simplification is motivated by a complex theory of natural process. The new information science retains the simplicity of data encoding and algorithms while being consistent with the familiar, but truly complex, private experience of one's own awareness. The techniques and assertions of fields like artificial intelligence have neither this simplicity nor this naturalness.
The core assumption made, in the theory grounding structural ontology, is that the regularity of natural patterns is caused in part by tendencies to ignore small differences. Systems, perhaps even non-living systems, are involved in the emergence of function from substructure. Function imposes certain equivalences during the formation of natural category. It is the phenomenon of natural category emerging that motivates all of the theory and observations that follow. We ask "how" natural category arises, and we use our answers to encode, into data structure, structural information about emerging categories. The encoding is localized as "n"-aries, as discussed in Appendix B.
In a familiar instance, we observe the tendency to remember only part of an experience, and as we do this we often treat the particular as if it were a thing seen before. The experience and the previous experience are categorized together and seen as being the "same thing". An example would be a discriminatory behavior, or preference, for one type of food as opposed to another.
A relationship between human mental imaging and action-perception cycles is implicated as being part of the mechanisms that create experiential category. Some scholarship about this relationship is presented in order to help define the intellectual background. This background is needed to ground a human-centric use of machine ontology, sometimes called web-ontology [2]. The "n"-ary encoding is the first step in developing an alternative to W3C standards.
The core insight is that action in a reacting world leads to an observation about the consequences of changes made to a responding system, and thus informs the responding system about itself via changes made and perceived in the system's environment. Web-ontology has been ushered in by the Semantic Web movement and by the related academic field of artificial intelligence. However, the origin of information design rests with a few self-appointed "knowledge engineers" and not within normal everyday usage. The "n"-ary alternative is used within action-perception cycles, where part of the cycle is embedded in normal everyday usage.
In the last part of the twentieth century and the first part of the twenty-first century, extensive and persistent government funding of so-called software agents led to mistakes in the design of web ontology. The mistake revolves around the notion that computer programs can be the same as natural intelligence. The situation remains difficult to clarify.
The theory and technology for "structural" machine ontology was developed in the years 2001 – 2004 and is presented in later chapters. Between 2004 and 2006 we developed a working understanding of the semantic web information technology standards. These standards are developed using collaborative processes by the W3C and OASIS organizations. The standardization process itself is an example of the formation of natural category, and thus the causes of the specific standards lend insight into the formation of natural category. In this case, the observation is made that the W3C standards developed to help the information technology sector achieve business objectives, whereas the OASIS standards were more oriented to shifting the origin of control, in new information design, from the software vendor to social process. [3]
How can the scientific literature curb this business-oriented conformational process by exposing where in nature one finds truly non-deterministic processes?
In Prueitt's 1995 – 1996 work, an analogy to human self-image is used to communicate how natural compartmentalized processes act in a natural setting. The analogy is broad enough to cover any natural system, i.e., a complex system in Rosen's use of the term [4]. Mathematics is a simple system when seen in isolation from its role as a communicative language of science. An outline of some mathematical concepts was developed and expressed. This outline is given below. The formal concepts are related to the study of emergence and stratification of physical processes. Over the past two decades, the outline was not filled out to the degree that it should have been. However, the 1995 – 1996 work seems as valid in 2006 as it was in 1995. To finish this work would seem to require changes in support structure, so that the mistakes persisting in the artificial intelligence and semantic web communities can be directly challenged. Recent trends are hopeful and can depend on well-established formalism and extensive scholarship.
Mathematics, in Hilbert spaces [5], gives a description of how compartments might be studied in the context of temporally registered energy distribution in separated spectral strata. Mathematics in Hilbert spaces can be mapped to descriptions expressed as discrete formalism [6], giving us a rigorous transfer of correlation analysis to data encoding in the form of "n"-ary web ontology.
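As a minimal sketch of this transfer, not taken from the original text, one might correlate spectral strata and keep only the strong links as discrete relations; the function name, threshold, and toy data below are our own illustrative assumptions:

    import numpy as np

    def spectral_correlation_to_naries(spectra, labels, threshold=0.8):
        """Reduce correlation analysis over spectral strata to discrete relations."""
        corr = np.corrcoef(spectra)          # pairwise correlation of the spectra
        naries = []
        for i in range(len(labels)):
            for j in range(i + 1, len(labels)):
                if corr[i, j] >= threshold:  # small differences below threshold are ignored
                    naries.append(("correlates", labels[i], labels[j]))
        return naries

    # hypothetical data: three band-limited energy spectra
    rng = np.random.default_rng(0)
    base = rng.random(64)
    spectra = np.vstack([base, base + 0.05 * rng.random(64), rng.random(64)])
    print(spectral_correlation_to_naries(spectra, ["alpha", "beta", "gamma"]))
    # expect: [('correlates', 'alpha', 'beta')]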
We found others working on technology and descriptions that depend on
an encoding of structural ontology into spectral strata. By structural ontology we mean actual
reality, not the abstraction that one finds in web ontology language (W3C
standards). We mean the structure of
natural category observable via direct perception or even instrumental
measurement such as with electromagnetic or optic devices.
An analogy between spectral processing in the brain and computational processes within structural ontology will be established in later chapters, to ground a text retrieval and knowledge management paradigm for distributed collaboration in web environments.
Chapter 10 was developed, in 2004, in collaboration with several colleagues to reveal the nature of the Adi structural ontology for natural language understanding, and the corresponding generalization of Adi's development experience. This generalization is a "generative methodology for creating structural ontology."
Appendix B expresses a notational system consistent with the above
remarks.
Prueitt's 1995 theory of process compartments described what might be necessary to the organization of experience. First, it may be necessary that natural phenomena self-organize into strata, and that the phenomenon in each stratum emerges through an aggregation of substrate phenomena. Natural category forms through a reification of functional invariances imposed during periods of emergence. Human memory systems may be responsible for managing the aggregation process. The real situation cannot be expected to be simple. Complexity is defined as a property of real situations, particularly during the emergence of function [7]. Complexity as defined by Robert Rosen seems necessary. A non-deterministic aspect to the emergence of function may be necessary if the exercise of choice occurs. For us this is not a religious concept, but one developed from direct experience, not only with our private observations about human choices but with underlying physical, and non-living, properties. Non-deterministic phenomena may always be present anytime an emergence of something is occurring. We are reminded that many of the real mechanisms on which human choice depends are observed in non-living systems. Living systems may be expected to take advantage of underlying physical phenomena. Part of the mechanisms involved is what we call "awareness".
A specific number of mechanisms are involved in the expression of attentional behavior, such as awareness. In stratified theory, we represent these mechanisms as acting with elements of substructural memory resource. The aggregations of substructural resources are also thought to be shaped by environmental constraints. The mutual entailments, the full set of causes, are thought to create meta-stable compartments experienced as mental events. These compartments are physical phenomena and should be observable if one looks in a proper way. However, scientific reductionism asserted that logical coherence must be used to express the full set of causes, and this just seems not to be the way the natural world is organized. Again, the notion of coherence can be understood as something that is built up from the phase coherence seen in electromagnetic fields. To expect that all "fields" must be forced into a single coherence may be logically satisfying to some, but the impact that such an imposition would have, if this were possible, would be an annihilation of everything. We do not see this annihilation phenomenon in astrophysics. What we see is both coherence and shifts between complex systems, while coherence is challenged and systems undergo change.
The contents of experience, however, may not be directly observable except by the person who is "experiencing" awareness. In stratified theory, the compartment is seen as a physical phenomenon with stability far from thermodynamic equilibrium. The compartment's stability depends on the substrate as well as the environment. Simply stated, the compartment emerges from a substructure and is constrained in its form and function by a set of laws imposed as environmental influences. These environmental influences include constraints imposed by neurophysiology. Underlying the stability is phase coherence [8].
Using a notational language about the structural ontology of process compartments, the relationship between mental imaging and action-perception cycles is modeled as including a set of generic mechanisms. We then use analogy to develop the stratified web ontology first proposed by Prueitt in 1999 [9]. One conjectures that, in biological systems, these mechanisms are available to an intentional system behaving within an ecology. Based on an extension of Soviet-era cybernetics, we developed the specific set of structure-to-function relationships involved in a specific phenomenon. We will see in later chapters the development of a "generative methodology" designed to create a model of the cross-scale mechanisms that are involved in the emergence of compartments of various types. The generative methodology uses formalism that describes these generic mechanisms.
Again, the key concept is that during emergence the needs of the environment shape the potential that develops due to the aggregation of substrate. One sees this phenomenon in physical chemistry, when molecules form and express functional properties. In organic chemistry, the cross-scale expression is often far more complex. An example is the expression of gene and cell signal pathways [10].
We conjecture that the emergence of any type of physical phenomenon can be modeled in complex situations, specifically in the situations involved in the formation of a mental event. We argue, philosophically, that the regularity of any of these naturally occurring mechanisms is reinforced due to minimal energy and effort constraints that must apply to any living system competing for survival. We mean by this that the living system develops a perceptual system that relies on as much regularity in response requirements as possible. From this argument we have developed an approach toward deploying computer-based ontology with three layers of abstraction. The result is structural ontology having a simple data encoding. The data encoding is so much simpler than the W3C web ontology language standards that the eventual adoption of these encoding methods is seen as problematic to the current information technology industries. Adoption barriers exist for this reason.
To understand structural ontology we need a description of, and motivation for, generative methodology. A generative methodology is used to develop structural ontology through an iterative process. The motivation for generative methodology comes from a number of origins. Most of the material in this book is designed to reveal these origins. In 2006 an OASIS standard [11] was adopted that specifies examples of computer-mediated development of web ontology templates, and the necessity for periodically involving human decision-making. The primary concept being introduced into information technology standards by the OASIS organization is that choice points must be identified through a process of human decision-making, and that the resulting computer-based web ontology is then usable as part of an automated production of multiple possible outcomes. The natural emergence of category is finding an expression using information production standards. The OASIS standards place community-based resources in the hands of those who seek to shift the origin of the design of social information from the few to the many.
Starting in 1991, on obtaining a research position at Georgetown University [12], we reached into several disciplines in order to ground the generative methodology. Ecological psychology, founded through the work of J. J. Gibson in the 1950s and 1960s, was a good place to start. Ecological control theory emerges from an understanding of the generic principles involved in the relationship between action and perception. To be sure, the concept of Darwinian evolution is modified to include mechanisms that produce replication of functional responses to perception and provide regularity in the acting and behaving of living systems. It is these replicated functional responses that we look for in applying the generative methodology. Prueitt's efforts, 1985 – 1988, on theoretical immunology and on formal models of biological neural networks were also instrumental in the formulation of generative methodology.
According to theory developed within the ecological psychology academic community, action-perception cycles generate compartments out of a system image in the presence of external stimuli. For Prueitt, compartments always meant organizational stratification. Stratified theory defines "system image" as a reflection of temporal non-locality realized in a field manifold and having meta-stable potential localities (attractor regions) and ecological affordance (Kugler et al, 1990). The system image is thought to encode, into some part of its mechanisms, information about these attractor regions. How this is done was one of the open questions.
The notion of a substructure, from which material is aggregated, is essential to how we formulate the theory behind the generative methodology. It is the entanglement of the consequences of those elements of substructure that are present locally that forms a key part of what types of systems develop, based on the disposition and natures of the basins of attraction that come to exist in the field manifold. And, to carry this model a bit further, the basins of attraction become the centers of attraction for specific phenomena, each having a complex nature. By complex we mean a system having a hidden endophysical reality and existing within an interacting ecosystem. The production of a formal theory has to address complexity carefully, for reasons that are explored in detail in Chapter 2.
In 1995, Prueitt's notion of system image was in reality little more than a metaphor, but one that was essential to what followed next. He realized then that naturally occurring compartments form from these mechanisms due to affordance (e.g., ecological need) over periods of specific duration. A great deal of Prueitt's (1985 – 1995) work is based on a study of the extensive work by Pribram. Pribram's work should be consulted for a more in-depth review of the scientific literature about human brain function. If the reader is lost in reading about Prueitt's concepts, it is best that the appropriate background in Pribram's work be mastered. Pribram indicated that the affordances in mental images of plans and behavior are as much a product of a "perceptional" affordance as of the environmental affordance. Thus the issue of anticipation and memory is opened to examination. The long discussions, in the years 1987 – 1995, between Pribram and Prueitt led Prueitt to understand not only the nuances of Pribram's philosophy of life and science, which were for Pribram the same thing, but also to have a sense of the physical grounding of human perception, awareness and cognition in physical reality. Pribram called this "scientific realism".
Prueitt examined the role that memory and anticipation have in the expression of human intelligence during the years 1987 – 2004. Human memory research and cognitive neuroscience were studied, as was the completion of a study on the mathematical models of neural and immune system function. It is perhaps expected that the limits of classical logic and physical science were found to be a restriction on his investigations. These limitations were understood by an examination of the works of the bio-mathematician and category theorist Robert Rosen. Progress in synthesis was made during an extended discussion with Peter Kugler, 1991 – 1994, on anticipatory mechanisms and limitations to formalism. Kugler had made a study of Rosen's work during his studies at the Beckman Institute (1986 – 1992). (Author's note: detail needed.)
For example, his examination identified extensive evidence that learning sometimes occurs without awareness, i.e., the clinically investigated phenomenon of blindsight. A technical discussion of the blindsight research literature is deferred. The point of this chapter is that many layers of scientific evidence indicate that the internal image of self creates mechanisms that do in fact respect the conservation laws known to physical science, but impose additional constraints in the exercise of awareness.
The mechanisms operate during the emergence of forms. Substructural constraints are conformed to specific regularly existing forms, or patterns. The patterns are recognized by the system of systems as a function needed in accordance with affordances. These affordances are not merely defined by the conservation laws, but are also defined by constraints imposed from the top down, i.e., from the collection of system images acting together as part of the system of systems. The top-down constraints shape the formation into one of the patterns that fits within the anticipatory constraints imposed by the system of systems.
The notion of ecological affordance was coined by J. J. Gibson (1979) to refer to structural invariance as perceived within the regular flow of information, organized as energy fluctuations in the retinal layer and interior structures of the optic system. Initially this concept was oriented, by Gibson's behavioral training, toward the external world. In Prueitt's theory of process compartments, Gibson's affordance notion is extended into a theory of systems within systems, or a system of systems.
System emergence is controlled under a non-local coupling expressed by the two complementary notions of "external affordance" and "internal affordance." A proper discussion can be made regarding this internal/external difference by using the endophysics and exophysics language introduced by Rössler. An extended discussion on this was had with Pribram over the period 1987 – 1995.
Affordance is a more generalized notion than the notion of entailment. A specific affordance allows other entailments to become realized in a way that also fulfills the affordance. The doorway affords walking through. The physical conservation laws are fulfilled whether one walks through the door or not. However, the affordance allowed by the door is fulfilled when intentionality, choice, leads one to walk through the door.
The complementary notions of internal and external affordance are not separable in real existence. They are notions that are abstract in nature. In formal modeling the representations of affordance and entailment often become separated and lifeless. The induction of the formalism leaves a great deal of information behind. The formalism is an abstraction, and in this case only roughly captures the essence of the real-world phenomenon. This is to be compensated for as we try to understand how middle-world events are supported by the endo-physics of those real things, complex systems with internal and external realities, as they are manipulated by the common exo-physics.
These considerations led Prueitt to hypothesize about how machine encoding of invariant data patterns might follow the "architecture" of the human brain in a way that is not expected from the neural network literature. He thought that the memory mechanisms and the anticipatory mechanisms might be developed separately as data structures encoded as n-aries, i.e., in the form

< r, a(1), a(2), . . . , a(n) >.
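As a minimal sketch, assuming a content-addressed ("key-less") table of our own design rather than the encoding given in Appendix B, such an n-ary can be represented as a relation symbol r followed by its arguments:

    from dataclasses import dataclass
    from typing import Tuple

    @dataclass(frozen=True)
    class NAry:
        r: str                 # relation symbol
        args: Tuple[str, ...]  # a(1) ... a(n)

    class KeylessTable:
        """Content-addressed store: the slot is derived from the datum itself."""
        def __init__(self, size=1024):
            self.slots = [[] for _ in range(size)]

        def _slot(self, item):
            return hash((item.r, item.args)) % len(self.slots)

        def put(self, item):
            bucket = self.slots[self._slot(item)]
            if item not in bucket:
                bucket.append(item)

        def contains(self, item):
            return item in self.slots[self._slot(item)]

    table = KeylessTable()
    table.put(NAry("affords", ("doorway", "walking-through")))
    print(table.contains(NAry("affords", ("doorway", "walking-through"))))  # True

The point of the sketch is only that no externally assigned key is needed: the structural content of the n-ary determines where it lives.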
Beginning in 1994, he began to believe that formalisms for the
endo-physics and the exo-physics might be separately developed and then
combined properly. As the combination
occurs in the present moment, as in the entanglement of memory and
anticipation, a just-in-time differential formalism might give a good estimate
about the characteristics of self-expression that we know as
“intelligence”. Such formalism was
found in 2003. The description of
differential and formative ontology (Prueitt, 2003) will be addressed in later
chapters.
The analytic position is not easy.
For example, we can think of the system image as the collection of all
affordances, and yet the notion of collection is somehow not correct. We deal
with the paradox not only of the set of all sets, but we bring to the
discussion almost all classical paradoxes.
The system image should be formally represented as the interface
between an endophysics of the internal causes of the compartment (its atoms)
and the exophysics of the environmental demands. But an analytic treatment has been difficult to find. Very few academics have thought out the
issue of stratification and entanglement in this way, although many have agreed
with Prueitt that the differential and formative ontology has interesting
applications in information science.
The academy has not been accepting of the changes that are required in mathematics and in science. The interface relationship was treated in V. Finn's work on quasi-axiomatic theory (see Chapter 2, Section 2), but the viewpoint is so deeply grounded in systems theory and complexity that the relationship, as expressed in his work, has not yet been received within the American scientific community.
It is important, for historical reasons, to view system orientation toward temporal invariance as an intertwined relationship between the observation and the observed. In fact, this temporal relationship is seen as a fundamental one in the development of the perception of objects.
(Author’s
note: In a final draft we will add references here and some discussion of
memory and perception research.)
One can imagine that adjusting the relative phase between energy spectral patterns in the optic flow delineates temporal invariance in representational mechanisms [13]. Levine and Prueitt (1989, 1993) described, using Hilbert mathematics, non-specific arousal as an attentional reset mechanism. How else could one explain visual perception? Visual perception is in the now, without recent past images in the perceptual field, and yet how we see is subject to what we have seen in the past.
Furthermore, one can assume that the delineation of temporal invariance makes it possible for memory stores to encode patterns separately from a continuous recording of the full experience of events. This set of invariances, the elements of which we call memory artifacts, has an average duration and thus must be the consequence of some type of temporal compression. Pribram returns to the discussion of Gabor functions at this point in order to create a convolution over some extent of space and matter (Bruce McLennon, 1993). As one can see from the material in Appendix B, the convolution plays the essential role in processing data structures encoded in the key-less hash table. The convolution acts to combine elements that are similar or that have some binding that requires elements to be placed together. Similarity is a function of many things, all or most of which are developed as categorical abstraction and expressed in the "n"-ary data structure. Even with convolution operators, this technology is far simpler than the W3C type of web ontology, using the ontology web language and ontology inference layer standards.
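A minimal sketch of the combining role a convolution might play; the Gabor-style kernel and the toy activation trace are our own illustrative assumptions, not the Appendix B encoding:

    import numpy as np

    def gabor_window(width=21, sigma=4.0, freq=0.25):
        """Gabor function: a Gaussian-windowed sinusoid."""
        t = np.arange(width) - width // 2
        return np.exp(-t**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * t)

    def combine(encoded, kernel):
        """Convolution binds neighboring, similar activations into one response."""
        return np.convolve(encoded, kernel, mode="same")

    trace = np.zeros(100)          # hypothetical activation over table slots
    trace[[30, 32, 70]] = 1.0      # two nearby events and one distant event
    response = combine(trace, gabor_window())
    print(np.round(response[28:35], 2))  # the nearby events overlap in the response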
If the combining process is not cross-scale, then we have an instance of categorical abstraction (cA). If the combining is an aggregation into a whole that has functional properties at a higher level of organization, then this is an instance of event chemistry (eC). The difference between cA and eC is in the perspective one takes. Categorical abstraction has no notion of coherence, only localized facts. Event chemistry, on the other hand, must express strong coherence even if the field's basins of attraction, and environmental affordance, are not a single coherent whole. Abstraction involves the reduction of information. Events add information about functional properties and consequences arising from the pragmatics of a specific situation.
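The contrast can be sketched in code; the functions and the water example are our own illustrative assumptions:

    def categorical_abstraction(items, key):
        """cA: reduce information -- similar items collapse to shared labels."""
        return {key(item) for item in items}

    def event_chemistry(items, emergent_function):
        """eC: aggregate into a whole that gains a higher-level functional property."""
        whole = tuple(items)
        return {"parts": whole, "function": emergent_function(whole)}

    molecules = ["H", "H", "O"]
    print(categorical_abstraction(molecules, key=lambda m: m))  # {'H', 'O'}: detail lost
    print(event_chemistry(molecules,
                          lambda w: "solvent" if sorted(w) == ["H", "H", "O"] else None))
    # {'parts': ('H', 'H', 'O'), 'function': 'solvent'}: information added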
The compression of aspects of events into natural category makes use of what mathematicians call the law of large numbers. The artifacts are thus statistical in nature, since multiple occurrences must be part of experience before these artifacts can be encoded as a memory substrate. Many open questions are revealed within this discussion.
The formation of natural category is bound together with a dual process. The emergence of events imposes on category formation the situational reality, the distribution of substance and the "other" processes surrounding that specific event. Like the mechanisms involved in natural language formation and use, natural categories are shaped by the reality into which the categories are expressed. As a natural category forms, the nature of the category is tested, and it tests the re-expression at a future time. The emergence of a specific event, however, is far from a purely statistical phenomenon. In 2002, Prueitt developed categorical abstraction and event chemistry based on the above remarks.
In our revision of the author's original conference publication (Prueitt, 1995), we advance an argument originally given in Prueitt (1997; see also Chapter 3). The argument is that the human memory store is built up in a metabolic substrate that is composed of protein conformational states and metabolic reaction circuits. This argument creates an alternative to the conjecture that the human memory store is in the connectivity of neurons, as is commonly asserted. Our viewpoint follows from the scholarship of Pribram (1971; 1991), and others in the fields of neuropsychology and systems theory. In this viewpoint, the memory store is composed of radiant-energy-guided conformational circuits at several levels of process organization: metabolic, protein conformation, and quantum. We will also see, in the author's interpretation of Russian quasi-axiomatic theory, that a tri-level computational architecture more completely models the relationship between experience, memory and behavior than do models of all-or-none neural computation in neural networks. This interpretation suggests how one might eventually produce machine intelligence that interacts with the natural world in ways not accommodated by the silicon processors of the early part of the twenty-first century.
According to the work presented in this volume, it is to be believed that, in all natural intelligence, a primitive measurement of the environment, without memory, occurs through a generic mechanism. The mechanism must be available to metabolic repair cycles expressed in living processes. Cell signal pathway expression is one example of a science that must acknowledge the interplay between non-locality and locality. A review of this literature is encouraged, but we will not take the time to do this review now. We conjecture that, as a generic fact, a measurement mechanism is derived from "opponency induced symmetry". There is evidence from many sources that opponency mechanisms are responsible for many of the processes that induce mental imaging. In particular, following Penrose, we suggest that a process that cannot be modeled as an algorithm is involved in the emergence. This process is called "self-orchestrated collapse". Again, the literature here is extensive and diffuse, so we will not go into this as fully as might be undertaken.
During the induction of a mental state, additional constraints are imposed by the nature of the metabolic processes that are occurring as one part of the causal substrate for the images that form. The image is thus the natural emergent compartment through which a specific function needed by the biological system is achieved. The contents of the image are "informed" through memory and the sensory experience and enfolded into a mental event. Through a delay in symmetry induction the system becomes capable of introspection and awareness (Pribram, personal communication).
In stratified theory, there are always at least two perspectives. From the one perspective, the total energy of a system is concentrated at a point in time-space; from the other perspective, the system is responding to the distribution of energy over a longer period of time, as encoded in a frequency spectrum. Locality and non-locality co-exist as a principle of nature, not only in particle physics.
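The two perspectives can be illustrated with a discrete Fourier transform; the toy signal is our own assumption. Parseval's relation shows that the point-in-time description and the frequency-spectrum description carry the same total energy:

    import numpy as np

    # local perspective: energy as samples at points in time
    t = np.linspace(0, 1, 256, endpoint=False)
    signal = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 12 * t)

    # non-local perspective: the same energy described as a frequency spectrum
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(256, d=1 / 256)
    print(freqs[np.argsort(np.abs(spectrum))[-2:]])  # dominant components: 12 and 5 Hz

    # Parseval: total energy is identical in both descriptions
    energy_time = np.sum(signal**2)
    energy_freq = (np.abs(spectrum[0])**2
                   + 2 * np.sum(np.abs(spectrum[1:-1])**2)
                   + np.abs(spectrum[-1])**2) / 256
    print(round(energy_time, 6), round(energy_freq, 6))  # equal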
Because of the time scale, one can assume that the coupling of substructural processes occurs at the micro-process level, resulting in discordant interaction and a reshaping of the energy manifold at the macro level. The difference in frequency expression between micro and macro processes is, we assume, what is measured by the orienting nature of human awareness. Of course this has been a very difficult problem.
The measurement process is not yet awareness, but rather an attentional phenomenon that is more than the mere reaction rates of metabolic process (as Grossberg's embedding field theory assumes [14]). Understanding the differences between awareness and measurement occupied the author's time over the past fifteen years, particularly in my many wonderful conversations with Peter Kugler. Over this period of years, Kugler's focus on the measurement problem changed. His initial focus was on developing mechanical systems that one might hope would simulate the measurement of the environment by a biological system (Kugler, 1987). Starting in the mid-1990s he began to focus more and more on a deep analysis of what perceptual measurement IS NOT, and then on an analysis of the nature of science, logic and art (unpublished class notes). The nature of this book is deeply enriched by Dr. Kugler's friendship. However, our reflections about the nature of perceptual measurement remain unfinished.
Although theoretical issues are often discussed in the chapters of this book, we will be most concerned with the proper scientific and formal grounding of computational Knowledge Management (KM) technology, or what we now call knowledge technology. This grounding is made in the framework of a tri-level architecture in the early chapters, and then extended to the generative methodology for producing structural ontology. So we will return to this theme often.
We will say at the outset that the tri-level architecture is a convenient myth that Prueitt created. He did this on his own. He used this myth to create the basis for a new information technology. As myths go, this one is certainly more credible than the theories of artificial intelligence. Knowing that the tri-level architecture is a convenient myth is important. The tri-level architecture only assumes that the measurement problem, and the consequences of measurement, e.g., representation, are sufficiently managed so that some new economic value from knowledge management using the tri-level architecture can be demonstrated. Perhaps one rationale for discussion of the measurement problem is to remind the reader that a proper solution to the problem of representation is not often easy. The scientific basis is not listened to, due to the power of scientific reductionism. However, if a technology were to be finished based on the tri-level architecture, it might be that a scientific re-vision would follow.
Natural science knows a great deal about measurement and representation in a perceptional system; for example, as indicated in the books by Pribram and his colleagues. We look at neuropsychology through the eyes of his work, because Pribram is the only neuroscientist who is able to claim a theory of the whole brain, except perhaps Changeux, Freeman, Edelman or Grossberg. From our interpretation of Pribram's work we feel able to ground the tri-level architecture in experimental neuroscience and in what Pribram, and many of his colleagues, refer to as quantum neuropsychology. Pribram's work suggests to us that "perceptional measurement" results in a cross-level transformation of energy that is subject to some class of patterns that depend on the co-occurrence of energy distributions.
During a course taught by Pribram at Georgetown University (Spring, 1999), he described a series of three sets of Fourier – inverse Fourier transforms of the optic flow. The process starts with an inverse Fourier transform at the retinal lens. Here the scattered light of the environment is focused into a retinal process that builds an energy manifold through an action that is modeled with a forward Fourier transform. The physical processes in the brain are not as simple as the Fourier transform that is involved in the development of a hologram with a glass lens. The retinal process is a metabolic process that has complex protein conformational reaction circuits. Note that quantum mechanical processes are involved in the absorption of photons by rhodopsin molecules in the retina, and that this single class of events must be the gateway events in the metabolic circuits that produce a single distributed manifold and coherent awareness. The rhodopsin molecules have two metastable states, a high-energy and a low-energy state. The high-energy state contributes to a field potential, and when sufficient molecules have been pushed into this field potential then the field itself can be taken up by innervating dendrites projecting from the Lateral Geniculate Nucleus.
The resulting energy manifold is sampled by the axonal dendrites of the Lateral Geniculate Nucleus (LGN). The LGN is halfway between the eyes and the visual cortex. The neurons of the LGN provide a sequence of non-linear processing en route to a re-spreading of energy/information into the layers of the cortex. This spread is the second Fourier transform and becomes linear as the information, including timing information, is encoded in the frequency spectrum. The reason why it is a Fourier transform is the underlying physics of lenses and the species' (Darwinian) need for effortless data fusion. In the linear spectral domain, data fusion is merely the concurrence of energy fields. The process control mechanisms merely need to push the two energy fields together. Nature finds an exceedingly simple solution to the difficult problem of data fusion.
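The claim about effortless fusion can be sketched numerically: because the Fourier transform is linear, superposing two fields in the spectral domain is identical to adding them directly, so fusion reduces to mere concurrence. The toy fields are our own assumptions:

    import numpy as np

    rng = np.random.default_rng(1)
    field_a = rng.random(128)  # hypothetical energy field from one source
    field_b = rng.random(128)  # hypothetical energy field from another source

    # fusion in the spectral domain: simply add the two spectra
    fused_spectrum = np.fft.fft(field_a) + np.fft.fft(field_b)

    # by linearity, this equals the spectrum of the summed fields
    direct_spectrum = np.fft.fft(field_a + field_b)
    print(np.allclose(fused_spectrum, direct_spectrum))  # True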
The third linear transform produces object consistencies that we perceive as objects in the world. This third transform occurs over a period of time and involves movement in space-time. The author interprets the neuropsychology to mean that the third transform acquires code from metabolic processes occurring in the limbic system and in the association cortex, and spreads this code across many brain regions. Code selection involves the induction of a set of basis elements for the Fourier, or Fourier-like, spectrum (as channels or dimensions in phase space) and a phase value from the underlying metabolic circuits. The physical processes that we model as a forward transform distribute the results of visual processing and integrate context and pragmatics into the field. The field then must be sampled by one more inverse transform from the spectral domain to produce a specific recognition of object invariance. The nature of the pragmatic axis is perhaps the most difficult to model, but for our purposes we say that the pragmatic axis is that which exists only in the present moment.
It is important to note that the model having linear transforms may actually be refined and made better by a model having a non-commutative composition of many non-linear transforms. Each of these transforms is "cross scale", and involves the measurement of "beable"-like phenomena (the term is taken from John S. Bell's work on non-locality in quantum mechanics) from substructure. Modeling these processes really takes us beyond the "normal" mathematics, what the mathematician calls Hilbert mathematics. The question that the author has been concerned with is how to extend or modify Hilbert mathematics. The stratified theory has been only a partial answer to this question.
The reader will see a stratified architecture expressed in the logic that is presented in later chapters of this book, in particular in the chapter on Mill's logic. The stratified viewpoint is not a mainstream viewpoint. In fact, mathematics could be applied quite differently than it is if other viewpoints were common. For example, most mathematicians are trained to see linear transforms as merely approximations to the more important non-linear transforms. Our analysis of the neuroscience suggests that linear is desirable, in a Darwinian sense, because of the efficiencies of linear processing in the spectral domain. Thus the conjecture that the linear process is the aggregated result of many non-linear processes, if true, would be counter to the standard viewpoint. There are many other issues that neuroscientists have with classical mathematics.
The reader may wish to reflect on the earlier discussion about coherence and non-locality. Coherence and linearity are made from the same cloth. Non-linearity may be suggesting that "other" systems are close by and are interfering with the coherence locally. Linearity and non-linearity may both be a function of localization, whereas non-locality may not be describable in terms of ordering. This fact, or observation, may be the key to convolutional processing of Ontology referential bases (see Appendix B). Clearly these issues reveal important open questions.
A case can be made that the physical mechanisms that allow human induction delay the otherwise regular propagation of radiant energy (see Eccles, 1993). Perhaps it is also true that, in highly intentional systems, the duration of emergence is extended to allow the self-image to interact more strongly during the formation of a symmetric barrier to action. Eccles claimed that induced symmetry is globally linked via some holistic mechanism to produce a mind-body interface and to thus establish the means for spiritual awareness by biological systems. He argued that human synaptic structures have uniquely human characteristics that intensify this possible spiritual awareness [15]. The mechanisms also produce cognitive ability far in excess of the non-human animal, through a multiple-level transformation of energy distribution. In this way, plans and goals are incorporated into the substance of the resulting mental compartment. This delay should be empirically observable as irregularities in the processing of regular metabolic activities occurring in the neurons and associated glia. So the case being made can be tested in a rigorous fashion.
Perhaps it is important to note that the framing of the issue of dynamics in terms of linear and non-linear might be less relevant than the mainstream supposes to the ultimate goal of understanding what we do not now understand about perceptual processes in complex systems. The cross-scale process is neither linear nor non-linear, because the definition of a linear or non-linear system requires that a set of observables has been nominated before a formal mathematical problem is set up. The cross-scale processes have no known formal representation.
Measurement and representation are not the only two classes of difficult investigations. In the previous section we have outlined an interpretation of part of the vast neuroscience and mathematics literatures. In this interpretation we find evidence that the physical stratification of processes that we do in fact find in physics is inherited by the processes that occur in the brain. The stratification suggests that an understanding of emergence is essential to understanding the nature of perception and cognition.
We also need to talk about, and have some appreciation of, the notion of system image. Our study of cognitive processes is undertaken in a quest for new information technology that is different from what we know today. In this study of cognitive processes, we see the system image as an assembly of coherent associations, made from the spectral domain of energy frequencies, into those metabolic processes that are available at the moment the awareness occurs. We conjectured that the association induces a non-localized action during an initial period of energy compartment formation. This non-localized action is treated in our theory as coming from a "system image". An autonomous core is the origin of a periodic expression on several levels of organization. The expression of this core is the system image. The system image shapes the process of emergence through synchronization of sub-processes. Of course, a correlation between induction and the introspection of awareness, as well as intentional control, might also be found in the core. These mechanisms are cross-scale in nature and thus highly speculative, but nevertheless are reflected in the tri-level architecture.
In 1995, Prueitt conjectured that the nature of the self-image is not fully understood by anyone, but one can easily see its role in the formation of very different perceptions of the same world by different people. One might conjecture that compartmental emergence takes place concurrently with perceptual measurement and results in the formation, or movement, of initial conditions for actual metabolic processes in the brain system. The metabolic processes would be composed of available reactants that are called on, by process gradients, to fulfill the image of self. In this picture of self, we again see the three levels: the substructure; the compartment and other compartments that interact with each other; and the intervening set of environmental affordances that reflects how a compartment behaves in its environment.
The self-image "sits over at the side" and represents the
knowledge of the world as well as autonomous intentionality and whole system
continuity. We can observe evidence that living systems have knowledge and
intentionality, he argued. And even non-living systems, if such a term makes
sense, have continuity. However, this image may or may not be observable.
Likely it is not observable, except to the system that it is the core of. Nor
is self-image a statistical artifact. Each image is unique and the uniqueness
is also likely to be the partial cause of its influence on other processes. Its
expression is mixed with the assembly of substructural elements and the
constraints of ultrastructure, but it seems to have an "other-than"
status. A language of self-image expression might be developed through the
study of cell morphology and physical ontology, but the methodology for the
discovery of the full nature of cross scale phenomena is likely to quite
different that discovery methodologies that can be successfully employed
regarding the substructure or the ultrastructure. Self-image is neither the
aggregation of elements from substructure nor the constraint of ultrastructure,
nor is it proper to identify it as the autonomous whole that we observe as an
element of the middle world. Each is
unique. We understand self image only
informally and only with a great deal of knowledge and wisdom.
As we consider the essence of object recognition, we can assume that the problem of object representation has been solved in some way, or in some partial sense. Using the tri-level architecture, recognition of things can refer to things in any one of the three levels. In each case a different part of the representation problem has to be solved. The architecture is based on a metaphor between these things and the neuropsychology cited. Naive recognition, the recognition of things in the world by a perceptional process, occurs when memory subfeatures combinatorially express to produce features, invariant sets, of the compartment that supports the mental awareness. This expression results in the initialization of boundary conditions (initial conditions) within the compartment, and the compartment supports an energy-spectrum phase-coherent mental event. This naive recognition does not always require a symbol, and in fact the use of a symbol may reduce the awareness artificially to recognition of some aspect of ultrastructure or some element of substructure.
It is likely that two different symbol systems are necessary to produce a separate awareness of substructure, and to make a clear distinction between what is ultrastructure and what is substructure. Would such formalism be regarded as "mathematics", and the observations made using this formalism as "science"? It is certainly the case that science works through a process of "encoding" empirical observations into notation. Symbol systems and a common interpretation of the meaning of the symbols are the basis for the development of science. But Prigogine [16] and others question the possibility of having a formal system reveal a complete description of the processes that support emergence and the emergent consequences.
The tri-level architecture suggests that separate symbol systems can be developed, one for substructure and the other for ultrastructure. One would think, after reviewing the architecture, that two separate sciences, each based on quite different principles, might evolve in the near future. The two objects of investigation are quite different, in that the substructure is statistical in nature and the ultrastructure is categorical in nature. It is perhaps an accident of history that statistical artifacts have been more developed than categorical artifacts. We say this because category science should be as useful as statistics, but for reasons of cultural expression statistical science seems to have been developed first. As a result of this maturity, statistical systems give us the best possibility for verification of some aspects of the tri-level architecture. For example, the effect of cross-scale movement of initial conditions should be observed as discontinuous jumps in the predicted path of metabolic reactions.
These jumps in the measure of metabolic expression should, in theory, produce sequences of discrete middle-world events, modeled by event chains in which the notion that one event causes the consequent event is incomplete. The effect might be seen during the characteristic reaction of ATP energy conversion resulting in metastable state alterations in protein. Again, conjectures of this type can be tested in a rigorous fashion. Discontinuity in neural processing of a stimulus signal is discussed in Chapter 4.
Representation of substructure can be via statistical artifacts, but can never be perfect because, in classical statistics, categories are formed based on similarities and not dissimilarities. The Russian work in quasi-axiomatic theory does make use of "negative knowledge", but this use of negative knowledge leads to complications that have not been resolved by anyone. If both similarity and dissimilarity were completely accounted for, then each event might be mapped to exactly one category and there would be too many categories to make sense of the world. Practical consideration leads us away from David Hume's enigma regarding the ability to know the world. The world has not been made sense of in this way.
The notion of an emergent energy landscape, or manifold, is central to understanding Prueitt's notion of a process compartment. In the next section we will use simplistic mathematical models as a means to illustrate this notion. The model does not have a counterpart to the notion of self, or system, image, and is thus bi-level in nature. The mathematical model is briefly introduced here and then extended in the later part of Chapter 2.
This section considers a simple mathematical framework for studying
emergence.
Consider a small or large set of coupled oscillators:
dj_i/dt = w_i + SUM_k ( c_ik G(j_k) ),

(with j_i the phase of oscillator i, w_i its intrinsic (constant) oscillation rate, c_ik the coupling, and G any non-linear function), having various types of network connections (architectures) and initial conditions. The architecture would be expressed in coupling that may be positive or negative. The coupling may also be variable and reflect certain regular features of the circuit dynamics of metabolic reactions.
These systems are observed to partially or fully synchronize the relative phase of individual oscillations. We assume that the system develops systemic and regular behavior that acts as a partial control over the values that the coupling takes on. This control is from the higher level (of organization), of the two levels, to the lower level.
In some cases the resulting phase locking between oscillators is easily seen in the trajectories of the very simple dissipative system

H(j, dj/dt) = (1/2) m (dj/dt)^2 + (1/2) k j^2,

where k and m are constants, j is phase (of internal to external expression), and t is time.
In this simple case, the intrinsic oscillation of each trajectory can be mapped to closed loops (circles) on the surface of the manifold described by the above equation as an (n + 1)-dimensional paraboloid. These loops then form the basis for the oscillators as seen from one additional level of organization. The formalism would appear to imply the possibility of an infinite number of levels, each created only from the entanglement of a single substructure. When n = 1 this is the unforced, undamped pendulum. When n > 1, and there is a coupling term between the oscillators, then the oscillators are merely coupled harmonic pendulums.
The harmonic case is simple. Consider a system of rotators that is fully connected, with the connection strengths constant and the initial phase conditions evenly distributed around a circle. The oscillators will phase lock into their initial intrinsic oscillations, since connectivity averages out over the entire architecture. Some computer simulations are sufficient to bear this out.
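A minimal simulation sketch of this claim, under our own parameter assumptions and taking G to be sinusoidal (as in the standard Kuramoto model):

    import numpy as np

    def simulate(n=32, c=0.5, w=1.0, steps=5000, dt=0.01):
        """Fully connected rotators: dj_i/dt = w + (c/n) * SUM_k sin(j_k - j_i)."""
        j = np.linspace(0, 2 * np.pi, n, endpoint=False)  # phases evenly spread
        for _ in range(steps):
            coupling = (c / n) * np.sin(j[None, :] - j[:, None]).sum(axis=1)
            j = j + dt * (w + coupling)
        return j

    final = simulate()
    # with an even initial spread the couplings cancel, so each rotator keeps its
    # intrinsic rotation and the relative phases remain evenly distributed
    gaps = np.diff(np.sort(final % (2 * np.pi)))
    print(np.allclose(gaps, gaps[0]))  # True: even spacing preserved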
Harmonious interaction occurs when the intrinsic oscillations and the coupling walk in tandem. Discordant interaction between subsystems is correlative to opponency-based symmetry induction and the consequent formation of a new compartment, in the form of a basin of attraction in the energy manifold. In either case, the interaction occurs through excitation, inhibition and indirect, or allosteric, causes from a substrate consisting of internal-to-external expressing phase systems. The excitatory or inhibitory causes are often expressed as a field effect, whereas allosteric effects are propagated in a cascade of micro-events that are part of well-established metabolic circuits. This is particularly well illustrated in the cell membrane.
Many of the open problems in fields related to situational analysis, control theory and medical/psychological/ecological therapy are related to a systemic response to discordant interaction. Thus, it is important to have the class of formal systems described above produce logical (formal) compartments that reflect natural mechanism. A computing environment appropriate to the simulation of cross-scale phase locking allows us to create conditions whereby the emerging manifold has strange attractors, as discussed by Kowalski et al (1989; 1991) [17] and others. As important for semiotics, the creation of such a manifold can be observed directly in the presence of symmetry-breaking "seeds" derived jointly from system subfeatural competence, i.e., the ability of subfeatures when combined, and system image. The phenomenon of symmetry collapse is where interpretation can be placed, and where a second-order cybernetic system (a system image controlling induction) is needed (Chapter 3, Section 5).
The induction of symmetry is indicated by several schools of scholarship as a mechanism arising from specific molecular structures and neuroanatomical features. Symmetry induction is explicit in the Adi structural ontology (Adi and Prueitt, 2004) as well as in the principle of Human-centric Information Production (HIP) developed by Prueitt (2001 – 2004). Adi's observations regarding an ontological substructure to language were quite independently developed and expressed in a 1986 patent. The design principles of HIP are discussed further in later chapters.
For the purposes of computational analysis, symmetry in these systems can be broken in several ways: (1) un-evening the distribution of initial conditions for the phases j_i, (2) un-evening the connection strengths c_ik, and (3) time-varying the intrinsic rotation rates w_i of individual oscillators (a sketch of these three knobs appears below). These three conditioning factors of the suggested class of simulations enable a theory of programming for the tri-level architecture. This theory of programming has yet to be capitalized on. We conjecture that a derived theory of programming can be incorporated as a new computing technology based on optical computing and Knowledge Management. The key to this new information technology is the new OASIS standards on service blueprints and human choice points [18].
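A sketch of the three symmetry-breaking knobs, continuing the oscillator model above with our own arbitrary parameter choices:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 32

    # (1) un-even initial phases: clustered rather than evenly spread
    j = rng.normal(loc=np.pi, scale=0.3, size=n)

    # (2) un-even connection strengths: mixed positive and negative couplings
    c = rng.normal(loc=0.5, scale=0.2, size=(n, n))

    # (3) time-varying intrinsic rotation rates
    def w(t):
        return 1.0 + 0.1 * np.sin(0.01 * t + np.arange(n))

    dt = 0.01
    for step in range(5000):
        coupling = (np.sin(j[None, :] - j[:, None]) * c).sum(axis=1) / n
        j = j + dt * (w(step * dt) + coupling)

    order = np.abs(np.exp(1j * j).mean())  # 1.0 = fully locked, near 0 = spread out
    print(round(float(order), 3))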
In the physical theory, energy symmetry is broken while opponent-based induction measures informational invariance in an incoming data stream. The opponent processing is bottom-up, from memory stores, and top-down, from category policies. In the tri-level architecture, a voting procedure manages this process in a way that seems to be grounded in the interpretations of human memory given in Chapter 3. Within the frame of neuropsychology, a change in memory access and category policy might correspond to a shift in attentional focus. The resulting process compartment has a manifold with invariant sets that reflect both input data invariance and the subfeatural combinatorics that result because a specific memory store is being composed within a specific category policy.
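As a rough sketch only, such a voting procedure might weigh bottom-up evidence from a memory store against top-down category-policy weights; the scoring rule and all names here are our own assumptions, not the procedure given in Chapter 3:

    def vote(features, memory_store, category_policies):
        """Combine bottom-up matches with top-down policy weights; pick a category."""
        tallies = {}
        for category, prototype in memory_store.items():
            bottom_up = len(features & prototype) / max(len(prototype), 1)
            top_down = category_policies.get(category, 0.0)
            tallies[category] = bottom_up * top_down
        return max(tallies, key=tallies.get), tallies

    memory_store = {"door": {"frame", "opening", "hinge"}, "window": {"frame", "glass"}}
    policies = {"door": 1.0, "window": 0.5}  # top-down bias from the current policy
    print(vote({"frame", "opening"}, memory_store, policies))
    # ('door', {'door': 0.67, 'window': 0.25})  (approximately)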
Initial conditions are created that complete patterns intrinsic to both the data stream and the subfeatural memory, and produce an internal representation to be read by a system downstream. The interpretation occurs in a context supplied within the physical processes occurring in the process stream, and is related both to the available category policies and to the measurement of pressure on the policies to change in accommodation to environmental non-stationarity.
There are two focuses to this book, the technology that leads to
structural ontology and anticipatory technology, and the science that we must
ground this new technical within. The author already suspected this
architecture in 1995 when the first draft of this chapter was written. Now we
know that the architecture has a realization in a specific type of tri-level
machine intelligence that we will see developed and demonstrated in the final
chapters of this book. The architecture provides structured and stratified ontology, and anticipatory mechanisms. In the near future, we believe that many different computational system architectures will be modeled after the complex cross-scale phase locking that occurs with stable symmetry induction. A mutual induction will connect the computer's data mining capabilities with the human-centric information production that structural ontologies support. Pribram's theory of human perception, presented two sections above, is illustrative of the natural science that suggests this future.
PCH (statement): Temporal coherence is produced by systems that are
stratified into numerous levels and produce compartmentalized energy manifolds.
Connectionist models assume that a neural system builds internal
representations with geometrical and algebraic isomorphism to temporal spatial
invariance in the world. The PCH provides a common foundation for
investigations of internal representations of this type. It also provides a
means to study the linear and non-linear transforms that serve as formal models
of these representations. Models are
useful in our attempts to understand the interaction between independent
systems supporting these representations. The PCH assumes a stratified organization, with compartments emerging from substrata and with globally acting images providing top-down constraints in the form of rules and laws. In all
cases, compartments exist as embedded and transient complex subsystems of other
compartments. The PCH assumes an embedded and stratified organization whose
stability comes from substructure and whose plasticity comes from adaptive
categorization.
In theoretical constructs that model the PCH, compartmentalized processes,
or process compartments, are linked processes that are localized in space and
time. Their definition, in this way, shifts the ambiguity from the term "process compartment" to the two terms "process" and "localization," separating the description of a compartment into a localization in space and time and an autonomous evolution in time from initial conditions. This autonomous evolution is within an integrated whole. The
evolutionary control of micro-process is entailed by a non-local (holonomic)
influence. This control by a non-local constraint is what makes the tri-level
architecture quite different from connectionist neural networks. The tri-level
architecture is more like what is now being referred to as "evolutionary
programming".
A natural characteristic generic to all process compartments is that
compartments have a formation phase, a stability phase, and a dissolution
phase. The localization phases are reflected in an assembly and disassembly
phase in the tri-level architecture. The process phase is reflected in the
stability produced by the system image.
While stable, the control is more likely to have an internal origin.
During aggregation and dissolution, the origin of control is external and
internal.
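The mapping between compartment phases and the tri-level architecture stated in the last two paragraphs can be recorded in a few lines. The mapping itself is from the text; the encoding is merely illustrative.

from enum import Enum

class Phase(Enum):
    FORMATION = "assembly"        # reflected as the assembly phase
    STABILITY = "system image"    # stability produced by the system image
    DISSOLUTION = "disassembly"   # reflected as the disassembly phase

# Origin of control per phase, as stated in the text.
CONTROL_ORIGIN = {
    Phase.FORMATION: ("external", "internal"),
    Phase.STABILITY: ("internal",),
    Phase.DISSOLUTION: ("external", "internal"),
}

for phase in Phase:
    print(phase.name, "->", phase.value, CONTROL_ORIGIN[phase])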
The formation and dissolution phases have not been fully described in
the research literature, except as a singularity in the classical formulation
of its thermodynamics. Like our view of our own finite life, we have a tendency to assume that the stable phase of a compartment has existed since a distant past and will persist into a distant future. We can examine this assumption by reflection on the Minkowski time cone.
The classical time cone, called the Minkowski time cone, is formed by
intersecting two symmetric cones in such a way as to overlap only on a
non-empty set of dimension zero and aligning both cones to a single axis of
symmetry. Choose a direction for the flow of time and associate each line, that
is fully contained in the union of the cones and that contains the single point
of intersection, with the trajectory in state space of a set of observables for
a particular system. This geometric exercise produces a linear model of the
sequential nature of present moments, with a unique relationship between the
past and the future. This time cone assumes a Laplacian world. The Laplacian worldview assumes that future event structure is fully predictable from one set of initial conditions; such a world would be fully deterministic. It does not account for non-locality in space or time. There are no singularities within this notion, except at points of infinity.
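For concreteness, the construction just described can be written in standard notation, with the speed of light normalized to one:

C^{\pm} = \{\, (t, \mathbf{x}) \in \mathbb{R}^{1+3} \,:\, \pm t \ge 0,\ t^{2} - \lVert \mathbf{x} \rVert^{2} \ge 0 \,\}, \qquad C^{-} \cap C^{+} = \{(0, \mathbf{0})\}.

Any line contained in C = C^{-} \cup C^{+} and passing through the apex then serves as the trajectory of a set of observables, the lower cone holding its past and the upper cone its future.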
When a compartment is stable and its rules of evaluation are constant, we should expect an illusion within the compartment's endophysics that looks like the Minkowski time cone, even while the exophysics shows a birth or death event.
Under special circumstances, compartments develop an ability to perceive
outside of self and thus to interact in a meaningful fashion with other
compartments.
A compartment's stability phase is better understood than the formation
phase or the dissolution phase. Why is this, in spite of the clear evidence for
life or death events? The answer might already be suggested in the last several
paragraphs.
Stability itself enforces a single set of laws and these laws act, more
or less, in a classical fashion. The rules are not changing, or at least they
can be made sense of in a consistent fashion. Coherence establishes context.
The notion of consistency is here essential, and is related to coherence in
energy distributions - something that we will learn more about later on in this
book. We have already discussed, very briefly, how holonomic theory depends on phase coherence. The stability of the compartment comes from phase coherence similar to that which produces the phase coherence of light waves in lasers.
Conditions on the total energy, and the interchange between potential and kinetic energy, are clearly the primary constraints placed on all compartments during their stability phase. Beyond this energy conservation rule, many other general principles also apply to compartments. For example, the
intrinsic observables of a compartment embedded within a biological system will
reflect an "internal" representation of its degrees of freedom. In
living systems, this representation is not as simple as that of a system of
coupled pendulums, which are closed to interactions with the affordances of the
environment. As a general property, the sum of energy transfer in and out of a
compartment boundary remains almost constant, at least when measured over the
natural frequency of the compartment. So the complexity of a stable autonomous
system provides a barrier to understanding conditions and principles that lie
outside the autonomous system.
We feel that the issue of what stability is, in a specific case, can be
traced to the first causes that were present during the creation of the
compartment. We can conceive that the compartment itself has formed from an election of degrees of freedom, or observables, from some larger set of potential nominations. Once formed, the observables become that which we can see. We may think that only energy and matter are
involved in the formation of life, and know nothing about anything else. We
cannot see these other things, and we know that many mistakes were made by
primitive science regarding the unseen.
It is easy to adopt the notion that what we can directly see is all
that there is. However, what can be seen does not explain the world we live in.
Stuart Kauffman's model of autocatalytic sets demonstrates the emergence of specific patterns of activity that establish themselves when a parameter of average connectivity crosses a threshold (Kauffman, 1997). This work,
on emergent computing and evolutionary programming, establishes the potential
for a science of complex systems that accounts for the unseen.
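The threshold behavior can be illustrated with a toy model that is not Kauffman's chemistry but shares its combinatorial skeleton: in a random graph, a giant connected cluster emerges once the average connectivity crosses one edge per node. A minimal sketch:

import random
from collections import Counter

def giant_fraction(n, avg_degree, seed=0):
    # Fraction of nodes in the largest connected cluster of a random
    # graph whose edge probability targets the given average degree.
    rng = random.Random(seed)
    parent = list(range(n))

    def find(i):                      # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    p = avg_degree / (n - 1)
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                parent[find(i)] = find(j)

    sizes = Counter(find(i) for i in range(n))
    return max(sizes.values()) / n

for k in (0.5, 1.0, 1.5, 2.0):        # the threshold sits near degree 1
    print(k, round(giant_fraction(400, k), 2))

Below the threshold the largest cluster stays a small fraction of the whole; above it, a single cluster rapidly comes to dominate, which is the qualitative signature Kauffman exploits.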
We may hope the logic of common reasoning and the dynamical entailment
of initial conditions in a mechanical system are established by the same
formative mechanisms. From the combinatorial span of the consequences of these
mechanisms a process compartment defines a stable state space where all chains
of events must lie, or the compartment itself must die.
We may hope that the forces that shape emergence are consistent with
the processes that form some valid viewpoint within the mind of humans. Thus,
if our theory is deep enough and our power to observe clear enough, then
mankind may hope to develop situational logics and logical entities that allow
us to really get a grasp on the dynamics of any particular complex system, and
to represent this dynamic in a way that is consistent with human
reasoning.
In the tri-level knowledge management architecture, the stability in question has to do with the stability of interpretation of sign systems and information.
This interpretation involves the aggregation of exemplars into memory elements
as a means to relate sign systems to past examples of interpretations of
similar information (see Chapter 12). This is managed via voting procedures
(Appendix A) and Mill's logic (Chapter 6).
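As a sketch only (the actual procedures are given in Appendix A and Chapter 12), the aggregation of exemplars into memory elements can be pictured as a running centroid per element, with a new sign either absorbed by the nearest element, inheriting its past interpretation, or founding a new element. The names below are ours.

import numpy as np

class MemoryElement:
    # Running centroid over absorbed exemplars, plus the interpretation
    # that this memory element carries forward.
    def __init__(self, exemplar, interpretation):
        self.centroid = np.array(exemplar, dtype=float)
        self.count = 1
        self.interpretation = interpretation

    def absorb(self, exemplar):
        self.count += 1
        self.centroid += (np.asarray(exemplar, dtype=float) - self.centroid) / self.count

def interpret(sign, memory, threshold=0.8):
    # Relate a new sign to the nearest memory element (cosine similarity).
    # A close match is absorbed and inherits the element's interpretation;
    # otherwise a new element is founded with an unresolved interpretation.
    best, best_sim = None, -1.0
    for element in memory:
        sim = float(sign @ element.centroid) / (
            np.linalg.norm(sign) * np.linalg.norm(element.centroid))
        if sim > best_sim:
            best, best_sim = element, sim
    if best is not None and best_sim >= threshold:
        best.absorb(sign)
        return best.interpretation
    memory.append(MemoryElement(sign, "unresolved"))
    return "unresolved"

rng = np.random.default_rng(3)
memory = [MemoryElement(rng.standard_normal(6), "category-A")]
probe = memory[0].centroid + 0.05 * rng.standard_normal(6)
print(interpret(probe, memory))   # absorbed: prints "category-A"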
A mathematical model of dissipative systems, within a stratified relationship, is advanced in the next chapters. The author believes that this model is categorically invariant to Russian quasi-axiomatic theory and thus has implications for the text understanding and knowledge management portions of the later chapters of this book. The computer implementations hold some surprises, both because of the simplicity of the key innovation, the key-less hash table, and because of the dependencies between a new discrete formalism and continuum models of dynamical representations of human knowledge, as expressed in patterns of co-occurrence of terms. The new discrete formalism is called Ontology referential bases, or "Orbs". Differential and formative ontology is demonstrated, and illustrations are given as to how these theories and technologies will reshape information science.
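Note [1] describes the key-less hash table well enough to sketch it: each alphanumeric character is read as a base-64 digit, so a word becomes an integer, and the native ordering of the integers replaces the hash function. The digit alphabet below is our assumption for illustration; the Gruenwald patent's actual assignment may differ.

import bisect

# An assumed base-64 digit alphabet, for illustration only.
ALPHABET = ("0123456789"
            "ABCDEFGHIJKLMNOPQRSTUVWXYZ"
            "abcdefghijklmnopqrstuvwxyz"
            "_-")
DIGIT = {c: i for i, c in enumerate(ALPHABET)}

def encode(word):
    # Read the word as base-64 digits: its integer value *is* its key,
    # so no hash function is needed, and keys carry the integers' order.
    value = 0
    for ch in word:
        value = value * 64 + DIGIT[ch]
    return value

class KeylessTable:
    def __init__(self):
        self.keys, self.values = [], []    # kept sorted by encoded key

    def put(self, word, value):
        k = encode(word)
        i = bisect.bisect_left(self.keys, k)
        if i < len(self.keys) and self.keys[i] == k:
            self.values[i] = value
        else:
            self.keys.insert(i, k)
            self.values.insert(i, value)

    def get(self, word):
        k = encode(word)
        i = bisect.bisect_left(self.keys, k)
        if i < len(self.keys) and self.keys[i] == k:
            return self.values[i]
        raise KeyError(word)

table = KeylessTable()
table.put("cat", 1)
table.put("dog", 2)
print(table.get("cat"))    # 1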
What we get, in differential ontology, is a categorical invariance
between two formal systems, one discrete and computer implemented and the other
continuous and represented by Hilbert space mathematics. This work builds on one half of the author's 1988 PhD thesis, "Mathematical Models of Intelligence in Biological Systems". The claim is that work in one system will transfer to the other system. Whereas this transfer back and forth between formal analyses can be taxing to someone not aware of the basis for the transfer, it nevertheless may produce a computational implementation, as a background process, useful to distributed knowledge management. The homology between discrete and continuum models allows a direct path to encoding data into the electromagnetic spectrum.
Will this approach lead to a unified theory for complex systems? We think so. Much of the work in complex systems is incomplete. One way to complete the theory of process compartments is to combine the work on coupled oscillators (Hoppensteadt, 1986; Kowalski et al., 1988; 1992) and neural networks (Grossberg, 1985) with quantum neurodynamics (Pribram, 1973; 1991; 1993; 1994).
Moreover, the notion of categorical invariance between Russian quasi-axiomatic
theory and stratified dissipative system can be worked out quickly. This gives
us a direct path to refine the data mining processing in the tri-level
architecture in such a way as to ground Knowledge Management and Complexity
Science in both mathematics and experimental life sciences.
However, some central problems remain.
Compartments are embedded. When created, a compartment separates
phenomena that once had a direct interface. When a compartment has been
annihilated, two levels of phenomena that were separated now interact. Once
created, input to a compartment conforms to the new compartment's degrees of
freedom.
The result of transformations in energy distributions is modeled as the
action of a transformation on a vector. The computation of transform coefficients is presumed to be accomplished by fast adaptation of structural
constraints, like dendrite spine shape, protein conformation, or neurochemical
agents reflecting neurotransmitter concentration. This must be done with the
flexibility required to select from multiple (optional) responses in ambiguous
situations (see Pribram 1991, pp 264-265).
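A toy rendering of this paragraph, with illustrative names and an illustrative update rule of our choosing, would compute the transform's coefficients by a few fast gradient steps toward a structural constraint, and then keep an eligible set of responses rather than forcing a single winner:

import numpy as np

def respond(x, constraint, templates, lr=0.5, steps=5):
    # A transform T acts on the state vector x.  T's coefficients are
    # fast-adapted toward a structural constraint (a stand-in for dendrite
    # spine shape or protein conformation); then an eligible *set* of
    # responses is kept, reflecting flexibility in ambiguous situations.
    rng = np.random.default_rng(0)
    T = 0.1 * rng.standard_normal((constraint.size, x.size))
    for _ in range(steps):
        error = constraint - T @ x
        T += lr * np.outer(error, x) / (x @ x)   # gradient step toward constraint
    y = T @ x
    sims = [float(y @ t) / (np.linalg.norm(y) * np.linalg.norm(t)) for t in templates]
    eligible = [i for i, s in enumerate(sims) if s >= 0.7 * max(sims)]
    return y, eligible

x = np.array([1.0, 0.5, -0.2])
templates = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([1.0, 1.0])]
y, eligible = respond(x, np.array([0.7, 0.7]), templates)
print(y, eligible)   # y near the constraint; several responses stay eligible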
The ubiquitous phenomenon of response degeneracy demonstrates a deeper problem. At a conference hosted by Daniel Levine in
1991 on optimality, the author discussed the rationale for regarding response
degeneracy (see Edelman, 1987) as complementary
to optimality (Prueitt, 1997). We expressed
the view that the creation of viable options would be complementary to
optimality if all but a few selective response potentials are somehow held
ineligible for deterministic evolution toward a local basin of attraction. This
viewpoint provides the critical introduction of non-computational processes
into the theory of process compartments.
The challenge here is to extend and modify how science defines mathematics and how mathematics defines science.
The viewpoint expressed is properly complex, but not overly
complicated. By this, we claim a new viewpoint that accounts for the ontology
of physical processes in a minimal way. We find the optimal solutions to old problems by bypassing the very means by which these problems were defined.
Consider the mind-body problem.
As suggested by Sir John Eccles (1993), intentionality could be
expressed during brief non-linear restructural transitions of process
compartments. Eccles points out that the geometrical arrangement of synaptic
boutons supports a femto-second process controlling the release of the transmitter vesicle, mediated by Ca2+ influxes. A process selected through evolution to conserve neurotransmitter provides the control. The process that Eccles describes has the effect of creating a homogeneous probability
distribution measuring the likelihood that any one of six boutons will carry
the gradient field interchanges between the presynaptic and post-synaptic
events. Eccles sees this mechanism as the interface in which his notion of the
mind couples to his notion of brain by changing the probability and timing of
synaptic events.
Pribram and Eccles agree that synaptic events play a role in the
formation of network connections at a higher level of organization. At critical locations, even distribution
ratios are maintained while a distributed energy gradient increases to a high
level. Symmetry and increased gradients result in a barrier and thus in the
formation of a trigger. The trigger is released at the point where
self-organization can be effectively influenced (Eccles, 1993) by other
integrated process compartments. The trigger then closes the process that
Pribram identifies as a "temporal lens" and results in awareness.
Hameroff has identified similar symmetry generating mechanisms in the geometric structures of the microtubule assembly as well as in the temporal dynamics of microtubule formation. Microtubules play an important role in cell mitosis, and may control some aspects of the connectivity of neuronal ensembles through the fine alteration of dendrite arborization as well as by influencing second messenger cascades guiding long-term potentiation (Hameroff, 1987).
Pribram has investigated other mechanisms that might be shown to be the
active mechanism of extending non-locality effects, seen in Bell's
inequalities, to macro events involving ordered water and superconductivity in
conformational propagation of structure along the surface of neuron
membrane. Since slower processes are
required to conform to the oscillation frequency of the emergent process
manifold, the initial biases created during symmetry induction are structurally
reflected in a system image. The longer the symmetry is maintained the more
completely the system competence is sampled.
We have some work to do to bring the viewpoint together. For example, the notion of human self-image, although not well defined within a scientific literature (Bandura, 1978), can be used as a metaphor for a higher-order "agent" that mediates the formation
of compartments and shapes the compartments' evolution. The metaphor suggests
that compartments involved in the transformation of stimulus traces are shaped
by cross scale interaction like that modeled mathematically, as temporally
stratified coupled dissipative systems in section one of this chapter.
Transformational realism interprets neuro-psychological evidence in the
context of the process compartment hypotheses. This form of realism can be
stated in the following fashion. Simple combinatorics of system competence
express elements of a behavioral repertoire. The longer the symmetry is maintained, the more completely the consequences of action in the external world are sampled. The
result is a convolution of two scales of observation resulting in a new set of
observables. The convolution, however, can have a kernel that biases outcome
and thus differentially delineates the creation and initial conditions of
emergent compartments. This kernel is the system image.
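Read literally, the convolution claim can be illustrated with a toy computation: a fast-scale observable convolved against two slow kernels, one symmetric and one skewed, where the skewed kernel stands in for the biasing system image. The numbers are arbitrary; only the contrast matters.

import numpy as np

rng = np.random.default_rng(2)

# Fast-scale observable: a noisy oscillation standing in for expressed
# system competence.
t = np.linspace(0.0, 10.0, 500)
fast = np.sin(7.0 * t) + 0.2 * rng.standard_normal(t.size)

# Two slow kernels over the same window: symmetric (unbiased) versus
# skewed (a stand-in for the system image weighting some samples more).
window = np.linspace(0.0, 1.0, 50)
symmetric = np.exp(-((window - 0.5) ** 2) / 0.05)
skewed = np.exp(-((window - 0.8) ** 2) / 0.02)

for name, kernel in (("symmetric", symmetric), ("skewed", skewed)):
    out = np.convolve(fast, kernel / kernel.sum(), mode="valid")
    print(name, "mean:", round(float(out.mean()), 4),
          "peak:", round(float(out.max()), 4))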
Thus, the human self-image itself need not be a non-material mind/body interface as perceived by Eccles (1993). The image
merely occupies a band of energy distribution. The bands are separated by gaps
in the distribution. Event exchanges between energy levels could alter
probabilities during phase transitions of compartments whose existence is brief
when compared to the agent and thus have many of the same properties as
envisioned by Eccles. Of course, this type of interface may be more complex
than the simple cross scale interactions indicated by computer simulations of
dissipative systems.
Perhaps we could say that the primary distinction between a coupled
dissipative system without system-images and a system with self-images is that
self-image is a phenomenon that interacts intentionally in a probability space.
This intentionality could be in opposition to the local laws of dynamics. The
induction of a symmetric barrier to the expression of lawful processes would be
evidence that space/time non-locality is involved in the expression of human
self-image.
We have suggested that compartments form through the lens of a temporal coherence. These systems are stratified into numerous levels and produce compartmentalized energy manifolds (the PCH). Process compartments, and not merely
networks of neurons, are the prime candidates for the proximal causal
mechanisms producing behavior. This view is consistent with those expressed in
Changeux and Dehaene (1989):
"A given function (including a cognitive one) may be assigned to a
given level of organization and, in our view, can in no way be considered to be
autonomous. Functions obviously obey the laws of the underlying level but also
display, importantly, clear-cut dependence on the higher levels. At any level
of the nervous system, multiple feedback loops are known to create reentrant
mechanisms and to make possible higher-order regulations between the
levels" (pgs. 71-72).
At all levels, in anatomical regions and across time scales, generic
mechanisms appear to operate. Complex models of prefrontal cortex interaction
(Levine & Prueitt, 1989; Levine et
al., 1991) with other
cortical systems and with limbic systems, require a formal model of intentional
processes (Rosen, 1985; Kugler et
al., 1990) that rely on
these mechanisms.
The theory of process compartments provides a framework to integrate
models of processes operating at different time scales, as well as clarify the
natural role for structural constraints in signal production and interpretation
within and between levels of organization (Pattee, 1972). Stratified
processing within and between transient compartments can then be seen in
ecological terms.
From this theory we may ground the foundation of machine mediated
knowledge management in a science integrated from open complex systems theory,
neuropsychology and computational science.
[1] The key-less hash table is derived from the Gruenwald patent, whereby alphanumeric "letters" are treated as base-64 digits, and the ordering native to the integers is used as a means to eliminate the hash function. See Appendix B.
[2] Web Ontology Language (acronym “OWL”) is a standard produced by the organization W3C. URL: www.w3c.org.
[3] The key to understanding the differences between W3C and OASIS standards is an understanding of the phenomenon of coherence. As in the development of personal knowledge, the coherence of a viewpoint need not be rational when viewed from a different viewpoint. W3C establishes an extreme assertion that there is only one coherent viewpoint, and that is the one it creates by asserting a unique meaning for individual terms or phrases. In addition to this assertion of unique meaning, the W3C standards are directed to impose first order logics, description logics and determinism on any model of social or business processes. This assertion is consistent with scientific reductionism, but is inconsistent with the multi-cultural foundation to the American culture. The current struggle, in mid 2006, is to conform OASIS to the W3C view of natural process. This conformation is partially being achieved via web service modeling language standards proposed to OASIS by a number of industrial powers, including DERI in Germany. Web service modeling language makes a number of useful contributions, but it does not acknowledge the limitations of formalism, nor of the de facto control over information design being conveyed to knowledge engineers by these standards.
[4] Robert Rosen: URL: http://en.wikipedia.org/wiki/Robert_Rosen
[5] Hilbert spaces form a branch of modern mathematics: URL: http://en.wikipedia.org/wiki/Hilbert_space
[6] Prueitt, Paul S.: PhD (University of Texas at Arlington, 1988). Thesis on homology mappings modeling biological systems exhibiting learning, in particular plant and animal immune systems. The thesis had two parts; the other part was on neural models of selective attention, following the experimental work by Karl Pribram and using the embedding field theory produced by Stephen Grossberg.
[7] This definition of complexity follows the work on category theory and bio-mathematics developed by Robert Rosen. In this sense of the word, "complexity" cannot be "computational".
[8] Pribram, Karl. See the appendix to "Brain and Perception" (1991).
[9] See also Appendix B.
[10] See the work on gene and cell expression web ontology at www.biopax.org
[11] OASIS BCM URL: www.businesscentricmethodology.com
[12] Prueitt was appointed Research Professor in the Physics Department at Georgetown University from September 1991 to May 1993.
[13]
Levine, D. & Prueitt, P. S. (1989). Modeling Some Effects of Frontal Lobe Damage: Novelty and Perseveration. Neural Networks, 2, 103-116.
Levine, D.; Parks, R. & Prueitt, P. S. (1993). Methodological and Theoretical Issues in Neural Network Models of Frontal Cognitive Functions. International Journal of Neuroscience, 72, 209-233.
[14]
Grossberg, Stephen. (1972a). A neural theory of punishment and avoidance. I. Qualitative theory. Mathematical Biosciences, 15, 39-67.
Grossberg, Stephen. (1972b). A neural theory of punishment and avoidance. II. Quantitative theory. Mathematical Biosciences, 15, 253-285.
[15] The author met John Eccles at a conference hosted by Karl Pribram in 1994.
[16] Prigogine, I. (1996?) “End of Certainty”
[17]
Kowalski, J.; Ansari, A.; Prueitt, P.; Dawes, R. & Gross, G. (1988). On Synchronization and Phase Locking in Strongly Coupled Systems of Planar Rotators. Complex Systems, 2, 441-462.
Kowalski, J.; Albert, G.; Rhoades, B. & Gross, G. (1992). Neuronal Networks With Spontaneous, Correlated Bursting Activity: Theory and Simulations. Neural Networks, 5, 5, 805-822.
[18] The URL is: www.businesscentricmethodology.com