June 30, 1999
Dr Paul S Prueitt
*********
Let me see if I can tease out some hidden issue here and relate it to both human memory research and a new technology for corporate knowledge management. The issue has to do with complexity.
Complexity, defined as stratified processing, comes in as a requirement for understanding human memory and for designing next-generation machine intelligence. Future corporate knowledge management may be enabled with an analogue to the type of memory that the brain system uses in developing the contents of mental awareness. This is to be done by incorporating a tri-level architecture into data mining and data warehouse software.
We must first address an enigma concerning the relationship between memory and non-material constructs related to generalization and categorization. For example, when one talks about a stochastic limiting distribution, one talks about a "fiction". The distribution is not "one particular ending version of a physical thing". "It" is not anything that we can physically point to and say, here "it" is.
A model of a selection event in a specific event path is a model of a specific event. The event selection is a full set of entailments related to the event.
An example of such a model is a graph consisting of nodes and links, with neural dynamic programming providing internal inference of next-state transitions. The model has two layers of entailments. One is localized to the nodes and links (as in a description-logic type computational ontology). The other is not localized; neural dynamic programming uses continuum mathematics to mirror these entailments (forces) that are not localized. The two layers operate at different time scales. [1]
The event of a decision, selecting between choices, is not a fiction. It is a real physical process during the period of time when it exists. The tri-level architecture supports decision making by encoding an abstraction that references the invariance of modeled situations, evolving a situational logic, and placing the decision into a context with computed consequences.
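One possible reading of the three supports just named can be sketched as a toy flow. This is an illustration under assumptions, not the tri-level architecture itself: the lower level extracts invariance across event paths, the middle level evolves a crude situational logic, and the upper level places a decision in context by scoring consequences. All function names, rule formats, and scoring are invented for the example.

```python
# Illustrative tri-level decision flow (not a published design).

def extract_invariance(event_paths):
    """Lower level: keep only features that recur in every event path."""
    common = set(event_paths[0])
    for path in event_paths[1:]:
        common &= set(path)
    return common

def situational_logic(invariants):
    """Middle level: turn invariants into simple situation rules."""
    return {feature: f"if {feature} then expect {feature} again"
            for feature in sorted(invariants)}

def decide(rules, candidate_actions):
    """Upper level: score each action by how many rules it touches."""
    return max(candidate_actions,
               key=lambda action: sum(1 for f in rules if f in action))

paths = [["alarm", "sensor_a", "shutdown"],
         ["alarm", "sensor_b", "shutdown"]]
invariants = extract_invariance(paths)   # features common to both paths
rules = situational_logic(invariants)
choice = decide(rules, ["ignore", "shutdown now", "reset alarm"])
```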
But "the invariance across" many event paths is not something physical. We may treat this invariance as a bag of rough partitions of categories of invariance, deconstructing and then reconstructing models of events. But the resulting category policies, voting procedures (used in reconstruction), and situational logics are fictions. This odd situation is easy to see once pointed out, but the absence of material substance is not often taken into account in our common discussions about modeling human decision-making or accounting for complex behavior in social and psychological systems. We should not forget that the model is a fiction, but we often do.
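The deconstruction and voting reconstruction can be made concrete with a toy sketch. The category scheme and the majority rule below are assumptions chosen only to show the shape of the procedure: events are decomposed into a rough partition of their features by category, and a vote across events reconstructs a model of the "typical" event, a model that, as argued above, is a fiction no single event instantiates.

```python
# Toy "deconstruct then reconstruct by voting" sketch. The categories
# and the majority-vote rule are illustrative assumptions.

from collections import Counter

def deconstruct(event, categories):
    """Assign each feature of an event to a category (a rough partition)."""
    partition = {}
    for feature in event:
        for category, members in categories.items():
            if feature in members:
                partition.setdefault(category, []).append(feature)
    return partition

def reconstruct(partitions):
    """Voting procedure: a category survives if a majority of events had it."""
    votes = Counter(category for p in partitions for category in p)
    quorum = len(partitions) / 2
    return {category for category, count in votes.items() if count > quorum}

categories = {"signal": {"alarm", "beep"}, "action": {"stop", "run"}}
events = [["alarm", "stop"], ["beep", "stop"], ["run"]]
partitions = [deconstruct(e, categories) for e in events]
model = reconstruct(partitions)   # the reconstructed (fictional) event model
```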
The "practice of mathematical thinking" fools us into treating the referents of formal language improperly. The ontology of the mental constructs of mathematics is not the same as the ontology of a physical thing. The model ontologies expressed in description logics and assertion statements are fictions, with some imprecise correspondence to natural reality. [2]
When we are not careful we lose sight of the object-of-investigation and start manipulating symbols within formal constructs. This is Rosen's and P. Kugler's concern. To take only one of many examples, the neural network community constantly publishes papers in which the "Rosen category error" is ubiquitous. In this literature, one never knows whether real neurons are being talked about or merely mathematical formalism. This is where "formal" semantics has become a convenient deception of terms.
Perhaps it is not possible to formalize the full character of the human interpretation of events. Or perhaps the type of formalism we need is an extension of the present notions of formal systems to include formal systems that are open to change at the level of axioms.
I believe that the latter of these two possibilities shapes our present challenges in all applications of complexity theory to practical problems. We need to extend Hilbert mathematics into a methodology for expressing formal systems.
Without axiomatic openness, we miss those unique aspects of a pragmatic axis that are "rooted" in a present moment.
These pragmatic aspects are not in any way "statistical", because they are not an invariant across multiple event-things. Memory does not inform us about them; direct perception (measurement) does. This direct perception involves some sort of action-perception cycle as well as a mixing of memory and anticipation. Humans do this; computers, up to now, do not.
Thus a human-computer interface is needed to enable the tri-level program to acquire (mine) judgments about situations as they arise in real time. The consequences of judgments are needed to drive part of the manipulation of data in the tri-level data warehouse. [3]
There is openness to the measurement of unique aspects. This openness is to be seen within a pragmatic horizon of a present moment. The horizon must show the kinds of paths into the future that are possible and measure some limiting distributions, related to outcomes, in probability space. We also need openness to fundamental changes in the measured invariance and in the interpretation of new observables. To manage this measurement and interpretation for machine intelligence we need some way to place meaning on a set of symbols.
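Measuring a limiting distribution over outcomes can be illustrated with a small simulation. The two-state chain and its probabilities below are invented for the example; the point is that the empirical distribution over many event paths approaches a fixed distribution that no single path ever equals, which is exactly the sense in which the limiting distribution is a useful "fiction".

```python
# Estimate the limiting (stationary) distribution of a toy two-state
# Markov chain by running one long event path. The states and
# transition probabilities are illustrative assumptions.

import random

random.seed(0)
TRANSITIONS = {"calm": [("calm", 0.9), ("alert", 0.1)],
               "alert": [("calm", 0.5), ("alert", 0.5)]}

def step(state):
    """Select the next state according to the transition probabilities."""
    r = random.random()
    cumulative = 0.0
    for next_state, probability in TRANSITIONS[state]:
        cumulative += probability
        if r < cumulative:
            return next_state
    return state

counts = {"calm": 0, "alert": 0}
state = "calm"
for _ in range(100_000):
    state = step(state)
    counts[state] += 1

# For this chain the stationary distribution is calm 5/6, alert 1/6;
# the empirical fraction approaches it but never "is" it.
fraction_calm = counts["calm"] / 100_000
```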
The fact that "we" have memory brings (a future) complexity science to the table. The fact of memory requires a theory of stratified processes, and this theory can be grounded only if one defines complexity in terms of strata of self-organized worlds. Memory does not "exist" as a legitimate thing in the world of decisions and mental awareness. It is either a part of a mental awareness of decision, hopelessly entangled and transformed by the whole, or it is a potential that may actualize if certain metabolic processes occur. Thus, we conjecture, one simply cannot do a proper science of memory without stratified theory.
We cannot do a proper science of memory without stratified theory because memory has two referents. The first is "memory" as perceived in the contents of awareness. The second is "encoded" memory that is not involved in any actual contents of awareness in a present moment. It is the encoded memory that is most like formal mathematics. The two referents cannot be thought to exist in the same way. Metaphorically, the difference between the two interpretations of the term "memory" is similar to the difference between a computer simulation and an actual event.
Memory "exists" in a different sense in the two cases. In the first case the linguistic referent is real. In the second case the referent is a "fiction". Statistical artifacts and formal generalization allow humans to forget about this difference. We make a "category error", because the fiction is so good.
Memories in the second sense do not exist as things separate from awareness. When encoded memory is "part of" the cause of the contents of mental awareness, then "it" has lost its original encoded nature, because "it" is now enslaved (I. Prigogine's term) and entangled (D. Bohm's term) in the emergent whole.
In complex natural systems, levels of organization are stratified through time-irreversible emergence. Awareness of mental contents is at a different physical level of organization than the metabolic processes that give us "encoded" memory. The two processes exist at the same time but are separated by the encapsulation of the faster processes, when viewed from the slower time scale of the metabolic process. This is the theory of stratification.
A higher-level process must "bring something into existence", in the same way that interactions of elementary particles bring things into existence from Bell's beable (unobserved quantum-level "states") world. When a "cross-scale" event occurs, some unusual phenomena can actually be observed in a scientific fashion. For example, the Bell inequality has been experimentally confirmed to appear as an instantaneous "action at a distance". What we regard as the normal laws of conservation of local spin are violated. Other symmetries in conservation laws are also affected by quantum cross-level events.
Organizational levels are separated by a gap in the sense that things in each level can (normally) interact only with things in the same level. H. Pattee has talked about this "epistemic gap" for several decades. My "gap" question to George Farre (quantum physicist at Georgetown University) is important. My question asks about any relationships between the quantum mechanical Heisenberg gaps, epistemic gaps, and the mind-body Cartesian gaps. It asks if there is a relationship between Planck's constant (quantum distances in electron shells), Boltzmann's constant (fluid pressure), and the speed of light. These constants are the so-called "pi numbers" of physics. But my question also goes to the question of how memory is encoded and decoded. It asks if one can discover, in a specific instance, how many "levels" of organization have formed that correlate with mental awareness. It asks if K. Pribram's holonomic brain theory provides us a means to answer this question.
Understanding this has huge consequences for medical and psychological science, as well as for the study of social and ecological systems. And it has an impact on architectures of machine intelligence (having the two types of memory).
The statistical artifact that we formalize seems to have a biological parallel, in that memory is encoded invariance. The invariance is seen more than once in perceived events. The story is subtle, since living memory mechanisms, in some way, record histories of choices and the features/properties of events. Memory mechanisms also encode relationships within the invariance.
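A toy sketch can show both halves of this claim: invariants as features recurring across all perceived events, and relationships encoded within the invariance as co-occurrence among those features. The events and the co-occurrence encoding below are illustrative assumptions, not a model of biological memory.

```python
# Toy "memory as encoded invariance": store features present in every
# perceived event, plus pairwise co-occurrence relations among them.
# The event data and the encoding scheme are illustrative assumptions.

from itertools import combinations
from collections import Counter

events = [{"light", "tone", "food"},
          {"light", "tone"},
          {"light", "tone", "shock"}]

# Invariants: features present in every event.
invariants = set.intersection(*events)

# Relationships within the invariance: pairwise co-occurrence counts.
relations = Counter()
for event in events:
    for pair in combinations(sorted(event & invariants), 2):
        relations[pair] += 1
```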
What is the formalism for talking about these associative mechanisms? I believe that the proper formalism is quasi-axiomatic theory, as developed by the Russian semiotics school and completely ignored by Western institutions. See:
John Eccles talks about the symmetries in chemical processes in the synaptic bouton and gap that "cause" an induction of specific processes linking ultimately back to an encoding mechanism. However, Eccles stresses that the "mind-body" interface is only properly thought of as a distribution in probability space, and thus not properly "material" or "mental/spiritual". (He discusses this in a paper published by Pribram in one of the 1990s Appalachian Conferences.)
The neuropsychological and cognitive neuroscience literature has many other points of view that seem to point to this distinction between two "kinds of memory".
July 30, 1999.
Paul S Prueitt
[1] Footnote made on 3/19/06, by Paul Prueitt: Please refer to the April 2006 discussion regarding service oriented architecture and the use of model ontology.
[2] Footnote made on 3/19/06, by Paul Prueitt: This becomes the Second School position as developed at
[3] Footnote made on 3/19/06, by Paul Prueitt: An architecture for doing this was developed by US Customs in December 2004, and published in late 2005:
http://www.datawarehouse.com/search/?FREETXT=Prueitt