ORB Visualization
5/7/2004 2:05 PM
Research paper by Derek Gather.
Short PowerPoint on Building Anticipatory Technology.
Many individuals have made substantial intellectual contributions to my work, and as a consequence to the planning for the National Project.
The planning process has been occurring since 1992. The relationship between those who have discussed this Project is
still informal. The intersection
between behavioral science, computational science and neuroscience has not been
canonized into a single discipline.
In order for anticipatory technology to be understood, the Nation needs a few university positions, and perhaps a few chairs in key universities. These should be primarily teaching universities, since the issue of intellectual elitism should be addressed directly. We also need funding sufficient to demonstrate what the principles of anticipatory technology are. The BCNGroup will ask for a $60,000,000 line item in the Congressional budget for the National Project. We have been planning how to make this request, and getting the elements of the plan in place, for a decade.
The Behavioral, Computational Neuroscience Group (BCNGroup) members are scholars who have little interest in intellectual property ownership issues, other than proper scholarly attribution. Of course, we would like to make a living, and most of us struggle with this, due to our commitment to a new way of thinking about computers and due to specific cultural barriers. Those who are inclined to make contributions to a new science have to struggle with the stovepiping that exists in all aspects of academic, governance and business systems.
I must say that I do not speak for other individuals. I recognize in certain specific work what I regard as some of the essential elements of the required K-12 and college curriculum. I am honored when a scholar whose work I have included sends communications into the BCNGroup community glass bead games. But at this point, it is I who bear all of the load and responsibility.
In consultation with others, a body of work has been outlined. We are waiting for the minimal resources required to take the next step toward the National Project.
We invite you to participate in a conversation about what the knowledge sciences might be and how to define a K-12 plus college curriculum that introduces our citizens to the understanding to be derived from mathematics, computer science and the natural sciences.
The
National Project is designed to establish the knowledge sciences as an academic
discipline.
Without
a K-12 curriculum, the population will not understand the anticipatory
technology that is now being developed for use by business and government. The curriculum has to be established so that
children in rural America and children in urban America develop a perception
about how memory and anticipation work together to create just-in-time models
of some situation, such as a financial transaction or a visit to the medical
doctor.
In the current business/intelligence deployments, anticipatory technology is only partially achieved or achieved poorly. As incomplete and poorly understood technology is applied, businesses will come to "know" the customer in ways that the customer cannot understand, that may often be incorrect, and that will lead to resentment. The result of incomplete and poorly understood technology can be fear and loss of control over individual privacy.
Medical science has an even greater need, and a greater reward, if anticipatory technology is developed completely and the population has the educational grounding to understand both the anticipatory science and the specifics associated with their individual profiles.
In
addition to discussions about the proposed curriculum, we talk with various
technologists who are defining a set of protocols and encoding structures that
create a quantum step in data management systems.
Discussions are occurring within the government community regarding an "ultra-stable, provably secure" distributed computing environment. Our feeling is that a new type of software infrastructure is a key to the future success of the National Project.
1) Ultra-stable, provably secure computing environments are to be developed for reasons of National Security.
2) The same architecture is also what one needs to ground long-term rural economic development {a return to the Jeffersonian ideal of the Agrarian Society}.
3) The Rural American Safe Net is to be extended to general-purpose anticipatory systems for distance learning and for medical information systems.
4) The Safe Net wireless backbone is to be used to extend anticipatory information systems to the "last mile."
5) Anticipatory Web commerce systems are to be developed, such as the Virtual Art Museum System now being discussed.
Several individuals have worked out the technical details required for this type of ultra-stable, provably secure (USPS) computing environment.
The
Rural "American" Safe Net would be a politically interesting
application of such a system, if it were to exist.
Orbs built on I-RIBs could be a "USPS light" and the complete foundation for the Knowledge Sharing Foundation concept, as could the Pile System, a planned encoding and operational system, or CoreTalk.
One could build a USPS based on a hash table management system. However, the design principles should parallel empirically established principles mapped to behavioral aspects of human attentional mechanisms and cognitive processing. For example, in your paper "Contextualizing Concepts using a Mathematical Generalization of the Quantum Formalism" (Gabora and Aerts, 2002), you speak about principles, found by experimental science, that are reflected in a differential activation of the depth of associative hierarchies.
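To make the hash table suggestion concrete, here is a minimal sketch, assuming nothing beyond standard Python; the class and method names are illustrative, not part of any existing USPS design. A Python dict is itself a hash table, so keying directly on each structural unit gives non-redundant storage with constant-time lookup.

```python
# A minimal sketch of a hash-table-backed store of structural units.
# Each distinct unit is stored once; repeats only increment a count.

class HashTableStore:
    def __init__(self):
        self._table = {}                      # unit -> occurrence count

    def put(self, unit):
        self._table[unit] = self._table.get(unit, 0) + 1

    def count(self, unit):
        return self._table.get(unit, 0)

store = HashTableStore()
store.put(("a", "r", "b"))                    # an ordered triple as a unit
store.put(("a", "r", "b"))                    # duplicate: no new storage
print(store.count(("a", "r", "b")))           # -> 2
```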
The differential activation of mental events creates more top-level concepts when a person's mental state is defocused. Heightened sensitivity can lead to either diffusion of mental focus or selective attention to a single focus. Sometimes, a defocused state means one's sensitivity is lowered because there is no critical demand. So these behavioral aspects require in-depth study.
The presentation of Orb structure to
the user should anticipate the level of differential activation that a user is
experiencing in real time. The software
should adjust the number of elements in an upper taxonomy to take into account
the shifts in behavior of the individual.
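A minimal sketch of what such an adjustment might look like; the "defocus" score in [0, 1] is a hypothetical input that some upstream measure of the user's attentional state would supply, and is not part of any existing Orb interface. Following the differential activation described above, a defocused state broadens the upper taxonomy presented, while a focused state narrows it.

```python
# Illustrative only: scale the breadth of the presented upper taxonomy
# to a hypothetical real-time "defocus" measure of the user's state.

def upper_taxonomy_view(categories, defocus, min_items=3, max_items=12):
    """Return the top-level categories to present, scaled to the user state."""
    n = min_items + round(defocus * (max_items - min_items))
    # categories are assumed pre-ranked by relevance to the current task
    return categories[:n]

ranked = ["finance", "medicine", "law", "energy", "transport",
          "education", "agriculture", "defense", "media", "housing"]
print(upper_taxonomy_view(ranked, defocus=0.1))   # focused: few top concepts
print(upper_taxonomy_view(ranked, defocus=0.9))   # defocused: many top concepts
```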
The neurobiological basis for human selective attention has been a favorite subject of my PhD advisor, Daniel Levine, and me. We can agree with the analysis you have laid out in your paper. The key point is that this work on selective attention suggests that having a single Upper Ontology for a large enterprise will not be optimal in all circumstances, simply because the relationships, both hierarchical and non-hierarchical, are constantly in a type of flux. So the discussions at the SICoP meetings should deepen and become clearer with regard to what is being discussed and what the reasons are for not allowing a fuller and more scholarly discussion.
It is my reading of the polylogic (Pile Systems Inc) technology that two types of relationships allow a type of figure-ground flux to occur in a natural way. The use of a normative set of relationships and an associative set of relationships allows an emergent Upper Taxonomy to develop from the normative set, which is then refined with some additional associative relationships, as sketched below. I would certainly think you would have some comments about this.
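Here is a toy sketch of the two-relation idea as I read it; this is illustrative Python, not Pile Systems code, and the relation labels are invented for the example. Normative links carry the emergent Upper Taxonomy; associative links attach as refinements without disturbing the hierarchy, so figure and ground can shift.

```python
# Normative links form the hierarchy; associative links refine it.
from collections import defaultdict

normative = [("sparrow", "is_a", "bird"), ("bird", "is_a", "animal"),
             ("oak", "is_a", "tree"), ("tree", "is_a", "plant")]
associative = [("sparrow", "nests_in", "oak"), ("bird", "eats", "seed")]

parents = {}
for a, _, b in normative:
    parents[a] = b

# The emergent Upper Taxonomy: normative targets with no parent of their own.
upper = sorted({b for _, _, b in normative} - set(parents))
print(upper)                       # -> ['animal', 'plant']

# Associative relations attach as refinements, leaving the hierarchy intact.
refinements = defaultdict(list)
for a, r, b in associative:
    refinements[a].append((r, b))
print(refinements["sparrow"])      # -> [('nests_in', 'oak')]
```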
The importance of the polylogics approach is in
1) creating educational material that average people can read in order to understand anticipatory technology, and
2) having a "natural" data encoding standard, one not developed by a committee, but one that is developed to be optimal in terms of a deep technical requirement and in terms of simplicity.
It is
the flux in relational structure over “items” that is one of the points you
argue very successfully in your paper.
I point
out that I have original work in mapping continuum mathematics models of
co-occurrence relationships in text to a discrete set of simple syntagmatic
units, having the form
< a, r, b >
These continuous models can be developed using stochastic latent semantic indexing, scatter-gather, or algebraic latent semantic indexing. One of the keys to understanding my work on what I have called "differential and formative ontology" is to allow a frame to form based on actual structural relationships in text, and then to allow this frame to acquire some additional constraints, in a way that suggests elements of polylogics and your work on conceptual space activation.
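As a toy stand-in for those continuum models, the sketch below discretizes raw text into syntagmatic units by simple window co-occurrence; the generic relation label "co" is a placeholder for whatever relation inventory a full continuum model would supply.

```python
# Discretize text into syntagmatic units <a, r, b> via window co-occurrence.

def syntagmatic_units(text, window=3):
    tokens = text.lower().split()
    units = set()
    for i in range(len(tokens)):
        for j in range(i + 1, min(i + window, len(tokens))):
            a, b = sorted((tokens[i], tokens[j]))   # canonical order
            units.add((a, "co", b))                 # the discrete <a, r, b> form
    return units

text = "memory brings the past to the present"
for unit in sorted(syntagmatic_units(text)):
    print(unit)
```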
The Orb encoding, as sets of the form { < a, r, b > }, has very natural translations to and from OWL, RDF, and other Semantic Web (W3C) standards. But the Orbs also have very simple translations into and out of CoreSystem and the Pile System. Orbs support simple implementation of what have been very advanced knowledge creation and knowledge propagation aids, within both the Human-centric Information Production (HIP) paradigm and machine intelligence paradigms such as artificial neural networks. Differential and formative ontology has implicit knowledge representation in the form of complex mathematical theory and structure, and explicit knowledge (structural) representation in the form of these Orb sets { < a, r, b > }.
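A minimal sketch of one such translation, into the N-Triples surface form of RDF; the example namespace URI is an assumption for illustration, not a standard Orb-to-RDF mapping.

```python
# Round trip between an Orb set { <a, r, b> } and RDF N-Triples.

EX = "http://example.org/orb#"   # illustrative namespace, an assumption

def orb_to_ntriples(orbs):
    return "\n".join(f"<{EX}{a}> <{EX}{r}> <{EX}{b}> ." for a, r, b in sorted(orbs))

def ntriples_to_orbs(text):
    orbs = set()
    for line in text.strip().splitlines():
        s, p, o, _ = line.split()                    # subject predicate object .
        local = lambda term: term[1:-1].split("#")[-1]
        orbs.add((local(s), local(p), local(o)))
    return orbs

orbs = {("memory", "supports", "anticipation")}
nt = orb_to_ntriples(orbs)
print(nt)                               # one N-Triples statement
assert ntriples_to_orbs(nt) == orbs     # the round trip preserves the Orb set
```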
There are differences between anticipatory technology and the work being developed by the Semantic Web standards committees. Single innovators, whom I have identified in my work, develop more powerful solutions to the problem of interoperability than committees do. The committees' work is often dragged down by the lowest common denominator, often expressed as institutional self-interest.
Most believe that one key issue in creating cognitive aids is finding the optimal methods and encoding structures to allow a single set of commonly agreed-on structural standards. It is often NOT mentioned that this commonly agreed-on standard might be something like the ordered triple, where the first and last elements are graph nodes and the middle element is a relationship.
What I
have learned from interactions with W3C standards committees (as illustrated in the SICoP
meetings) is that the introduction of the notion of meaning brings
with it complexity issues.
The confusion can be resolved into a clear picture if complexity is defined as that which is not resolvable within a computing device, requiring an induction that "steps away" from the algorithms and state transitions.
So the
standards discussion might shift from the topic of upper ontologies and
micro-theories to this fundamental data structure related to a graph, and
processes that map to natural science.
A
"non-semantic web" USPS can be (easily) developed to allow an
"un-engineered" natural physical system to act as a controller at
points of complexity (tipping points). We are calling this the
machine side of an Anticipatory Web.
The AW will have two sides, a machine side and a living side. The machine side will separate the development of
1) a theory of data structure invariance derived through natural language processing, and
2) a theory of event states that are anticipatory of normal human behaviors.
The concept of the machine side of the AW includes the recognition that semantics is never fully resolved within the machine side of the Anticipatory Web alone. One needs a mutual induction involving the computed states of a computing system and the natural memory/anticipatory capability of a natural living system.
The operating system for this USPS involves what we are calling "participatory function". Machine data exchange, which is what W3C is really trying to achieve, is a structural participatory function that would, if achieved, provide 100% interoperability in data exchanges. Sandy's solution for this is complete and optimal. Merge, in the CoreSystem setting, is instrumented at design time and involves no elements of complexity. However, we are looking for a closer binding between the real-time experience and the participatory functions of average human users.
In the old Artificial Intelligence school of thought, "complexity" is used as defined in computer science, and that usage is misleading. The AI and computer science use of the term "complexity" is not what we mean by natural complexity. In natural complexity, a true indeterminate exists due to the under-constraint of structural activity relationships in real systems in real time.
As you point out, in the case that the participatory function is "stratified" and has an emergence of form in the context of an environmental setting, the ontology merge involves an entanglement.
The
notion that quantum reality is separated from "physical reality" can
be examined in spite of the obvious paradoxes.
One of
the paradoxes of quantum reality is that there are things that do not
exist. Other paradoxes exist as
well.
Some of these obvious paradoxes can be set aside if one examines biological research about living systems with memory and anticipation. We see that reality is as it is, and is not necessarily consistent with the old "rational" view of science.
One can then suggest, in some abstract sense, that the resources involved in memory (bringing the past to the present) and anticipation (bringing the future to the present) can be addressed in a computer system as two entirely different data encoding forms.
So one
can see a path forward.
Invariance of (sub)structural forms is aggregated into non-redundant data storage using an optimal encoding mechanism. The anticipatory forms are developed using something like stochastic "induction": machine learning such as Hidden Markov models or various forms of latent semantic indexing. Evolutionary algorithms can be used as well. What is to be discovered are the contextual utility functions that fully constrain the aggregation of a bag of sub-structural forms.
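A sketch under a strong simplification: a first-order Markov chain standing in for the Hidden Markov style of induction named above. Invariant forms are aggregated once into counts, and the transition counts then anticipate the likely next form. The event names are invented for the example.

```python
# Aggregate observed event transitions, then anticipate the next event.

from collections import Counter, defaultdict

observed = ["login", "search", "view", "search", "view", "purchase"]

transitions = defaultdict(Counter)
for prev, nxt in zip(observed, observed[1:]):
    transitions[prev][nxt] += 1               # aggregate event-to-event counts

def anticipate(state):
    """Most frequent successor of a state, or None if never seen."""
    nxt = transitions[state].most_common(1)
    return nxt[0][0] if nxt else None

print(anticipate("search"))                   # -> 'view'
```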
The
system always remains open to the possibility of novelty.
In some
cases we allow "mutual-induction" to evoke the cognitive acuity of a
human with computer output. This is the
notion of a "Knowledge Operating System".
One is allowed to use a Peircean language here, and talk about atoms and compounds. A compound occurs via a measurement followed by an interpretation of the function, in a specific environment in real time, of the aggregation of a bag of atoms. This language is tri-level, since the interpretation lies outside the resulting compound, in the environment of the compound. As in physical chemistry, the individual atoms are entangled and cease to exist as individual "things".
Anticipatory technology is now a reality, but is poorly understood within the intelligence and business communities.
Like nuclear power, the power of anticipatory technology is here to stay. If the population does not understand it, it will be misused. But with a National Project designed to develop a new academic discipline related to the knowledge sciences, these anticipatory technologies will lead to a renewal of our democracy.