Knowledge Technologies and the Asymmetric Threat

 

 

Paul S. Prueitt, PhD

Research Professor

Cyber Security Policy & Research Center

George Washington University

3/16/03

(revised on 10/12/03)

 

 

 

 

 

 

Note on Artificial Intelligence

Note on Stratification Theory

Note on Urgency

Note on Privacy Issues

Positive social and economic consequences

Note on Language and Linguistics

In Summary


 

Knowledge Technologies and the Asymmetric Threat

 

New methodology will soon allow the continuous measurement of worldwide social discourse.  The measurement can be transparent while providing built-in protection for individual privacy.  What has been missing is an integration of the best natural science and computer science.

 

Currently a number of web spider systems provide experimental instrumentation for the real-time measurement of social discourse.  This experimental instrumentation provides the data that could drive the development of high-fidelity knowledge acquisition technology.  A non-classified archive exists (at INSCOM) dating from at least October 2001.  Natural language processing and machine-ontology construction could provide a representation of social discourse occurring from that date to the present.

 

 

Figure 1: Experimental system producing polling-like output (November 2002) from a government-based web harvest

 

Two different levels of organization provide built-in protection for privacy from the ground up.  At one level is a stream of data, most of which is processed into abstractions about invariances in linguistic variation.  At the other level is an archive of invariance types and compositional rules reflecting how grammar is used, and actually observed, in the construction of social meaning.  Between this grammatical layer and the real-time data stream can be placed Constitutional restrictions that require judicial review before individual data elements are investigated. 
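
As a minimal sketch of this two-level separation (the names, data shapes, and the crude "invariance" feature below are illustrative assumptions, not a specification from the underlying research), the raw utterance stream can be confined to a restricted store while only identity-free abstractions pass upward:

    # Illustrative sketch only: a restricted raw-data layer and an open
    # abstraction layer, with identity stripped before anything moves upward.

    from collections import Counter

    RAW_STREAM = []                  # restricted layer: raw utterances plus identity
    ABSTRACTION_ARCHIVE = Counter()  # open layer: counts of invariance types only

    def ingest(author_id, text):
        """Store the raw record in the restricted layer."""
        RAW_STREAM.append({"author": author_id, "text": text})

    def abstract_upward():
        """Promote only identity-free invariance types into the open archive.
        The 'invariance' here is a crude token-length class standing in for
        a real linguistic invariant."""
        for record in RAW_STREAM:
            for token in record["text"].lower().split():
                ABSTRACTION_ARCHIVE["len:%d" % len(token)] += 1

    ingest("citizen-042", "the meeting moves to friday")
    abstract_upward()
    print(ABSTRACTION_ARCHIVE.most_common(3))   # aggregate view, no author identity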

 

Put simply, linguists and social scientists have developed a real-time, evolving model of how languages are used to express intention.  This model is stratified to represent natural organizational processes. 

 

General systems properties and social issues involved in the adoption of stratified methodology are outlined in this paper.  Technical issues related to logic, mathematics, and natural science are touched on briefly.  A full treatment requires an extensive background in mathematics, logic, computer theory, and human factors theory.  This technical treatment is open and public and is being published by a not-for-profit science association, the Behavioral Computational Neuroscience Group Inc (BCNGroup.org), located in Chantilly, Virginia. 

 

Community building and community transformation have always involved complex processes that are instantiated from the interactions of humans in the form of social discourse.  Ubiquitous community transformation must be involved in responding to asymmetric threats.  Informational transparency is needed to facilitate the social response to the causes of these threats.  The American public must examine the nature of those inhibitions to informational transparency in greater detail.

 

Knowledge management models involve components that are structured around lessons learned and lessons encoded into long-term educational processes.  A new educational curriculum, supporting the Knowledge Sciences, must be created if the knowledge of the structure of social discourse is to have high fidelity when viewed by the average citizen.  The necessary community transformation requires a more complete and mature understanding by the public of the issues.

 

The coupling between measurement and positive action has to be public and transparent, not simply as a matter of public policy but as a matter of public trust.  Otherwise the fidelity of information will be subject to narrow interpretation and, more often than not, to false sense making.  Information without knowledge can become propaganda and subject to control for narrow interests.

 

As a precursor to our present circumstance, for example, Business Process Reengineering (BPR) methodologies provide for AS-IS models and TO-BE frameworks.  But often these methodologies did not work well because the AS-IS model did not have high fidelity to the nature and causes of the enterprise.  Over the past several decades, various additional knowledge management disciplines have been developed and taught.  We conjecture that these knowledge management disciplines have seen limited success due to a systemic failure in attempts to model complex social processes. 

 

In knowledge management practice there are deficits in scholarship and methodology.  These are due to a type of memetic shallowness ("memetic" meaning having to do with the expression of concepts in social systems) and to the intellectual requirements imposed by understanding issues related to the stratification and encapsulation of individual and social intention.  The shallowness of the discipline of "knowledge management" might be understood as rooted in an economic fundamentalism that will not accept that human knowledge is not a commodity.

 

The New War calls on us to express maturity on issues of viewpoint and truth finding, and to reject all forms of fundamentalism, including those within our own society.  This maturity is needed because our response to terrorism’s fundamentalism has had a tendency to engage forms of ideological fundamentalism within our own society.  America has a strong multi-cultural identity, as well as a treasured political renewal mechanism.  When challenged by fundamentalism we rise to the challenge.  In this case, we are called on to reinforce multi-culturalism. 

 

One can make the argument that something is missing from the technologies being acquired for intelligence vetting within the military defense system.  Clearly biodefense information awareness requires much more than what DARPA’s Total Information Awareness (TIA) programs contemplated in 2001 and 2002.  The complex interior of the individual human is largely unaccounted for in the US government’s first attempts at measuring the thematic structure of social discourse.  But the individual is where demand for social reality has its primary origin. 

 

What is missing is an application of the available science on social expression and individual experience.  Funding is not yet applied to this science because of the narrowness of the current procurement processes, which, not surprisingly, are focused on near-term issues related to continuing funding for corporate incumbents.  A bootstrap is therefore needed to shift the focus away from methodology and philosophy that is unsophisticated and does not account for social complexity. 

 

There are many reasons why individual variation in response patterns, for example, is not accommodated by commercial information technology, e.g., commercial advertising.  Some of these are technical issues.  Many are related to the commercialization of information production and the increasing control over information by business processes.  This increasing control over information is seen, by the advertising business, as somehow essential to economic health.  This is, we feel, wrong-minded.

 

A very large part of the Gross National Product is expended in advertising, often in ways that are unwanted, unnecessary and deceptive.  Advertising has become a disease, with a well-developed immunology that uses a false sense of American patriotism and religious membership to punish those who question the degree to which advertising controls our social construction.  It is quite easy to lie about the social reality caused by television programming and the advertising industry.

 

But a deeper problem is related to the nature of formal systems.  Science and mathematics are improperly used to prop up a theory of social construction that is not tenable.  This social construction has controlled science, causing confusion and dysfunction within academia.  There is no clear guidance coming from academia.  But the issues have been laid out within a scholarly literature. 

 

Scholarship informs us that natural language is NOT a formal system.  One would suspect that even children already know this about natural language.  Perhaps the problem is in our cultural understanding of the best traditions in mathematics and science.  These best traditions do not advertise capabilities that are not present, nor do they close off debate and analysis.  Rather, these best traditions remain open to correction and express a willingness to experience first hand.

 

Yes, abstraction is used in spoken language; but a reliance on gesture and other forms of non-verbal expression helps to bring the interpretation of meaning into the social discussion as it occurs and is experienced.  So the abstraction involved in language is grounded in circumstances.  These circumstances are experienced as part of the process of living.  The experience relies on one’s being in the world as a living system with awareness of self.  This experience is not an abstraction.   Again, even children already understand this.  One has to experience truth for oneself and not allow others to “sell” truth using manipulation and addictive processes.  

 

Written language extends the capability of language signs to point at the non-abstract, at what is NOT said but is experienced.  Human social interaction has evolved to support the level of understanding that is needed for living humans, within culture, to form social constructs.  We use language signs to point at what is NOT said.  But computer-based information systems have so far failed to fully account for human tacit knowledge, even though computer networks now support billions of individual human communicative acts per day, via e-mail and collaborative environments. 

 

So, one observes a mismatch between human social interaction and computers. 

 

How is the mismatch to be understood?  

 

We suggest that the problem is properly understood in the light of a specific form of complexity theory. 

 

An evolution of natural science is moving in the direction of a stratification of formal systems using complexity theory.  A number of open questions face this evolution, including the re-examination of the notion of the non-finite and the notion of an axiom, and the development of an understanding of human induction.  Induction, in counterposition to deduction, is seen as a means to "step away from" the formal system so as to observe the real world directly. 

 

There is an alternative to axiomatic formalization that results in a fixed model.  It comes via this modification of constructs lying within the foundations of logic and mathematics.  The alternative rests on knowledgeable education that depends on an appeal to direct experience.  We suggest that an extension of the fields of mathematics and computer science is in order, one that accounts for the complexity of natural systems.


 

 

Note on Artificial Intelligence

 

The failure of “artificial intelligence” can be viewed, and often is, as occurring simply because humans have not yet understood how to develop the right types of computer programs.  This is an important viewpoint, and it has led to interesting work on the computer representation of human and social knowledge. 

 

But put quite simply, a representation of knowledge is an abstraction and does not have the physical nature required to be an “experience” of knowledge. 

 

The fact that humans experience knowledge so easily may lead us to expect that knowledge can be experienced by an abstraction.  And we may even forget that the computer program, running on hardware, is doing what it is doing based on a machine reproduction of abstract states.  These machine states are Markovian, a mere mathematical formalism.  By this we mean that the states have no dependency on the past or the future, except as specified in the abstractions of which the state is an instantiation.
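
A toy illustration of this point (purely illustrative, not drawn from the paper's own formalism): a machine's next state is a function of the current state and input alone, so any apparent dependence on the past exists only because it was explicitly folded into the state abstraction beforehand.

    # Toy finite-state machine: the transition function sees only the current
    # state and the current input symbol. "History" exists only where the
    # designer has folded it into the state labels themselves.

    TRANSITIONS = {
        ("idle", "start"): "running",
        ("running", "pause"): "paused",
        ("paused", "start"): "running",
        ("running", "stop"): "idle",
    }

    def step(state, symbol):
        # No access to earlier inputs: Markovian by construction.
        return TRANSITIONS.get((state, symbol), state)

    state = "idle"
    for symbol in ["start", "pause", "start", "stop"]:
        state = step(state, symbol)
    print(state)   # 'idle' -- the outcome depends only on the encoded abstraction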

 

There is no dependency on the laws of physics either, except as encoded into other abstractions.  

 

This fact separates computer science and natural science. 


 

Note on Stratification Theory

 

The tri-level architecture models the relationship between memory of the past, awareness of the present, and anticipation of the future.  However, once this machine architecture is in place, we will still be working with abstraction and not a physical realization of (human) memory or anticipation. 
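
A schematic of the tri-level separation, under the caveat the paragraph itself makes: everything below is still abstraction, not a physical realization of memory or anticipation. The class and method names are assumptions made for illustration.

    # Schematic tri-level separation: memory of the past, awareness of the
    # present, anticipation of the future -- three components, all abstractions.

    class TriLevel:
        def __init__(self):
            self.memory = []      # lower level: record of past events
            self.present = None   # middle level: the current observation

        def observe(self, event):
            """Awareness of the present: hold the current event, archive the last."""
            if self.present is not None:
                self.memory.append(self.present)
            self.present = event

        def anticipate(self):
            """Upper level: a crude anticipation built from memory plus the present,
            here the event that most often followed the current event type."""
            followers = [
                self.memory[i + 1]
                for i in range(len(self.memory) - 1)
                if self.memory[i] == self.present
            ]
            return max(set(followers), key=followers.count) if followers else None

    model = TriLevel()
    for e in ["a", "b", "a", "b", "a"]:
        model.observe(e)
    print(model.anticipate())   # 'b': the remembered past suggests 'b' follows 'a'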

 

Stratification seems to matter, and may help with issues of consistency and completeness, the Gödel issues in the formal foundations of logic.  The clean separation of memory functions and anticipatory functions allows one to bring experimental neuroscience and cognitive science into play. 

 

The measurement of the physical world results in abstraction.  The measurement of invariance produces a finite class of categorical Abstractions (cA), which we call cA atoms.  cA atoms have relationships that are expressed together in patterns, and these patterns are then placed in correspondence with some aspects of the measured events.  The cA atoms are the building blocks of events, or at least of the abstract classes that can be developed by looking at many instances of events of various types. 
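
As a rough sketch of how cA atoms might be derived (the feature choices and the recurrence threshold are assumptions for illustration, not the method specified in the underlying research): features that recur across many observed events form a finite alphabet of atoms, and each event is then re-expressed as a pattern over that alphabet.

    from collections import Counter

    # Observed events, each described by a set of measured features.
    events = [
        {"crowd", "night", "roadblock"},
        {"crowd", "day", "speech"},
        {"crowd", "night", "speech"},
        {"night", "roadblock"},
    ]

    # Measurement of invariance: features recurring across events form the
    # finite class of cA atoms (the threshold of 2 is an arbitrary choice).
    counts = Counter(feature for event in events for feature in event)
    CA_ATOMS = {feature for feature, n in counts.items() if n >= 2}

    # Each event is then re-expressed as a pattern over the cA atoms,
    # the abstract building blocks of events of this type.
    patterns = [tuple(sorted(event & CA_ATOMS)) for event in events]
    print(CA_ATOMS)
    print(patterns)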

 

Anticipation is then regarded as being expressed in event Chemistries (eC), and these chemistries are encoded in a quite different type of abstraction, similar in nature to a natural language grammar. 
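
Continuing the same toy example, event chemistry can be pictured as grammar-like composition rules over cA atoms used for anticipation: given a partial pattern, the rules suggest how the event tends to complete. The rule format below is an illustrative assumption, not a published notation.

    # Illustrative event-chemistry (eC) rules: grammar-like compositions over
    # cA atoms. Given the atoms observed so far, the rules anticipate which
    # atoms tend to complete the event.

    EC_RULES = {
        frozenset({"crowd", "night"}): {"speech", "roadblock"},
        frozenset({"crowd", "day"}): {"speech"},
    }

    def anticipate(observed_atoms):
        """Union of completions from every rule whose left-hand side is
        contained in the atoms observed so far."""
        completions = set()
        for lhs, rhs in EC_RULES.items():
            if lhs <= observed_atoms:
                completions |= rhs
        return completions - observed_atoms

    print(anticipate({"crowd", "night"}))   # {'speech', 'roadblock'}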

 

We are arguing that the development of categorical abstraction and the viewing of abstract models of social events, called “event chemistry”, are essential to national security.  We are arguing that the current procurement process is not looking at, and is not nurturing, the types of science that are needed in this case.  

 

Response mechanisms to these threats must start with proper and clear intelligence about event structures expressed in the social world.  Because computers cannot alone provide proper and clear intelligence, human sharing of tacit knowledge must lie at the foundation of these response mechanisms. 

 

The technology we propose is based on class:object informational pairing to produce atoms for situational logics.
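
A minimal sketch of what a class:object informational pair might look like as a data structure (the names are illustrative assumptions): each atom binds an abstract class, drawn from the cA layer, to a concrete observed instance, and situational assertions are built from such atoms.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ClassObjectPair:
        """An atom for a situational logic: an abstract class (e.g. a cA atom)
        paired with a reference to a concrete observed instance."""
        class_label: str   # abstract side
        object_ref: str    # concrete side, e.g. a document or event identifier

    # A situation is a small collection of such atoms.
    situation = {
        ClassObjectPair("public-gathering", "report:2002-11-14:017"),
        ClassObjectPair("road-closure", "report:2002-11-14:022"),
    }

    def holds(situation, class_label):
        """A trivial situational query: is an instance of this class present?"""
        return any(pair.class_label == class_label for pair in situation)

    print(holds(situation, "public-gathering"))   # True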

 

But transparent human knowledge sharing technology is not part of the culture within the intelligence communities.  Compounding the cultural problems in our intelligence community, the currently funded and deployed computer science, with its artificial intelligence and machine-based first-order predicate logic, is confused about the nature of natural complexity.  Within this confusion over the nature of information, a culture of profiteering on the New War can be clearly seen, at DARPA for example, and may itself become an indirect threat to participatory democracy. 

 

Current computer science talks about a “formal complexity” because natural complexity has a nature that is not addressable as an abstraction expressible in first-order predicate logic.  Formal complexity is just overly complicated, and it is complicated because of the incorrectness imposed by the tacit assumption that the world is little more than something captured by a confused abstraction. 

 

Putting artificial intelligence in context is vital if we are to push the cognitive load back onto human specialists where both cognition and perception can guide the sense making activity.  Computer science has made many positive contributions within the context of a myth based on a strong form of scientific reductionism.  This myth is that the real world can be reduced, in every aspect, to the abstraction that is the formal system that computer science is instantiating as a computer program.   Natural science is clear in rejecting this myth. 

 

Understanding the difference between computer-mediated knowledge exchanges and human discourse in the “natural” setting is critically important.  One of our challenges is due to advances in warfare capabilities, including the existence of weapons of mass destruction.  Another obvious challenge is due to the existence of the Internet and other communication systems.  Economic globalization and the distribution of goods and services present yet another set of challenges.  If the world social system is to be healthy, it is necessary that these security issues be managed.


 

Note on Urgency

 

We have no choice but to develop transparency about environmental, genetic, economic, and social processes.  Human technology is now simply too powerful and too intrusive to allow simple economic and social processes to exercise fundamentalism in various and separate ways. 

 

If we are to know who and where we are fighting, event models related to the onset of a terrorist behavior must be derived from the data mining of global social discourse.  But the science to do this has NOT been developed as yet.  We must define a science that has deep roots in legal theory, category theory, logic, and the natural sciences. 

 

This is how the New War is won, not by accelerating the global arms race.  Accelerating the global arms race is a stated purpose of large DoD contractors; one simply has to check the public record.  They would create a world that is not the type of world that we envision.  Alternatives exist, but the sword must be turned into a plow. 

 

A secret government project like the DARPA-proposed TIA (Total Information Awareness) project is not a proper response to challenges in information science. 

 

In any case, the American democracy is resilient enough to conduct proper science and to develop the knowledge technologies required to win the New War in public view.  The BCNGroup.org is calling for a Manhattan-type project to establish the academic foundation for the knowledge sciences.

 

The current national security requirements demand that this science be synthesized quickly from the available scholarship.  Within this new science, stratified logics will compute event abstractions at one scale of observation and event atom abstractions at a second scale of observation.  The atom abstractions are themselves derived from polling and data mining processes. 

 

Again, we stress that the science needed has not been developed.  But there is a wealth of scholarship that can be integrated quickly if only there were a modest effort and Presidential leadership.


 

Note on Privacy Issues

 

Information can be stratified into two layers of analysis.

 

The first layer is the set of individual polling results or the individual text placed into social discourse.  In real time, and as trended over time, categorical abstraction is developed based on the repeated patterns within word structure.  Polling methodology and machine learning algorithms are used. 

 

The second layer is a derived aggregation of patterns that are analyzed to infer the behavior of social collectives and to represent the thematic structure of opinions.  Drilling down into specific information about, or from, specific individuals requires government analysts to take a conscious step, and thus the very act of drilling down from the abstract layer to the specific informational layer becomes an enforceable legal barrier that stands in protection of Constitutional rights.
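
A sketch of how this barrier between the two layers could be made enforceable in software (the warrant mechanism and sample values shown are assumptions for illustration, not a description of any deployed system): analysis routines see only the aggregate layer, and the one function that reaches the individual layer refuses to run without a judicially reviewed authorization.

    # Illustrative two-layer store with an enforced drill-down barrier.

    AGGREGATE_LAYER = {"theme:border-security": 412, "theme:water-rights": 198}
    INDIVIDUAL_LAYER = {"record-7731": {"author": "(withheld)", "text": "(withheld)"}}

    APPROVED_WARRANTS = set()   # populated only after judicial review

    def trend(theme):
        """Normal analysis path: the aggregate layer only, no identities involved."""
        return AGGREGATE_LAYER.get(theme, 0)

    def drill_down(record_id, warrant_id):
        """Crossing from the abstract layer to the individual layer is a
        conscious, auditable step that fails without a reviewed warrant."""
        if warrant_id not in APPROVED_WARRANTS:
            raise PermissionError("judicial review required before individual data is examined")
        return INDIVIDUAL_LAYER[record_id]

    print(trend("theme:border-security"))      # aggregate count, no barrier involved
    # drill_down("record-7731", "no-warrant")  # would raise PermissionError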

 

New science and technology are needed to “see” events that lead to or support terrorism.  Data mining is a start, as are the pattern recognition systems that have been developed.  But we also need the synthesis of data into information, and a reification process that recognizes the importance of human-in-the-loop perception and feedback. 

 

To control the computer mining and synthesis processes, we need something that stands in for natural language.  Linguistic theory tells us that language use is not reducible to the algorithms expressed in computer science.  But if “computers” are to be a mediator of social discourse, must not the type of knowledge representation be more structured than human language?  What can we do?

 

The issues of knowledge evocation and encoding of knowledge representation shape this most critical of inquiries. 

 

According to our viewpoint, the computer does not, and cannot, have the tacit knowledge needed to disambiguate natural language, in spite of several decades of effort to create knowledge technologies that have “common sense”.  Based on principled argument, a community of natural scientists has concluded that the computer will never have tacit knowledge. 

 

The New War presents daunting challenges that cannot be addressed using anything we have developed within information technology and computer science.  Asymmetric threats are organizing distributed communities to attack the vulnerabilities of our economic and political systems.

 

We propose a new foundation to information technology based on class:object informational pairing with categoricalAbstraction (cA) and eventChemistry (eC) processes.  We propose a new science of knowledge systems.

 

The new operational technology is based on class:object informational pairing.  A simple and well-defined extension to this technology supports a Differential Ontology Framework (DOF) that has an open-loop architecture with a critical dependency on human sensory and cognitive acuity.  An Appendix to this paper discusses the DOF.
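
The Appendix describes the DOF in detail; as a rough illustration of the open-loop dependency on human acuity (the interfaces below are assumptions, not the framework's actual design), machine-proposed changes to the shared ontology take effect only after a human analyst has perceived and accepted them.

    # Illustrative open-loop flow: the machine proposes ontology deltas, but
    # nothing enters the shared framework until a human analyst accepts it.

    shared_ontology = {"unrest": {"protest", "strike"}}

    def propose_delta(term, related):
        """Machine side: a candidate addition derived from cA/eC processing."""
        return {"term": term, "related": set(related)}

    def apply_if_accepted(delta, human_review):
        """The loop stays open: human perception and judgment close it."""
        if human_review(delta):
            shared_ontology.setdefault(delta["term"], set()).update(delta["related"])

    # A stand-in for the analyst's judgment; in practice a person, not a lambda.
    apply_if_accepted(propose_delta("unrest", {"roadblock"}), lambda d: True)
    print(shared_ontology)   # {'unrest': {'protest', 'strike', 'roadblock'}}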


 

Positive social and economic consequences

 

There are many positive social and economic consequences to knowledge technology. 

 

A large number of social organizations have organically developed around the economic value of one-to-many communication systems.  These systems are held in position by television and media institutions that have largely abrogated social responsibility in the name of corporate profits.   How the media corporations have done this is something that must eventually be understood and accommodated by the American public.  We conjecture that these corporations purposefully nurture social acceptance of addiction to shallow exploitation of sexual, horror, and violence themes. 

 

Developing agility and fidelity in our information systems is the strongest defense against asymmetric threats.  The differential ontology framework may enable processes that have one-to-many structural coupling to make a transition to many-to-many technology.  The asymmetric threat is using many-to-one activity, loosely organized by the hijacking of various religions to serve the expression of private hatred and grief.  The defense against this threat is the development of many-to-many communication systems.

 

The many-to-many technologies allow relief from the stealth that asymmetric threats depend on.  The relief comes when machine ontology is used as a means to represent, in the abstract, the social discourse.  This representation can be done via the development and algorithmic interaction of human-structured knowledge artifacts. 

 

The evolution of user self-structuring of knowledge artifacts in knowledge ecosystems must be validated by community perception of that structure.  In this way the interests of communities are established through a private-to-public vetting of perception.  Without protection for privacy built into the technology, and protected by law, this validation cannot be successful and the technology will fail to have fidelity to the true structure of social discourse. 

 

Knowledge validation occurs as private tacit knowledge becomes public. 

 

The anticipated relief from the asymmetric threat will evolve because community structure is facilitated. 

 

The validation of artifacts leads to structured community knowledge production processes, and these processes differentiate into economic processes.  To achieve these benefits, a careful dance over the issues of privacy and justice is required.  But global repression of all communities that feel injustice is not consistent with the strength of the American people.  Our strength is in our multi-culturalism and our Constitution, not in our fundamentalisms.  Our strength has been in compassion and action based on compassion.

 

Individual humans, small coherent social units, and business ecosystems are all properly regarded as complex systems embedded in other complex systems.   Understanding how events unfold in this environment has not been easy.   But the New War requires that science devote attention to standing up information production systems that are transparent and many-to-many. 

 

Relational databases and artificial intelligence have been a good first effort, but more is demanded.  Current IT standards often ignore certain difficult aspects of the complex environment and attempt to:

 

1)     Navigate between models and the perceived ideal system state, or

 

2)     Construct models with an anticipation of process engineering and change management bridging the difference between the model and reality.

 

The new knowledge science changes this dynamic by allowing individuals to add to and subtract from a common knowledge base expressed within a differential ontology framework. 

 

This technology can remain transparent. 

 


 

Note on Language and Linguistics

 

Language and linguistics are relevant to our work for three reasons. 

 

First, the new knowledge technologies are an extension to natural spoken languages.  The technology reveals itself within a community as a new form of social communication. 

 

Second, we are achieving the establishment of knowledge ecosystems using peer-to-peer ontology streaming.  Natural language and the ontologies serve a similar purpose.  However, the ontologies are specialized around virtual communities existing within an Internet culture.  Thus ontology streaming represents an extension of the phenomenon of naturally occurring language.
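
A rough sketch of what peer-to-peer ontology streaming might amount to mechanically (the message format and merge rule are assumptions for illustration): each node keeps an ontology specialized to its own community and merges small fragments streamed from peers.

    # Illustrative peer-to-peer ontology streaming: nodes exchange small
    # fragments and merge them into community-specialized local views.

    class OntologyNode:
        def __init__(self, community):
            self.community = community
            self.ontology = {}   # term -> set of related terms

        def fragment(self, term):
            """A small, streamable piece of this node's ontology."""
            return {"term": term, "related": set(self.ontology.get(term, set()))}

        def merge(self, fragment):
            """Union-merge a fragment received from a peer."""
            self.ontology.setdefault(fragment["term"], set()).update(fragment["related"])

    a = OntologyNode("water-policy")
    b = OntologyNode("border-towns")
    a.ontology["drought"] = {"rationing", "wells"}
    b.merge(a.fragment("drought"))   # b's view of 'drought' now reflects a's usage
    print(b.ontology)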

 

Third, the terminology used in various disciplines is often not adequate for interdisciplinary discussion.  Thus we reach into certain schools of science, into economic theory and into business practices to find bridges between these disciplines.  This work on interdisciplinary terminology is kept in the background, as there are many difficult challenges that remain not properly addressed. To assist in understanding this issue, general systems theory is useful.

 

These issues are in a context.  Within this context, we make a distinction between computer computation, language systems, and human knowledge events. The distinction opens the door to certain deep theories about the nature of human thought.

 

Within existing scholarly literatures one can ground a formal notation defining data structures that store and allow the manipulation of topical taxonomies and related resources existing within the knowledge base. 
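
A minimal data structure of the kind this paragraph points to, assuming a simple tree of topics with attached resources (the field names are illustrative, not taken from a particular notation in the literature):

    # Minimal topical taxonomy: a tree of topics, each holding attached resources.

    class Topic:
        def __init__(self, name):
            self.name = name
            self.children = {}    # subtopic name -> Topic
            self.resources = []   # attached documents, reports, artifacts

        def add_subtopic(self, name):
            return self.children.setdefault(name, Topic(name))

        def find(self, path):
            """Follow a path of subtopic names down the taxonomy."""
            node = self
            for name in path:
                node = node.children[name]
            return node

    root = Topic("social-discourse")
    root.add_subtopic("economy").add_subtopic("trade").resources.append("report-114")
    print(root.find(["economy", "trade"]).resources)   # ['report-114']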


 

In Summary

 

The differential ontology framework consists of knowledge units and auxiliary resources used in report generation and trending analysis.  The new knowledge science specifically recognizes that the human mind binds together the topics of a knowledge unit, and holds that the computer cannot do this binding for us.  The rules of how cognitive binding occurs are not captured in the data structure of the knowledge unit, as doing so would run counter to the differential ontology framework.  The human remains central to all knowledge events, and the relationship that a human has with his or her environment is taken into account.  The individual human matters, always.

 

 

Appendix: Differential Ontology Framework