The Knowledge Sharing Foundation

General Systems AS-IS Model of Federal Procurement of Advanced Information Technology

 

Authored by
Director, BCNGroup

 

January 1st, 2003

 


 

General systems theory can be applied to the process of evaluating and deploying technology in support of intelligence operations. 

 

In many competitive markets, open competition among product suppliers works well.  In this market, however, traditional commercial non-disclosure agreements and the sensitivity of real-time data limit competition to a few trusted partners.

 

To understand the knowledge sciences, it is necessary to understand what is possible from computer-based analysis and synthesis of structured and semi-structured data.  This type of innovation has had difficulty transferring into national defense information technology because the procurement cycle is simply too long in duration.

 

The procurement situation is exacerbated by the way basic innovation moves from academic centers, and from individual innovators, into intellectual property.  In traditional business models, vendors require intellectual property and non-disclosure agreements (trade secrets) as a prerequisite to creating a commercial product.  But conflicting patents, and patents on algorithms (mathematics) that have a significant prior published history, make objective evaluation difficult.

 

Mapping the patent space surrounding computational intelligence, and comparing this map to a map of the academic literature, reveals the nature of the transfer issues.
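
As a concrete illustration, the sketch below compares two term maps by their overlap.  The term sets, names, and values are hypothetical assumptions, standing in for maps that would be extracted from actual patent filings and the published literature.

    # A minimal sketch of the comparison described above, using hypothetical
    # keyword sets.  Real term maps would be extracted from patent filings
    # and the academic literature.

    def jaccard(a: set, b: set) -> float:
        """Overlap between two term sets: |A & B| / |A | B|."""
        union = a | b
        return len(a & b) / len(union) if union else 0.0

    # Hypothetical term maps for one topic area (illustrative only).
    patent_terms = {"neural network", "classifier", "feature vector", "apparatus"}
    academic_terms = {"neural network", "classifier", "backpropagation",
                      "generalization", "regularization"}

    overlap = jaccard(patent_terms, academic_terms)
    untransferred = academic_terms - patent_terms  # ideas with no patent footprint
    print(f"overlap: {overlap:.2f}")
    print(f"in the literature but not the patent map: {sorted(untransferred)}")

Terms that appear in the literature map but never in the patent map are one signature of innovation that has not transferred.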

 

One can see that a few issues dominate all others in slowing the pace and reducing the quality of innovation transfer.

 

One can, and should, observe that a specific class of value is lost when innovation is transferred into a product.  A general systems analysis of the transfer is needed in order to understand what this class of value is, and how to adjust the procurement process so that the government may reduce the loss.

 

The present rules of procurement take too long to execute and are too rigid.  The process does not allow the most relevant methods to be identified.

 

Academia does not develop the educational material that users need in order to understand these methods.

 

Core methods are not deployed within a software infrastructure that makes them available in an agile and flexible fashion.

 

A mismatch has grown between the intelligence community's use of vendor software and what is possible given both educational processes and software functionality that is essentially based on pure mathematics (neural networks, genetic algorithms), machine representation of taxonomy and ontology, and formal logics.
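
To make one of these capabilities concrete, the following is a minimal sketch of a machine representation of a taxonomy with a simple is-a query.  The terms and links are illustrative assumptions, not drawn from any deployed system.

    # A minimal machine representation of a taxonomy: each term points to
    # its parent, and an is-a query walks up the parent links.

    parent = {
        "signal intercept": "collection method",
        "open-source report": "collection method",
        "collection method": "intelligence activity",
    }

    def is_a(term: str, ancestor: str) -> bool:
        """Walk up the taxonomy to test whether term falls under ancestor."""
        while term in parent:
            term = parent[term]
            if term == ancestor:
                return True
        return False

    print(is_a("signal intercept", "intelligence activity"))  # True
    print(is_a("open-source report", "signal intercept"))     # False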

 

This is a natural consequence of vendor control of core mathematics and methods. 

 

General systems models of the procurement process reveal that the process has created a number of distinct communities.  Each of these acts to maximize a narrowly defined group interest, most often expressed as financial compensation.  Groups typically avoid actions that benefit other groups, because such actions might enhance those groups' capacity to compete on future proposals and thereby restrict financial compensation from those proposals.  This is understandable, and often healthy, but one also needs to understand some of the consequences of locally expressed self-interest with respect to global interests.

 

Thus, something like the cooperative outcome described in Nash's economic theory is absent from the overall process that governs the behavior of these communities.  The global value that might develop from high-quality transfer of innovation into adapted technology does not materialize, due to the local interests of each competing community.  Sometimes the correct products are not developed, and the products that are deployed do not work.
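
A toy game makes the dynamic concrete.  In the sketch below, two communities each choose whether to share or withhold innovation; the payoff numbers are illustrative assumptions, chosen only so that withholding is individually rational while mutual sharing yields the greatest total value.

    # A toy two-community game.  Each community chooses to "share" or
    # "withhold" innovation; the payoffs are illustrative assumptions.

    moves = ("share", "withhold")
    payoff = {  # (row move, column move) -> (row payoff, column payoff)
        ("share", "share"): (3, 3),
        ("share", "withhold"): (0, 4),
        ("withhold", "share"): (4, 0),
        ("withhold", "withhold"): (1, 1),
    }

    def is_stable(r, c):
        """Neither community can gain by unilaterally changing its move."""
        return (all(payoff[(r, c)][0] >= payoff[(m, c)][0] for m in moves)
                and all(payoff[(r, c)][1] >= payoff[(r, m)][1] for m in moves))

    equilibria = [(r, c) for r in moves for c in moves if is_stable(r, c)]
    best_joint = max(payoff, key=lambda rc: sum(payoff[rc]))
    print("Stable outcome(s):", equilibria)   # [('withhold', 'withhold')]
    print("Best joint outcome:", best_joint)  # ('share', 'share')

The only stable outcome is mutual withholding, even though mutual sharing is better for the system as a whole.  This is the pattern into which the procurement process appears to have settled.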

 

No one wants to see our intelligence community fail again in managing real-time information about the threats from asymmetrical warfare.  But it is not clear how to adjust community behavior in an environment where competition overwhelms collaboration almost immediately.

 

Stability has settled around a specific process that accounts well for the self-interest of the various communities, but poorly for the global needs of the intelligence community.  The global need is too difficult to meet, and there is too much money in the system.  The system's stability gives it a greater propensity to maintain current behaviors than to make needed adjustments.

 

The problems that must be better addressed relate to informational awareness of the real-time processes occurring in support of asymmetrical threats.  We have to see and understand the activity of an enemy that is using every means possible to remain stealthy.

 

The global need is for clear, agile, and interoperable information systems in which advanced computational intelligence (data mining) tools are unencumbered by vendor business models, and in which users have a liberal arts understanding of the limitations and features of a set of core tools.

 

Part of elevating the process comes from understanding the way problems are entrenched, and from finding a bypass that changes the system in a positive fashion.

 

We have the possibility of a different type of deployment model, one with two components:

 

1) A university-based educational component that provides a liberal arts understanding of the history and principles of those areas of computer science, cognitive science, and general systems theory that, one might suppose, must inform decisions about technology evaluation and deployment.  Scientists would develop the curriculum.

 

2) A deployment component within which the core tools that this curriculum reveals are made available.

 

 
