November 9, 2006
(a proposal creating a “.vir” standard)
on the differences between AI and Knowledge Engineering
Powerful industry and government forces are developing the concept of service-oriented architecture using modeling tools that carry underlying assertions. These tools are UML (Unified Modeling Language) and OWL (Web Ontology Language). The assertions are discussed in the literature of the second school. Service-oriented architecture (SOA) is defined in many books and by many IT vendors. “SOA” is now a buzzword in IT and government circles.
Understanding that there is an alternative to UML, OWL, and vendor-defined SOA can be a daunting task. The “.vir” concept and the concept of a National Project both work to establish human-centric information production. The technical and social arguments are presented in many forms in my work. In this work we attempt to define how public information spaces can be “re-democratized”.
Steven Newcomb, one of the two original authors of the ISO Topic Maps standard, makes a comment about the use of Topic Maps outside the US. The suggestion lends support to the BCNGroup claim that the US government and the US IT and media industries have formed a coherent or unified effort. The BCNGroup claim is not that this effort has been conceived out of malice, but rather that it is part of the confusion that results when the self-interests of corporations and the interests of the American People are not properly sorted out by the government.
The effort by the IT vendors, and government agencies, is designed to ultimately provide centralized control over information production and distribution systems, and to gain monopoly control over these systems. This effort is presented as justified by the claim that the United States is a republic and that control of information by the elected leadership is necessary because of the war on terrorism, among other reasons. However, actual control over information has shifted to a number of powerful corporations. The process can be traced back many decades, but it has crystallized into a hard reality only during the presidency of George W. Bush.
It is my firm belief that no direct effort can stop the process of centralizing control over public and private information. The strategy is instead to compete indirectly with the “system” by putting into place a set of standards and a few key technologies so as to create public ownership over information spaces.
The approach taken by the “.vir” subnet standards rests on a very different set of assertions than those promoted by the W3C and by US IT vendors. The foundational assertions of the W3C are shared with the far-right Republican agenda, which is to establish hierarchical control over economic and political expression.
The “.vir” assertions start with a rejection of the foundational concepts of the artificial intelligence and knowledge engineering disciplines. This blanket rejection of AI and knowledge engineering is made at two different levels. AI is considered untenable given the actual scientific literature on perception, cognition, and response.
The argument against using AI in a unified attempt to control information production is made elsewhere and is not material to our “.vir” architecture, other than to help define what we are not advocating.
The issue with knowledge engineering is not precisely the same. In both cases, the paraphrase of A. N. Whitehead comes to mind:
“Induction leaves something behind, and in some cases this is not important, but in other cases what is left behind is vitally important.”
In AI there is an induction into a formalism, on which specific computational processes are then defined.
This formalism does not account for the full reality of living systems, and yet the claim is made that computer programs will be, or can now be, endowed with natural intelligence. The assertion is incorrect on two counts. First, it has not been shown to be supported by natural science, in spite of focused government funding over three or four decades. Second, the creation of such computer programs would be controlled as proprietary property.
In knowledge engineering the induction is of a very different type than in classical AI. Attempts by a few research workers (and only a few individuals have mastered both disciplines) have not led to a merging of AI with knowledge engineering. Knowledge engineering is far more closely related to classical software programming, where software practices are placed up front and constructions like expert systems are not used. Classical AI made very little use of genetic algorithms or artificial neural networks until fairly recently. In knowledge engineering, one has no exposure to neural architectures or to genetic algorithms.
The comments above serve merely to separate the “.vir” standard from the assertions that, we claim, one tacitly accepts when buying into either AI or knowledge engineering.
 An index to the basic “.vir” presentations is at: URL:
 Introductory pages discussing the National Project are at URL:
 It is not necessary that the coherence between industries and the government has been intentional. However, part of the called-for Congressional Investigations may center on the possibility that government officials set up, and were involved in, meetings that had this unification as an objective. The unification may have been seen as beneficial to the governance of the United States based on the extreme views of the far-right political spectrum. So I am not suggesting that laws were broken, but that governance has moved in a direction that has been intentional but has not been transparent.
 The notion that “intelligence” can be defined is one of the routes into the mythology that AI presents to the world. The trick is to assert that any definition of natural intelligence can be used to create a program that exhibits that definition. Of course, the nature of definition produces precisely the “induction” of symbols and meaning that Whitehead was referring to, in my opinion.