Saturday, July 01, 2006
The metaphor between gene, cell and social expression
On Formal versus Natural systems
Generative Methodology Glass Bead Games
Transcript of a Talk by Paul Prueitt
Service Oriented Architecture
Thank you, I suppose everyone can hear me ok?
I appreciate Bob, Steven, and Marti’s presentations. I feel like we all have a very good sense of where the topic map concepts are and where topic map technology is. I would like to focus on service-oriented architecture in the presence of information structure. Specifically, I will look at a high-level model for using taxonomy, web ontology languages and topic maps.
Look at slide 58. Suppose that we would like to measure the real-time exchange of information. Now we are all aware of why this is important in the intelligence community. We also should be aware of how useful it would be for U.S. Customs to know what kinds of discussions are going on within the various communities associated with the inspection of commodity transfers worldwide.
The next slide, slide 59, is to point out that information occurs within a community. The way I approach this is by thinking of information as a transaction space. That transaction space consists of the discrete sending of messages from message generators to message receivers. In communities, this discrete generation and receiving of messages is an essential aspect of conversation. We have various means of conversation. What the semantic web is promising is to instrument a machine duplicate of the transactions that are occurring in the social discourse.
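The transaction-space idea described above — a community's conversation recorded as discrete sender-to-receiver messages — could be sketched, purely for illustration, as follows. The member names and message contents are invented examples, not anything from the talk or its slides:

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class Message:
    """One discrete transaction: a message sent from a generator to a receiver."""
    sender: str
    receiver: str
    content: str
    timestamp: datetime = field(default_factory=datetime.utcnow)

@dataclass
class TransactionSpace:
    """A community's conversation, recorded as a sequence of discrete transactions."""
    transactions: List[Message] = field(default_factory=list)

    def record(self, sender: str, receiver: str, content: str) -> None:
        self.transactions.append(Message(sender, receiver, content))

    def between(self, a: str, b: str) -> List[Message]:
        """All messages exchanged (in either direction) between two members."""
        return [m for m in self.transactions
                if {m.sender, m.receiver} == {a, b}]

# Instrumenting a small, invented exchange:
space = TransactionSpace()
space.record("inspector_1", "inspector_2", "flagging shipment 4411")
space.record("inspector_2", "inspector_1", "acknowledged")
print(len(space.between("inspector_1", "inspector_2")))  # 2
```

Measuring the discourse then amounts to querying this recorded space, which is the part of human communication the footnote below describes as "localized."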
Please go to slide 60. I would like to remind us of what information structure is — to give a definition of what I mean by the phrase “information structure”. What is the cast of characters that I think about when I think about information structure? There are four categorical elements of this cast.
The first is from knowledge elicitation techniques that have been refined by knowledge management practices over the last ten years. How does information existing in communities become reified as computer-based structure?
I would like to point out that many people think of information structure as the structure in a relational database or a data model. What I am pointing out in my work, and in the second school effort, is that if the origin of information structure is the database engineer, and if the structure imposed on the relational database is not agile and flexible, then we have difficulty making the structure in the relational models reflect what is going on in the world in real time.
When we have something like Hurricanes Rita and Katrina, our systems fail because there is no way to represent the novelty that comes from these events in real time.
To begin to try to instrument a real time measurement of the social discourse, one can look at taxonomy and controlled vocabulary. I would prefer to use the phrase “managed vocabulary,” indicating that there are things like reconciliation of terminological differences that need to occur when you are measuring the vocabulary that is occurring as messages are sent back and forth.
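The reconciliation of terminological differences mentioned above could be sketched as a toy "managed vocabulary": variant terms observed in messages are mapped to one canonical term, and the canonical terms are counted as the conversation flows by. All terms here are invented examples:

```python
# A managed vocabulary: observed variants -> canonical term.
# These mappings are invented for illustration only.
managed_vocabulary = {
    "cargo": "commodity",
    "goods": "commodity",
    "freight": "commodity",
    "commodity": "commodity",
}

def reconcile(term: str) -> str:
    """Map an observed term to its managed canonical form, if one is known."""
    return managed_vocabulary.get(term.lower(), term)

def measure(messages):
    """Count canonical-term usage across a stream of message texts."""
    counts = {}
    for text in messages:
        for word in text.split():
            canon = reconcile(word)
            if canon in managed_vocabulary.values():
                counts[canon] = counts.get(canon, 0) + 1
    return counts

counts = measure(["cargo arrived", "goods inspected", "freight cleared"])
print(counts)  # {'commodity': 3}
```

The point of the sketch is that the vocabulary is *managed*, not merely controlled: the mapping table is what gets curated as new variants appear in the measured discourse.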
We also have the resources of topic maps and web ontology language. I remember the discussions that Steven Newcomb was leading in the years 1999, 2000, and 2001, when there was a struggle to define whose work was going to be the standard for representing information structure within the semantic web. We have evolved quite a bit since that time, it now being 2006.
Web ontology language has evolved tremendously and has become, essentially, very highly constrained finite state machines which, when used by the biotechnology and bioinformatics communities, become topics. The topic map paradigm is now being used to provide a process model for the merging of web ontology language (W3C-type) ontologies.
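The merging role described above can be illustrated with a toy sketch of topic-map-style merging: two maps that use different labels are merged by shared subject identity, so topics about the same subject collapse into one. The identifiers and names are invented examples, not real published subject indicators:

```python
# Toy topic-map merge by subject identity.
# Each map: {subject_identifier: set_of_names}. Identifiers are invented.
def merge_topic_maps(map_a: dict, map_b: dict) -> dict:
    """Merge two maps: topics sharing a subject identifier become one topic,
    accumulating all names from both sources."""
    merged = {}
    for tm in (map_a, map_b):
        for psi, names in tm.items():
            merged.setdefault(psi, set()).update(names)
    return merged

bio_a = {"http://example.org/psi/gene": {"gene"}}
bio_b = {"http://example.org/psi/gene": {"Gen", "gene locus"}}
merged = merge_topic_maps(bio_a, bio_b)
print(merged["http://example.org/psi/gene"])  # names from both maps
```

The design point is that identity lives in the subject identifier, not in any one ontology's labels — which is what lets topic maps act as a process model over independently built OWL ontologies.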
Go to slide 61. This picture, of RDF (resource description framework) semantic execution environments being developed, really implies and asserts that semantics can be executed and that the semantic structure is known prior to the present moment. The second school suggests that this is not true, that in the moment there is a pragmatic axis. In that pragmatic axis lies the complete meaning of things. So the notion that we can execute semantics, in a perfect way, has been misleading.
The next slide is slide 62. This slide is reminding the audience that reality sits there. There is the observed impedance mismatch between the use and the design of information structure and the reality that we are faced with on a day-to-day basis.
“What is the source of this mismatch” becomes the question. 
Is it the interface design or is it in some fundamental difficulty with the nature of relational databases, RDF and web ontology languages?
What you will see as I finish up my presentation, is that there is the appearance that topic maps are going to be utilized as mediators between taxonomy and controlled vocabulary and these web ontology languages. The web ontology languages are becoming very powerful, very reliable and the utility of these web ontology language constructions is becoming apparent to everybody, including myself.
If we go to slide 64, we have the ideal picture here. The community of practice interacts within the situation in real time, producing and using web services. These web services are mediated by controlled vocabularies and web ontologies.
The next slide is 65. I am making the observation that OWL, the Web Ontology Language, is almost a perfect solution. The difference between RDF and topic maps becomes very apparent. With topic maps there is specifically a consideration that human knowledge is an interpretant process. That interpretive process is much more full and substantial than the computational processes that we are experiencing when we are looking at a well-structured OWL ontology.
Please go to the next slide, slide 66. I have been proposing a semantic interpretation environment rather than a semantic execution environment. In the semantic execution environments, essentially you have the repositories and registries that we see with web service structure sitting within the service oriented architecture systems. Those are being accessed in real time, in real situations and there is a measurement process that is occurring using social network analysis, semantic extraction, service oriented architecture blueprints and choice points. That inner loop, there (in the slide), mediates the activity of the community of practice.
On slide 67 you see that the community of practice really has its own origin of control. We have this loop (in the slide) which is what Tim Berners-Lee talks about as being the semantic web; i.e., the registries and the repositories. There is an interaction with reality. There is a measurement, and there is information structure that is being utilized. But that loop is always considered to be in the machine. Outside the machine, remember that there are two sides of the semantic web; the second side is the community of practice.
So we want to ask the question, “where is the origin of information design?” Is it within this machine side, with the IT (information technology) experts and knowledge engineers, where they are constructing these top-down taxonomies and controlled vocabularies that everybody has to abide by if they are going to participate in the semantic web? Or is there going to be a shift to the community of practice, where the community of practice is observed in its everyday activities, doing what it wants to do, independent of the computer?
The next slide is 68. I am again representing that the community of practice can be put into control of the origin of design and that the origin of design of information can be shifted away from the information technology and the knowledge engineering community to the community of practice itself. This can be done using the measurement of that community of practice’s use of language.
The second-to-last slide is just a stack, as it were, where I am indicating that at the bottom of the stack — in some sense where the internal wheels of the semantic web exist — will be RDF-based structure extended with an ontology inference layer, that is, OWL, the web ontology language. There will be inferences and all things going on there. This will be the machinery that allows things to happen fast. But there will also be the understanding that the machinery has no way to measure the world in real time. For that one needs the flexibility that topic maps give us.
The way that topic maps are generated, i.e. the way that new topic maps are generated, may be through a semi-automated process: looking at a managed vocabulary and then converting the managed vocabulary to taxonomies.
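The semi-automated conversion step suggested above could be sketched as assembling broader/narrower term pairs — harvested from a managed vocabulary — into a simple taxonomy tree. The term pairs below are invented examples:

```python
# A sketch of converting managed-vocabulary relations into a taxonomy.
# The (narrower, broader) pairs are invented for illustration.
def build_taxonomy(broader_pairs):
    """broader_pairs: iterable of (narrower_term, broader_term) pairs.
    Returns a tree as {broader_term: [narrower_terms...]}."""
    tree = {}
    for narrower, broader in broader_pairs:
        tree.setdefault(broader, []).append(narrower)
    return tree

pairs = [
    ("sea freight", "commodity transfer"),
    ("air freight", "commodity transfer"),
    ("spot inspection", "inspection"),
]
taxonomy = build_taxonomy(pairs)
print(taxonomy["commodity transfer"])  # ['sea freight', 'air freight']
```

The "semi-automated" qualifier matters: a human would still review which pairs are admitted, which is where the community of practice stays in control of the design.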
My last slide is to apologize for being complex and to say, “My goal is to make the complex simple.”
 My Roadmap for US Customs was completed in Jan 2005. URL:
 The transaction space assumes that all transactions are localized, and this is clearly not the case with human communication. However, the transaction space is that part of human communication that is localized. The causes of transactions are where the non-locality is primarily observed, making future transactions not dependent only on an understanding of the present and past transactions. This observation leads to the differences between Tim Berners-Lee’s notion of a Semantic Web and Paul Prueitt’s notion of an Anticipatory Web of Information Structure. URL:
 The cast of characters are each categorical elements. As the slide shows, there are the information structures produced from knowledge elicitation techniques like polls and interviews. There is taxonomy and controlled vocabulary. There are also the ontology web language constructions and the topic maps constructions.
 The “conflict” was over “reductionism”. RDF/OWL advocates claimed that there is no non-locality in the semantic space and did not recognize a pragmatic space at all. Topic maps explicitly recognized the non-locality of meaning, but had no intellectual framework powerful enough to overcome the demonstration of early utility by strong reductionism in the context of information spaces. So RDF deep-framed the situation, and all funding went into the RDF/OWL development.
 The question is obviously related to age old questions about the origin of conflict and misunderstanding.
 This almost perfect condition is theoretical. The easy observation is that one has to forget about many deep problems if one asserts that OWL is sufficient in the general case.
 One needs the non-locality that only humans can measure in real time.