
 

Thursday, January 26, 2006

 

Challenge problem →

Generative Methodology Glass Bead Games


Communication to part of the SOA Blueprint Technical Committee (at OASIS) → [144]

 

 

Extended remarks made by editing and extending my footnotes from [145]

 

I will review the W3C’s WS-CDL. The W3C has been good at top-down descriptions of what a standard should do, but it has not taken the observational position that J. S. Mill took, and that the “secret” four-decade Soviet project in applied semiotics took.

 

The observational methodology suggested by the Second School is to use the classical Greek interrogatives

 

 { Who, What, Where, When, Why, How }

 

to develop Blueprints.  The notion is that each Blueprint would be considered by itself (without thinking about re-use, for example) and would descriptively enumerate the essential concepts, relationships, and attributes felt to be useful by an individual or small group.

 

A wiki system would be ideal for evoking such an enumeration and allowing it to take place.
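The enumeration described above can be sketched as a simple data structure. This is a minimal illustration only, with invented names and entries; nothing here is prescribed by the Blueprint work itself.

```python
# A hypothetical sketch of one Blueprint as an enumeration keyed by
# the six classical Greek interrogatives.  Names and sample entries
# are invented for illustration.
from dataclasses import dataclass, field

INTERROGATIVES = ("Who", "What", "Where", "When", "Why", "How")

@dataclass
class Blueprint:
    """One Blueprint, developed by an individual or small group."""
    title: str
    # Each interrogative maps to a free-form list of the concepts,
    # relationships, and attributes felt to be useful.
    entries: dict = field(
        default_factory=lambda: {q: [] for q in INTERROGATIVES}
    )

    def add(self, question: str, item: str) -> None:
        if question not in INTERROGATIVES:
            raise ValueError(f"unknown interrogative: {question}")
        self.entries[question].append(item)

# Example: a Blueprint considered by itself, with no thought of re-use.
bp = Blueprint("Order fulfilment service")
bp.add("Who", "customer")
bp.add("What", "purchase order")
bp.add("How", "message exchange over a service bus")
```

The point of the structure is only that the six questions organize the enumeration; a wiki page per interrogative would serve the same purpose.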

 

The W3C is the First School.  There are many contributions made by this school of thought.  The Second School develops a true alternative, in the sense that the two viewpoints are likely to seem incoherent if expressed together.  (Such is life.)

 


The First School rushes to continue a pretense that logic (and the term “logic” no longer has a single meaning) can be imposed on what are being called semantic models.  The First School believes that technical solutions should drive the social acceptance of standards, so that society can benefit from these standards.  (Maybe this is a bit harsh.)

 

The Second School asserts that technical adoption should come second, and that the loosely defined Blueprints should be developed without technical adoption issues in mind.  What might happen is something called “bypass”: once the full set of Blueprints is developed (say 50 to 100), everyone sees a technical bypass.

 

 

I agree that the level of confusion in the market is not simple; however, in the kind of complexity theory my group represents, we see complexity as the simplest explanation of certain aspects of reality.

 

The appearance of complexity often arises simply where there is more than one interpretation for the same thing.  Whereas computer science and formal systems avoid natural complexity (and actually confuse the natural meaning of this term by attaching it to the phrase “computational complexity”), human beings use natural complexity all the time in everyday life.

 

It is also the conclusion of my client, a Canadian firm, that there is a space between the Reference Model and implementation.  This is where my current effort is focused.

 

 

A domain expert should develop a Blueprint without considering the technical issues involved in implementation.  (For more on this, I will compose → [147] when I have rested.)

 

 

According to C. S. Peirce and others, the notion of stratification is that there should be no relationship between the set of primitives and the set of compounds… like atoms and molecules.   This is the key to understanding stratified ontological modeling, where primitives are discovered using the methodology I am proposing.  Peirce is considered to have had an implicit “unified logic vision” (ULV), which I paraphrase as

 

“A concept is like a chemical compound; it is composed of atoms.”

 

One has to understand the differences between “atoms” and “compounds” to understand the ULV of Peirce (so it is said by some).

 

The first step is to have a set of molecules to talk about; the second step is to discover what the atoms are.  The epistemic gap that separates the two levels is absolutely necessary in order to address the need for flexibility and for real-time orchestration of responses that are “anticipatory” of the intention of the requestor.  This is the position that the Second School has taken.
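The two-step order above can be made concrete with a small sketch. The data and the discovery rule are invented for illustration (here, a constituent that recurs across compounds is taken as a candidate primitive); the sketch only shows the direction of the methodology, compounds first and atoms discovered second.

```python
# A hypothetical sketch of the two-step methodology: enumerate the
# "molecules" (compound concepts) first, then discover the "atoms"
# (candidate primitives) from them.  All data is invented.
from collections import Counter

# Step 1: a set of compounds, each described by the constituents an
# observer enumerated for it.
compounds = {
    "purchase order": {"buyer", "seller", "item", "price"},
    "invoice":        {"seller", "buyer", "price", "due date"},
    "shipment":       {"item", "carrier", "destination"},
}

# Step 2: discover atoms.  As a stand-in discovery rule, treat any
# constituent that recurs across compounds as a candidate primitive.
counts = Counter(c for parts in compounds.values() for c in parts)
atoms = {c for c, n in counts.items() if n > 1}

print(sorted(atoms))  # ['buyer', 'item', 'price', 'seller']
```

The stratification shows up in the code as the fact that `atoms` is computed from `compounds` only after the compounds exist; no atom is posited in advance.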

 

Paul Prueitt

The Taos Research Institute

Taos, New Mexico