
 

Wednesday, March 08, 2006

 

Challenge problem →

The Taos Discussion →

 

Generative Methodology Glass Bead Games

 

 

On ontological modeling of expression

 

The metaphor between gene, cell and social expression → [217]

On Formal versus Natural systems → [206]

 

Notes to colleagues

 

You are welcome to share this message within your group. 

 

My group's work has application in any transaction space, as is discussed in the BCNGroup's "digital Glass Bead Games",

 

http://www.bcngroup.org/beadgames/beads.htm  including (in theory) the complex control of hydrogen and nanotechnology measurement, manufacturing and distribution systems (a future economic system). 

 

 

 

As several leading workers fully understand, the key to current technology is a translation methodology between information models.

 

http://www.bcngroup.org/beadgames/generativeMethodology/186.htm

 

 

A formal theory, based on n-ary mediation, has now been established that can (provably) construct any fixed translation bridge. 
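
As a minimal sketch of what a fixed translation bridge can look like in practice, the following Python fragment maps records from one hypothetical information model to another through an explicit mediation table.  The field names and conversions are assumptions made for illustration; they are not drawn from the formal theory itself.

    # A minimal sketch of a fixed translation bridge between two
    # hypothetical information models.  Field names are illustrative only.
    BRIDGE = {
        "emp_id":    ("employeeNumber", str),
        "pay_grade": ("grade",          lambda g: g.upper()),
        "amount":    ("grossPayUSD",    float),
    }

    def translate(record):
        """Translate a model-A record into a model-B record using the fixed table."""
        out = {}
        for src_field, (dst_field, convert) in BRIDGE.items():
            if src_field in record:
                out[dst_field] = convert(record[src_field])
        return out

    print(translate({"emp_id": 1017, "pay_grade": "gs-12", "amount": "4321.50"}))
    # -> {'employeeNumber': '1017', 'grade': 'GS-12', 'grossPayUSD': 4321.5}

Whatever form the bridge takes, the essential property is that it is fixed and inspectable; whether the underlying information model is correct is a separate question, as the next paragraph notes.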

 

But this does not necessarily mean that the now-translatable information model is "correct", nor does the translation theory by itself simplify what is often not only technically incorrect but also excessively complicated.  The Business Centric Methodology, like OSD's Dr. Alberts' net-centric theory, is almost "HIP" (Human-centric Information Production).  HIP has a measurement and encoding component and treats the production (and propagation) of information as essentially a human activity, not an artificial-intelligence activity.  So translation is seen as not being sufficient, because the production of information is not situated (that is, it does not arise through the human-centric induction of information structure).

 

http://www.bcngroup.org/procurementModel/to-be/di.htm

 

It might be considered that the Second School work is grounded in the natural sciences in a fashion that suggests a community of natural scientists (Peter Kugler, members of the Rosen forum, Gerald Edelman (Neural Darwinism), and others) is able to state what has been, up to this point in history, an unrecognized limitation of formal systems.  This limitation is examined using Gödel's results in Sir Roger Penrose's (1989, 1994) work, but has been largely discounted by the huge academic presence that First School IT has established.  The work on entailment by Robert Rosen establishes the foundation of the Second School.  There is a natural evolution path from the First to the Second School. 

 

Because of recent developments in Second School theory, we have the means to clarify the differences between a natural system (the democracy in the United States - or Canada - or elsewhere) and a formalism (like the software that is used for federal payment of salaries).  The formalism is in fact a complicated, non-complex, finite state machine. 

 

 

http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter2.htm

 

No single designer built this finite state machine, and it is changing each day... but it is there.  
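
To make the characterization "a complicated, non-complex, finite state machine" concrete, here is a minimal Python sketch of a finite state machine.  The states and events (a toy payment workflow) are invented for illustration and are not taken from any actual federal system.

    # A toy finite state machine.  States and events are hypothetical.
    TRANSITIONS = {
        ("submitted", "validate"): "validated",
        ("validated", "approve"):  "approved",
        ("validated", "reject"):   "rejected",
        ("approved",  "disburse"): "paid",
    }

    def step(state, event):
        """Advance one transition, or fail on an undefined (state, event) pair."""
        try:
            return TRANSITIONS[(state, event)]
        except KeyError:
            raise ValueError("no transition from %r on %r" % (state, event))

    state = "submitted"
    for event in ("validate", "approve", "disburse"):
        state = step(state, event)
    print(state)  # -> paid

However large such a transition table grows, its behavior remains fully entailed by the table; in Rosen's terms it is complicated but not complex, which is the distinction being drawn here.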

 

It is not appropriate to call this "the Semantic Web" as advocated by Sir Tim Berners-Lee.  The reason is that the most vital part of meaning cannot be formalized and expressed as state transitions of a finite state machine.  That is simply, according to the Second School, not the way natural reality is.  Meaning has a pragmatic axis that lives only in the present moment, and which is thus observable in the formation of natural categories. 

 

Again, this is in reference to Rosen's work on entailment.  I call the Internet transaction space's finite state machine an "anticipatory web of information", but because the current First School is confused and excessively complicated, the anticipation is undirected. 

 

http://www.ontologystream.com/beads/nationalDebate/challengeProblem.htm

 

Once the Business Centric Methodology (your work on the OASIS BCM standard) is stratified and capable of forming information structure directly from human-to-human contractual agreements, the similarities-and-differences methodology (QSAR plus QAT) can be reviewed using Mill's logic.

 

http://www.bcngroup.org/area3/pprueitt/kmbook/Chapter2.htm
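
As a hedged illustration of the similarities-and-differences reading of Mill's logic, the following Python fragment applies Mill's methods of agreement and difference to a handful of invented observations.  The cases and circumstance labels are assumptions made purely for illustration.

    # Mill's method of agreement: circumstances common to every case in which
    # the effect occurs.  Mill's method of difference: among those, the
    # circumstances absent from every case in which the effect does not occur.
    positive_cases = [{"A", "B", "C"}, {"A", "C", "D"}, {"A", "C", "E"}]   # effect observed
    negative_cases = [{"B", "C", "D"}, {"C", "D", "E"}]                    # effect absent

    agreement = set.intersection(*positive_cases)
    difference = {c for c in agreement
                  if all(c not in case for case in negative_cases)}

    print("agreement:", agreement)    # -> {'A', 'C'}
    print("difference:", difference)  # -> {'A'}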

 

This "anticipatory" web of information is so different from artificial intelligence and semantic web, but delivers that part of what is advertised to be possible (distributed intelligence is emergent from human to human interactions) but has not been delivered (by Cyc Corp or others).  It does so by placing informed intelligent human discourse at the center of computer mediation of category theory.... (no costly theorem proving... merely the instrumentation of symbol systems (semiotics) arising form human awareness.)

 

Mill's logic was considered by the Soviet-era school of applied semiotics to be the means to establish a human-centric utility function over control systems.  Kugler, I, and a few others are the only ones in the States who carefully reviewed this work (1996 - 1998).  However, at the time there was no funding mechanism to support what is a totally disruptive technology (with respect to 1998 IT systems).  Nowadays our society is well into the transformation of what everyone regards as totally insufficient IT based on the First School. 

 

In any case, in the context of computational semantics, two parallel efforts are reasonable.

 

1) Ensure that funding opportunities create a pipeline of resources so that major issues are not left out due to under-commitment.  The work that I envision will fail at the weakest link, so all aspects of system interoperability and functionality have to be considered and a fast response made (for example, when funding is almost assured but not finalized).  Then one has to actually be able to do the work as promised - even if the business office does not feel that this is important after the contracts are signed.  (Measured IT success has not always been important in the past, so why should "we" actually do the work?)  In many cases this work is experimental, and the original language makes errors because of marketing language.  (Example: the term "semantic" is not qualified as having non-formalizable elements.  This mistake is in the current BCM draft specification - but only marginally when compared with the DERI efforts (W3C), which are First School.)

 

 

 

2) This effort (the Business Centric Methodology) is Second School in orientation and is at an advanced stage of development and an early stage of deployment.  The deployment will likely follow a minority activity that has been human-centric for decades, but which (due to the complexity) has found difficulty in fully deploying. 

 

My group feels that those committed to this minority effort would adopt the Second School if there were a web presence with wiki and collaborative tools, as well as a certification program (like the KM certification programs I was involved in from 1998 to 2004). 

 

Thus there is the possibility that a business centric methodology movement might arise in 2006.   This would involve specific technology treatments as well as specific knowledge management practices, such as lessons learned and measurement.  You know the field and what is happening, so you can see that this prediction is likely correct. 

 

The Defense Finance and Accounting Service work is focused on issues related to the interoperability of very well-defined and stable systems.  It is not, for example, focused solely on human-to-human communication, ontology-mediated knowledge management. 

 

Human-to-human communication, ontology-mediated knowledge management is what I have been proposing since 1996, and it is the only full solution to continuing issues of government response competence.  We know that this is the single most important political issue in the US currently.  Your BCM specification is a step towards a KM deployment and has within it well-isolated (stratified) KM functionality. 

 

But as you know, and I know deeply, the Congress and other Powers That Be (PTBs) constantly change the policy that controls how allocations of funds are made.  This constant change drives impedance mismatches between the "authority power" and a simple system (as defined in the Rosen literature). 

 

In 1998 - 1999, my group addressed this specific issue, expressed in terms of agile IDEF modeling, in the latter part of:

 

 

http://www.ontologystream.com/OS/MarketDelineation.htm

 

 

The simple system discussed in this note is complicated because of temporary adjustments layered on a massive system that disburses unimaginable amounts of real money.  One to two billion lines of COBOL code?  Federal expenditure on this critical set of programs is in the $100M range per year.  It works, but it is as it is.  One learns to use it as it is. 

 

The only solution is to stratify the policy mechanism and depend on the (registry-aware) negotiation of contracts (memoranda of understanding) that bind policy to a computational process.  In this way, complex systems are bound to simple systems with the understanding that there is a limited duration in which the contract is to be enforced.
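
One way to picture "binding policy to a computational process for a limited duration" is the following small Python sketch.  The class, field names, and dates are hypothetical; the only point being illustrated is that the contract carries its own enforcement window.

    # A hypothetical policy contract with a limited enforcement window.
    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class PolicyContract:
        policy_id: str    # the policy (memorandum of understanding) being bound
        process_id: str   # the computational process it is bound to
        effective: date   # start of the enforcement window
        expires: date     # end of the enforcement window

        def in_force(self, today):
            """The binding is enforced only within its stated duration."""
            return self.effective <= today <= self.expires

    contract = PolicyContract("pay-policy-42", "payroll-process",
                              date(2006, 3, 1), date(2006, 9, 30))
    print(contract.in_force(date(2006, 6, 15)))  # -> True
    print(contract.in_force(date(2007, 1, 1)))   # -> False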

 

The system opens up due to the expectation that a new contract will, at some point in the future, be announced.  Meanwhile, the simple (but hopefully not so complicated) "net-centric" process functions:

 

1) machine-like - actually a finite state machine with measurements over all aspects (a minimal sketch follows this list), and

2) having service-web interoperability with precisely those "other" machine-like processes where there are dependencies.
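
Here is a minimal sketch of what "measurements over all aspects" of such a machine might look like, assuming the toy transition-table style used earlier: every transition that fires is counted, and the counts are exposed through a simple query function that a dependent process could call.  The names are assumptions, not part of any deployed system.

    # Hypothetical instrumentation of a toy finite state machine: each
    # transition is measured as it fires, and dependent processes can query
    # the measurements through report().
    from collections import Counter

    TRANSITIONS = {("submitted", "validate"): "validated",
                   ("validated", "approve"):  "approved"}
    measurements = Counter()

    def measured_step(state, event):
        """Take one transition and record that it fired."""
        next_state = TRANSITIONS[(state, event)]
        measurements[(state, event, next_state)] += 1
        return next_state

    def report():
        """The interoperability surface: dependent processes read the counts."""
        return {"%s --%s--> %s" % key: count for key, count in measurements.items()}

    state = "submitted"
    state = measured_step(state, "validate")
    state = measured_step(state, "approve")
    print(report())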

 

 

 

Dr Paul S Prueitt

Taos New Mexico