Key questions on Common Upper Ontology

 

 

4/19/2004 8:44 PM

 

Note from Paul Prueitt to SICoP

 

My sense is that John Sowa's comment about replacing the word "ontology" with "software specification" was intended to express the frustration we all feel.

 

I get the impression that Patrick Cassidy is known to most on this list; I do not know him myself. The arguments he makes are strong and made with great feeling.

 

However, there are beliefs underlying his arguments that can be examined one at a time. Of course, much is said about the need for common usage of taxonomy and logic. I use the words "taxonomy" and "logic" here, rather than "ontology", to talk about knowledge representation of the type developed by various communities. I do this just temporarily, to make the point that many who study ontology see a parallel between "ontology" and "epistemology" on the one hand and "gene" and "phenotype" on the other.

 

This is not a philosophical issue, but one where we try to identify the science related to the expression of knowledge. The core element of the parallel is the phenomenon of emergence. The gene expresses as phenotype, and that expression involves emergence and thus is not reducible to Newtonian mechanics (or to first order logics). The ontology expresses as, among other things, a way-of-looking-at, an "epistemology". I hope we can all agree that these things are not well understood. My own work has been to bring some neuroscience to bear on what the human experience of mental events is (physically?) and thus to suggest what a formative process might be that produces knowledge representation via some type of machine / human mutual induction.

 

My notes make it clear that the natural science indicates, to me, that the induction is one way: facilitated by the computer, but nevertheless one way. In my work, Ontology referential base (Orb) constructions evoke mental state transitions in a weak fashion, but in a way similar to how language evokes mental event transitions. The simplicity of the Orb work is that the human mind is used in a mutual induction about the meaning of co-occurrence patterns. The types of meanings, and the nature of the relationships between things having meanings, are not something that can be addressed in a shallow fashion. And I am not sure who is talking about a project to develop this level of knowledge about social and psychological behavior. My sense is that most in the Semantic Web community have too much computer science and not enough natural science to be serious about this work.
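
To make the co-occurrence idea concrete, here is a minimal sketch in Python of how word co-occurrence patterns might be gathered from text and presented to a human reader for interpretation. It is an illustration under my own assumptions (whitespace tokenization, a small sliding window); it is not the actual Orb construction, whose details are not given in this note.

    # Illustrative sketch only -- NOT the actual Orb construction.
    # Assumptions: whitespace tokenization, window of 3 tokens.
    from collections import Counter

    def cooccurrence_pairs(text, window=3):
        """Count unordered word pairs appearing within `window` tokens."""
        tokens = text.lower().split()
        counts = Counter()
        for i in range(len(tokens)):
            for j in range(i + 1, min(i + window, len(tokens))):
                pair = tuple(sorted((tokens[i], tokens[j])))
                if pair[0] != pair[1]:
                    counts[pair] += 1
        return counts

    # The machine produces only the patterns; a human reader supplies
    # whatever meaning those patterns have.
    sample = "the gene expresses as phenotype the ontology expresses as epistemology"
    for pair, n in cooccurrence_pairs(sample).most_common(5):
        print(pair, n)

The point of the sketch is only that the machine produces the pattern while the human mind supplies the interpretation. That is the one-way induction I mean.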

 

It is not necessary that the nature of a computer be confused with that of a human mind, as if the two were of equal status. The induction of computer state transitions by humans in a tight feedback loop is something we all know will be future science. But right now we talk about something else. We talk as if knowledge representation can exist in a computer in a way that is similar to the way a mental event exists in the mind.

 

So the term "ontology", as used by most at the Semantic Web meetings, is not in line with what other kinds of scholars think the word "ontology" should mean.

 

 

I hope to pick just a single issue:

 

Patrick wrote:

 

<begin quote>

 

   Another of Irene's comments cites an issue often raised:

 

[IP]

 >  The attempts to develop common upper ontologies have been active for

 > many years, probably nearly twenty now, with so far no success. In some

 > ways Tim's ideas can be seen as a response to this lack of success.

 

     The short answer is that there has in fact been **NO** project to develop a common upper ontology that was adequately funded to do it in the only way that has any chance to succeed -- to bring together representatives from the dozens of different groups and communities that have been concerned with the substance of knowledge representation, to discover precisely where agreements and disagreements lie and to forge the maximum common knowledge base.  

 

<end quote>

 

Here is where my disbelief is most acute. What does one mean by "adequately funded"? How many millions or billions are we talking about, and over how long? And what has been the actual investment in time and resources on meetings and discussions so far?

 

But perhaps Patrick is talking about a deeper sense of the notion of ontology, one that would align with what neuroscience is comfortable with. If that is the case, then we need to go more slowly and examine some foundational concepts, even if the business folks and computer scientists are impatient with us.

 

Perhaps one point of discussion could be Tony Tether's testimony to Congress.

 

http://www.bcngroup.org/beadgames/graphs/eight.htm

 

Can the concepts he is talking about bring the discussion into a new light?

 

Patrick wrote:

 

<quote>

 

Some funding has been provided for knowledge representation languages, but very little for actual content.

 

</quote>

 

I think that John Sowa's note reflects this history. The nation, corporations, and individuals have been funding software specifications (or software languages, etc.), not content.

 

But perhaps the argument is that content should evolve organically, and that for organic evolution of content one needs a theory of ontology more deeply informed by natural science. By natural science I mean science that deals with natural complexity and the emergence of meaning as experienced (a reference to C. S. Peirce's work).

 

I have written about the view that information requires an interpretant in order to be experienced, and that this cannot be expected to occur in a silicon process with digital memories.

 

We can computationally simulate some of the aspects that we think reflect the physical and natural processes involved in the human experience of knowledge. So there is much to talk about.