
 

Wednesday, September 01, 2004

 

Manhattan Project to Integrate Human-centric Information Production

 

 

Stratified complexity and the origin of mental/social events (Prueitt 2002)

 

 

 

 

Conjecture on Stratification

 

A discussion between information scientists

 

 

Paul,

 

I'm not sure I understand the idea and having some real examples would be very helpful. 

 

The idea, as far as I understand it, is to use LSI to develop a smallish set of dimensions that characterize a domain.  Of course, the dimensions discovered may not map well to our human understanding of the domain.
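
To make that concrete, here is a minimal sketch of the LSI step in Python (the toy term-document matrix and the choice of two dimensions are purely illustrative, not drawn from any real system):

    import numpy as np

    # Toy term-document matrix: rows are terms, columns are documents.
    # In practice this would come from a large corpus, usually tf-idf weighted.
    A = np.array([
        [2.0, 0.0, 1.0, 0.0],
        [0.0, 3.0, 0.0, 1.0],
        [1.0, 0.0, 2.0, 0.0],
        [0.0, 1.0, 0.0, 2.0],
    ])

    # LSI is a truncated singular value decomposition of this matrix.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    k = 2                            # keep a "smallish" number of latent dimensions
    doc_space = Vt[:k, :].T * s[:k]  # documents in the k latent dimensions
    term_space = U[:, :k] * s[:k]    # terms in the same latent space

    print(doc_space)  # each row: one document's latent coordinates

The k dimensions are whatever directions of variation dominate the corpus, which is exactly why they may not map well onto human categories.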

 

I've only been skimming the messages but I also seem to pick up on the use of character n-grams -- is this right?  We've played with using character-level n-grams in some of our IR systems, with pretty good results.
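
For example, the overlapping character trigrams of a single word (the choice of n = 3 here is arbitrary):

    def char_ngrams(text, n=3):
        """Return the overlapping character n-grams of a string."""
        text = text.lower()
        return [text[i:i + n] for i in range(len(text) - n + 1)]

    print(char_ngrams("stratified"))
    # ['str', 'tra', 'rat', 'ati', 'tif', 'ifi', 'fie', 'ied']

Because such features depend on no tokenizer or stemmer, they tend to be robust to misspellings and morphology, which is much of their appeal in IR.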

 

You might take a look at http://swoogle.umbc.edu/  and see what you think.

 

Tim Finin, University of Maryland, Baltimore County

 

 

Tim,

 

I received your note several hours ago and have been reflecting deeply on the issues you raise, on their nature, and on the degree of care I need to take in responding.

 

I want to state the Conjecture on Stratification in such a fashion that those trained in computer science will understand it.  Doing so requires an extensive background in natural science, in the issue of complexity, and in the related natural phenomena of stratification and emergence.

 

I hold the greatest respect for computer science, but I speculate that there is a fundamental difference in how computer scientists and natural scientists think about the scientific method.  One way to approach this specific part of the discussion is to talk about the nature of induction and the nature of formal theories of deduction.  This is addressed elsewhere in these bead games.

 

I am speculating here, but the speculation rests on principled grounds that I have attempted to make clear in the past, and I will attempt to make clear again in this short note.  The longer version of the argument that computer science and natural science have fundamentally different objects of investigation is given in Chapter 2 of my online book.

 

Let me first address each theme in your note because, for me, these themes lead into an understanding of why a few of us conjecture that the nation needs a National Project to establish a new academic discipline.  The conjecture is just that, a conjecture.  We cannot claim to know so much more than everyone else, and I personally feel very uncomfortable with any notion that I know something that others do not.  Perhaps we can all agree that there is more to understand and much more to do, if the issues raised in these themes can be brought to an objective status.

 

Frankly, if the conjecture is even a bit interesting, then it is an important conjecture.  Why?  The reason is clear.  If living systems, and indeed any natural system, are situated between a substructure and an ultrastructure in their real-time manifestation, then the causes of system behaviors might be better understood if our science were comfortable with this kind of stratified view of natural process.  The alternative seems vague and uninformed by empirical observation, for example by the non-locality effects demonstrated in Bell's inequalities [1].

 

This new discipline might simplify the discipline of computer science, separate it from the industry that has arisen around information technology, and bring computer science back into the discipline of mathematics.  Mathematics and computer science share much in common: both are artificial constructions of the human mind in which notions of consistency and completeness have been strongly shaped by a specific philosophical tradition.

 

The renewal of mathematics and computer science education in grades K-12 has to deconstruct this philosophical tradition and reconstruct formalism for modeling complex natural systems.  Accounting and engineering will preserve much of the current formalism, and computer science will become more like accounting and engineering.  The new formalism for modeling complexity, emergence and stratification will have two sides: the human side, where intuition and induction are allowed their full power, and the formal side, where consistency and completeness issues provide absolute control and precision over those aspects of our real-time environments where such control and precision are appropriate.  Again, I have written more on this and have grounded the discussion in the literature, although I am personally very sad about how little work I have actually been able to do to make this position clear.  I hope to say more on this in bead [51].

 

There is an argument that, in each of the next few years, around 200 million federal dollars should be taken from the almost 1.3 billion in direct support for computer science departments and directed towards founding academic departments of a science of knowledge systems.

 

A K-12 curriculum would be the first task, undertaken at the same time as the establishment of a software consortium [2].

 

The software consortium would be funded with public funds and based on new programming languages and objective evaluations of the value of data mining and knowledge management techniques.  This program would not be governed by NIST, NSF or DARPA; instead, a university-based committee, appointed and/or elected to represent a broad spectrum of the natural sciences and the liberal arts, would govern it.  The consortium would make peer-reviewed judgments about ownership claims on software methods [3].

 

As part of the foundation for this new science, the federal government would create a simpler and non-commercial computer science, expressed in software that is both open source code and open source intellectual property.  This is not a call for socialism; the problem is not solely the current extreme control of computer science by capital resources, as exercised by Microsoft and others.

 

The claim is that computer science is an abstract construction whose roots have a fundamentalist nature and whose practice has reinforced scientific reductionism far beyond what most natural scientists are comfortable with.

 

Again, the way I am stating this conjecture is not perfect, and I must apologize.  I have stated these positions at greater length elsewhere.  Many scientists agree with me, in private discussions, about the core issues that I am representing.  But our past history has not been kind to those who have opposed the funding decisions at the government agencies and foundations.

 

I will now address the themes in your short note.

 

Your Artificial Intelligence lab at the University of Maryland, Baltimore County is one that I have admired, at a distance, for a decade.  But the background that is needed to understand stratified theory comes not from classical formal theory but from the literature of the biological and ecological sciences, from quantum theory, and from thermodynamics.

 

Most computer scientists have never read Maturana, Pribram or Edelman.  A few computer scientists have ventured as far as Stuart Kauffman's work on emergent computing, or other work in artificial life or genetic algorithms.

 

Most have adopted a common practice of ridiculing Sir Roger Penrose's criticism of the notion that an algorithm, running on a computer, can become sentient.  Perhaps fewer than 3% of all PhDs in computer science have heard of Robert Rosen's work on anticipatory systems.  Few have any knowledge of what J. J. Gibson developed as the foundations of what is now called ecological psychology.  Most computer scientists think that an artificial neural network is a three-layer back-propagation-of-error algorithm, and that this is a good model of how the human brain thinks.  Most think that Gödel's theorems on consistency and completeness in axiomatic systems are not relevant to computer science.
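
For concreteness, here is essentially all that the phrase denotes, as a toy Python sketch (the XOR task, the layer sizes and the learning rate are arbitrary); the reader may judge whether a few dozen numbers adjusted this way constitute a model of how the brain thinks:

    import numpy as np

    rng = np.random.default_rng(0)

    # A "three-layer" network: input -> hidden -> output, trained by
    # back-propagation of error.  The whole model is a few dozen numbers.
    X = np.array([[0., 0., 1.],            # inputs for the XOR task;
                  [0., 1., 1.],            # the constant third column
                  [1., 0., 1.],            # serves as a bias input
                  [1., 1., 1.]])
    y = np.array([[0.], [1.], [1.], [0.]]) # targets

    W1 = rng.normal(size=(3, 4))           # input-to-hidden weights
    W2 = rng.normal(size=(5, 1))           # hidden(+bias)-to-output weights
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(10000):
        h = np.hstack([sigmoid(X @ W1), np.ones((4, 1))])  # hidden layer + bias
        out = sigmoid(h @ W2)                              # output layer
        d_out = (y - out) * out * (1 - out)                # output error signal
        d_h = (d_out @ W2.T) * h * (1 - h)                 # error sent backwards
        W2 += 0.5 * h.T @ d_out                            # gradient steps on
        W1 += 0.5 * X.T @ d_h[:, :4]                       # the squared error

    print(out.round(2))   # should now approximate the XOR targets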

 

And most computer scientists do not take the time to read very much of my work, of course.

 

So I can understand when any computer scientist, no matter how distinguished, says that he or she is not sure that he or she understands my work.

 

The idea is NOT to use Latent Semantic Indexing to reduce data streams into channels that exactly correspond to the substructural affordances of a complex system.  The idea is deeper than this and seems, at least in my mind, to go on without end.

 

The conjecture is simply as stated:

 

Conjecture:  Periodic tables exist for each type of naturally occurring complex system.  Each of these periodic tables has a small number of "atoms" and each of these atoms has a specific set of valences/affordances from which one can predict the function of compositions of these atoms in natural settings. 
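
For a computer scientist, the shape (and only the shape) of this conjecture can be rendered as a schematic data structure; the atom names and affordances below are invented placeholders, not claims about any actual periodic table:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Atom:
        name: str
        affordances: frozenset   # the valences: what this atom can offer or bind

    @dataclass
    class Compound:
        atoms: tuple

        def predicted_function(self):
            # The conjecture in schematic form: the function of a composition
            # is predictable from the affordances of its constituent atoms.
            # Here that prediction is a bare union; any real theory would
            # model how the affordances interact.
            out = set()
            for atom in self.atoms:
                out |= atom.affordances
            return out

    # Hypothetical atoms of some domain's "periodic table"
    a1 = Atom("atom-1", frozenset({"binds-left", "signals"}))
    a2 = Atom("atom-2", frozenset({"binds-right"}))
    print(Compound((a1, a2)).predicted_function())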

 

Ben makes an additional comment that goes to the limitations of any numerical model of substructural semantics.  [50]

 

I will also add here a link to some communications that I have had with M-CAM and others about a substructural ontology for evaluating any software patent.

 

 

 



[1] Bell's inequalities point to fundamental results in quantum theory concerning a gap between how one understands the classical behavior of particles and how one understands experimental results.

[2] In a side discussion on mathematics education, a professor of education and I have been exchanging correspondence about why mathematics education as a discipline is in such poor shape.  My colleague's comment startled me.  He suggested that only 5% of professors of education were actually engaged with the obvious massive failure in mathematics education.  My experience is that most feel that the mathematics curriculum is not learnable by most students and has become a type of endurance test.  See www.bcngroup.org/AQA

[3] The Patent and Trademarks Office has awarded software patents in an inconsistent and arbitrary fashion, producing confusion about who owns what, when almost every software design is deeply flawed.  The "simplification of computer science" will occur as a natural outcome of the reductionist nature of computer science.  This is discussed at: InOrb Technologies bead seven