Thursday, September 02, 2004
Manhattan Project to Integrate Human-centric Information Production
Stratified complexity and the origin of mental/social events (Prueitt, 2002)
Conjecture on Stratification
A discussion between information scientists
Note from Ken Ewell, Readware developer
To BCNGroup Director, Paul Prueitt
One thing that distinguishes Letter Semantics from other approaches is that Letter Semantics does not define any concepts. My question about any work on "semantic primes or concepts" is: how do children learn them by age 2 or 3? What concept does a child of 7 or 8 have of time, space or logic?
Anna Wierzbicka's "semantic primes" include higher-level concepts like time, space and logic. This is not what the substructural ontologies are doing.
What attracted me to Dr. Tom Adi's work was that Letter Semantics is pre-conceptual (if we take a concept as something partially drawn out in the imagination).
The Letter Semantics matrix, or table, purely and simply encodes all the possible ways in which the components (letters, phonemes) may be combined. It does not say that they will be combined, nor does it predict why they are combined or why some combinations make sense while others do not.
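The combinatorial character of such a table can be illustrated with a toy sketch. This is not Adi's actual matrix; the component inventory below is made up purely for illustration:

```python
from itertools import product

# A toy illustration of a combination table (NOT Adi's actual matrix):
# given a small hypothetical inventory of components, enumerate every
# possible ordered pairing.  The table records what CAN combine, not
# what WILL combine, nor why a given combination is meaningful.
components = ["b", "d", "m", "r"]  # hypothetical letters/phonemes

table = [(a, b) for a, b in product(components, repeat=2)]

print(len(table))  # 4 * 4 = 16 possible ordered pairs
```

The point of the sketch is only that the table is exhaustive over possibilities; selecting which combinations actually occur in a language is a separate matter that the table itself does not decide.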
The "semantic primes" of Letter Semantics are the abstract categories of identification (the capability to identify), manifestation (the fact that things appear/disappear) and order (sequence, repetition). I can readily imagine how a child begins life by learning how to identify others and themselves. Having raised five children and now having seven grandchildren, I have personal and tangible experience.
I have engaged with each of them in the process of pointing out their nose and toes, fingers and feet, mouth, eyes and ears, until I could ask "where is your nose?" and the child would touch their nose. Children learn how to identify themselves, their siblings, their parents and their grandparents -- from all others -- in the first months of their life. The emergence of this capability to identify is a significant event, and identification is primary in this sense.
Not only that, one can see how a child gets accustomed to the comings and goings of their surroundings: while at first a child may protest at someone leaving, they soon learn that the person will reappear. What impact do you suppose there is on someone's psyche the first time someone or something leaves and does not return? The answer, of course, is that it varies according to a person's psychology and character. I would argue that how others relate to the child who is missing one parent can train that behavior (in the child), however unintentionally. The important thing is that it is of significance -- ergo it is a semantic prime.
The third abstract category, order, is necessary to numbering, sequence, time, and many other notions and concepts. The ordering phenomenon is a very simple but significant component of the organization of the substructural ontology.
Order is a semantic prime because it is not at all necessary for one to reason about order in order to appreciate the capability to sense it. In my opinion, one must actually reason about the concepts Anna Wierzbicka calls semantic primes, and therefore, by that assertion, they cannot be primary.
Comment by Prueitt:
C. S. Peirce referred to this "loss of meaning" that occurs when one goes from a concept to the atoms that are aggregated into the concept. The scholarship on this "loss of meaning" centers on what is called Peirce's Unified Logic Vision (ULV): a concept is like a chemical compound; it is composed of atoms.
The ULV is not understood unless one sees stratification and double articulation as an ontological commitment. This was the commitment, I claim, that motivated four decades of classified research on Q-SAR (qualitative structure-activity relationship analysis) in the Soviet VINITI group, including D. Pospelov and V. Finn. This work led to the development of quasi-axiomatic theory and related works.
We used our table for statistical analysis of texts and made a wonderful retrieval system with it. It had great recall, though the precision was not so stellar. You have a report on some of what we learned from the statistics (in Letter Semantics 96).
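Recall and precision, the two measures just mentioned, are the standard information-retrieval measures and can be computed as follows (a generic sketch with made-up document sets, not Readware's actual evaluation):

```python
def recall_precision(retrieved, relevant):
    """Standard IR measures: recall is the fraction of relevant
    documents that were retrieved; precision is the fraction of
    retrieved documents that are relevant."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    recall = len(hits) / len(relevant) if relevant else 0.0
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    return recall, precision

# Made-up example of the pattern described above:
# high recall (everything relevant was found) but lower precision
# (half the retrieved documents were irrelevant).
r, p = recall_precision(retrieved=[1, 2, 3, 4, 5, 6, 7, 8],
                        relevant=[1, 2, 3, 4])
print(r, p)  # 1.0 0.5
```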
I told you we also learned about the ambiguity introduced by language change (trains/trends) and word adoption between cultures. Nathan’s work on ambiguation/disambiguation in Orb conceptual roll-up has similarities to what we found.
The problem with using automated text analysis to harvest concept structures is that you must assume the text has good or correct concept structures and enough of them to produce the widest interpretation of all the possible concepts. Where do you find a text like that? In what language should it be written?
In the end, we found that the distribution of words in a language (in texts) tells us nothing about the need or purpose of individual concepts. And further, the distribution of a given concept is irrelevant.
What is significant is that humanity has concepts at the ready and individuals can coin a new word or a new idea by combining age-old word roots and their phonemes into any new word we need.
I must correct a misunderstanding that originated with me. There is room for 4000 concepts in the Readware software we use today. We have found only slightly more than 2000.
Some concepts that we found are so under-used that we did not include them. The root for Pharaoh, for instance, is not included. This root has four letters, as do a few others. The point is that we did not decompose the human mind or conceptual understanding as some have tried to do. I hope to make this clear if and when the special project at NSA starts.
We simply singled out all the root forms and organized them in ways that our process methodology suggested, and then reified what was found.
Tom Adi is the scientist/innovator on whose work Readware letter semantics is based.
Please refer to Figure 6 in Prueitt's chapter on Pospelov's work for this substructural ontology.
Burch, Robert (1989). A Peircean Reduction Thesis: The Foundations of Topological Logic. Lubbock, Texas: Texas Tech University Press.
Burch, Robert (1996). Introduction to Modern Peircean Logic with Applications to Automated Reasoning. Presented at the QAT Teleconference, New Mexico State University and the Army Research Office, December 13, 1996.