Anticipatory Technology with Fractal Logical Entailments

 

Paul S Prueitt, PhD

Thursday, December 20, 2007

revised slightly March 30, 2009

 

Introduction

 

This outline was a first effort to integrate threads of basic research developed independently by a number of scholars. Several schools of thought are to be integrated. I wish to acknowledge the origins of this informal collaboration, and to use proper attribution. My wish is to empower a deeper scientific discussion than would otherwise be allowed under the standard non-disclosure agreements (NDAs) that so plague modern science. However, some attributions are not possible. The communities involved are diverse and scattered, often working independently and without knowledge of one another. My use of twitterLite technology has assisted in my work. There does seem to be a new school of thought arising about how computing and human communication will be coupled. This school of thought is now referred to as the second school.

 

The work proposes to develop formal models, using elements of stochastic theory and an encoding process into an n-ary web ontology language. [1] For example, Laskey et al. proposed in 2005 a straightforward extension of the OWL [2] standard that encodes probability patterns, for Bayesian analysis or perhaps other kinds of stochastic knowledge construction. This n-ary ontology language is called PR-OWL, with the 'PR' standing for probability. My work on the blank slate Internet builds from the PR-OWL constructions, while adding the elements of tri-level architecture and quasi-axiomatic theory, also called topological logic. I discuss topological logics in chapter six of my book 'Foundations'. [3] Laskey's work has some similarity to several potential software designs, several of which my colleagues and I have studied.
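
As a rough illustration only, and not PR-OWL syntax, such an n-ary probabilistic construction can be pictured as a named random variable over several arguments, together with a conditional probability table keyed by the joint states of its parents. The minimal Python sketch below is an assumption made for exposition; the relation and state names loosely echo the starship example used in the PR-OWL papers.

```python
from dataclasses import dataclass, field

@dataclass
class ProbabilisticRelation:
    """Illustrative stand-in for a PR-OWL style n-ary random variable:
    a relation over several arguments plus a conditional probability
    table keyed by the joint states of its parents."""
    name: str
    arguments: tuple                          # ordered argument types (n-ary)
    parents: tuple = ()                       # parent random variables
    cpt: dict = field(default_factory=dict)   # parent states -> distribution

# Hypothetical instance; argument and state names are assumptions.
danger = ProbabilisticRelation(
    name="DangerToSelf",
    arguments=("Starship", "TimeStep"),
    parents=("HarmPotential", "OperatorSpecies"),
    cpt={("High", "Cardassian"): {"True": 0.90, "False": 0.10},
         ("Low", "Unknown"): {"True": 0.05, "False": 0.95}},
)
print(danger.cpt[("High", "Cardassian")]["True"])   # -> 0.9
```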

 

We wish to define an economic vehicle that, if used, might allow the expression of a fractal entailment [4] basis for anticipatory technology. [5] This expression could have great economic potential, but it faces solid resistance from the first school of thought (related to the AI polemic). The potential may be examined formally; in fact, this examination was undertaken using virtualization technology, significant computing resources, and work in progress in my lab. It may be predicted, from formal work now being funded, that markets will eventually exploit this potential, for reasons we wish to make clear. There are two approaches to making this clear. The first is philosophical argument. The second is the building of an actual computer-based distributed platform in which mechanisms for a new economy are in place.

 

As discussed below, nature appears to organize around self-organization within organizational levels. This is the essence of stratified computing theory. In particular, the machinery used by living brains in creating an anticipation of real-world experiences is discussed in my 1988 dissertation and in a short paper. [6] These mechanisms have so far not been incorporated in artificial intelligence systems, because the mechanisms explicitly acknowledge a non-algorithmic aspect to human consciousness, as discussed by Roger Penrose [7] and others; see in particular the work of Robert Rosen.
 

We wish to make technical support for an anticipation of real-world experiences common, available to all individual humans, and without ownership. This means the "termination of ownership" over all fundamental notions and formalisms of computer science and discrete mathematics. We propose that an integration effort should occur in a framework that is not classified and which will be used by everyday people in the context of consumer markets.


Why? There are two answers, one political and the other moral. Without public awareness of the issues raised regarding collective intelligence, it is unlikely that certain constitutional rights and national security issues can be reconciled.

 

Consistent with the design of the 'glass bead game' theory, [8] the governing body will not be public. The reason has to do with attribution and the private effort required of the founders to create the technology and procedures for bead game play. The bead game design has had, since 1994, three groups of players. The inner group is composed of real bead masters. These masters must be known to each other and have agreements regarding collaboration and intellectual property. The bead games of the masters will be recorded and then archived as examples for others to study.

 

Our work has followed a set of protocols in our research deployment of an anticipatory technology. These protocols will be understood the world over, due to the popularity of Hermann Hesse's book, "Magister Ludi: The Glass Bead Game". The protocols will lead to agreements consistent with the BCNGroup Charter. [9]

 

A fractal logical framework is to be created, based on an integration of selected scholars' works. This integration is not so great a task when compared with the work already completed. The deployment of the framework is then an outcome of a small amount of work to be completed soon. Initially, a specific group of individuals is invited to have membership in the governing body. This body will reflect the interests of scholars.

 

Specific objectives include finding funding to run a two-week think tank this summer (2008), in Canada, the EU, or another location, and supporting preliminary work on Anticipatory Technology with Fractal Logical Entailments. We have found a funded position for two years to pursue this work. Additional work is being done under the heading of the second school.

 

Game play

 

First, we state the principal assertion of the second school. It is essential to the second school's intellectual position that we assert, on the basis of specific arguments made elsewhere, that the individual human, or living system, is capable of awareness. We also assert that computer programs are not.

 

Now we may address the framework we propose for bead game play. A technology will be developed that rewards participants when an anticipated conceptual formulation is judged by a fractal logical framework to be "in play". To be in play, a new composition must have a fractal linkage to previous beads, expressed in similarity measures, as in the sketch below.
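
A minimal sketch of that judgment, under the assumption that 'fractal linkage' can be approximated by a pairwise similarity score against the archive of prior beads; the Jaccard measure and the threshold are illustrative placeholders, not the framework itself.

```python
def jaccard(a, b):
    """Crude lexical similarity between two bead texts."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / max(len(sa | sb), 1)

def in_play(new_bead, archive, threshold=0.3):
    """A composition is 'in play' when it links, by similarity,
    to at least one previously played bead."""
    return any(jaccard(new_bead, old) >= threshold for old in archive)

archive = ["stratified anticipation in living systems"]
print(in_play("anticipation in stratified living music", archive))  # True
```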

 

A review of the BCNGroup bead game design is helpful. [10]

 

Similar to how the wiki works today, an evolving body of knowledge is to be developed by individuals acting anonymously as members of a community. The play of the game is to be orchestrated by the Magister Ludi, consistent with the behavior expressed in Hermann Hesse's novel. The Magister Ludi is a functional role with fictional characteristics revealed in Hesse's novel.

 

We have prototyped the fractal framework with digital music inputs. This prototype is far simpler than a full Hermann Hesse-type bead game with linguistic processors. The branding language includes the phrase, 'let the bead games begin with music'. The business plan includes the development of 25 channels of satellite audio supporting real-time orchestration of individual creative musical expression. The music d-GBG will be a global jam session, experimenting with the notion that real-time expression is not the same as a recorded digital product. A Creative Commons license will be created to terminate ownership of the output of the music d-GBG. Artists will be compensated with attribution and with money proportional to the number of listeners. The technical details are held in common between a group in Monterey, California, and the BCNGroup.

 

An essential element found in the contributing scholars' works is the concept of stratification and fractal expression. We observe that the expression of humans, social systems, and other systems of living beings involves more than what deterministic mechanics allows by itself. I have developed a software interface design based on the encoding of human responses, and of class-subclass taxonomy, into a logic system having a fractal entailment.

 

The underlying architecture for an intelligent back plate is tri-level and anticipatory in nature. Several supporting ideologies are available, but each will be examined closely by the BCNGroup Founding Committee. For example, Neuro-Linguistic Programming (NLP) principles will be used only as a primer designed to teach about self-limitation. NLP principles will be deepened using the Myers-Briggs typology and a number of other systems for archetyping the individual. Deep linguistic theory developed by Adi and Prueitt [11] will be combined with a general framework theory [12] to produce a knowledge operating system design to be used by a single human user. The point is made that all advanced work on semantic technology is to be introduced with clear exposition of the issues involved.

 

A digital generative back plate (dBP) will be possible, using the technology designs developed by Prueitt. [13] This concept involves the use of distributed compression dictionaries, a linkage between compression and decompression, and the development of indexing based on, as discussed above, a logic system with constructions similar to the PR-OWL language, equipped with Soviet-era topological logics and an inference engine based on the extended Mill's logic (Prueitt, Foundations, Chapter Six). A measurement process will be used at several levels of organization. For example, the individual, the emerging group, the stable group, and so on each have images of self that are layered and nested within other systems. This work builds on work by Prueitt and Stephenson (2005) [14] and Prueitt (2004) [15].
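
The linkage between compression and indexing can be sketched as follows. The n-gram dictionary here is a hypothetical stand-in for the distributed compression dictionaries of the dBP design, showing only that one shared table can serve both roles.

```python
from collections import Counter

def build_dictionary(texts, n=4, size=256):
    """Shared dictionary of the most frequent character n-grams."""
    counts = Counter()
    for t in texts:
        counts.update(t[i:i + n] for i in range(len(t) - n + 1))
    return {gram: gid for gid, (gram, _) in enumerate(counts.most_common(size))}

def build_index(texts, dictionary, n=4):
    """The same dictionary ids used for compression double as index keys."""
    index = {}
    for doc_id, t in enumerate(texts):
        for i in range(len(t) - n + 1):
            gid = dictionary.get(t[i:i + n])
            if gid is not None:
                index.setdefault(gid, set()).add(doc_id)
    return index

docs = ["fractal entailment", "fractal compression of text"]
d = build_dictionary(docs)
idx = build_index(docs, d)
print(idx[d["frac"]])   # both documents share this substructure
```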

 

How is this work to be understood?

 

Anticipatory technology with fractal logical entailments may not be so difficult to understand. Some conceptual imagery is possible, and this may help.  

 

Imagine a fractal encoding of a digital picture. In this fractal encoding there is, in fact, an inference (or entailment) mechanism. The fractal code is a small set of transformations (each a small matrix) used to regenerate the color and intensity of individual pixels in the digital image. In the decompression of a fractal-encoded digital image, we may iterate beyond a certain point to guess at what is not seen in the original image. Knowledge might have a similar digital representation.

 

The fractal logical entailment is a mechanism that processes linguistic input. The mapping of this representation to various knowledge frameworks will follow the work by Prueitt, by Adi and Prueitt, and by Prueitt and Stephenson.

 

In the digital image, the retrieval mechanism is simply iterative: the number of iterations allowed in the decompression determines when the image is said to be complete. If the number of iterations is set higher, then one may see in the new digital output additional detail that was not in the original image. This is the principle that is exploited by our work on fractal logical entailments.
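
A toy one-dimensional sketch of this idea, assuming a partitioned iterated function system of the classic Jacquin type; it is illustrative only, not the codec used in the proposed work. Decoding iterates a contractive transform from an arbitrary starting signal, and the iteration count controls how fully the attractor's detail is expressed.

```python
import numpy as np

def encode(signal, rs=4):
    """Toy 1-D fractal encoder: approximate each range block by a
    contrast-scaled, brightness-shifted copy of some domain block
    (twice as long, downsampled by two)."""
    domains = [signal[d:d + 2 * rs][::2]
               for d in range(0, len(signal) - 2 * rs + 1, rs)]
    codes = []
    for r in range(0, len(signal), rs):
        rng = signal[r:r + rs]
        best = None
        for di, dom in enumerate(domains):
            s, o = np.polyfit(dom, rng, 1)        # least-squares fit
            s = float(np.clip(s, -0.9, 0.9))      # keep the map contractive
            err = float(np.sum((s * dom + o - rng) ** 2))
            if best is None or err < best[0]:
                best = (err, di, s, float(o))
        codes.append(best[1:])
    return codes

def decode(codes, length, rs=4, iterations=8):
    """Iterate the transform from a blank signal; the fixed point
    approximates the original, and extra iterations keep generating
    self-similar detail that was never stored explicitly."""
    sig = np.zeros(length)
    for _ in range(iterations):
        domains = [sig[d:d + 2 * rs][::2]
                   for d in range(0, length - 2 * rs + 1, rs)]
        new = np.empty(length)
        for k, (di, s, o) in enumerate(codes):
            new[k * rs:(k + 1) * rs] = s * domains[di] + o
        sig = new
    return sig

original = np.sin(np.linspace(0, 6, 64)) + 0.1 * np.random.randn(64)
approx = decode(encode(original), len(original))
print(float(np.mean((original - approx) ** 2)))   # small reconstruction error
```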

 

One of the paradigmatic assertions of the second school is that phenomena express themselves at various time scales in a self-similar fashion. This means that anticipation, including human intuition, is built to be sensitive to these patterns.

 

In summary: in our mutual theory we see that a fractal entailment might actually underlie physical existence, and thus be responsible for what we regard as our human sense-making and inductive capability. The principle is applied to the reification of ontological structure, composed of universals and stated as concepts, from human beings' experience of particulars. Of course, the technique is not as simple as the use of fractal compression, but the iterative expression mechanism described above for the digital image is the principle exploited, here applied to linguistic input.

 

In the next section we address an area of active application. This area has seen success in several important economic sectors, in particular medical science.

 

Application of anticipatory technology in the automated understanding of research literatures

 

The continuous pursuit of knowledge has resulted in classification and in the development of a variety of specialized disciplines of knowledge. These pursuits benefit from the advancement of the analysis and understanding of various causes. These causes include what is often referred to as natural law, gravitational effects, and so on, but they also include social causes and personal expressions of free will. By causes, we mean the full entailment of phenomena of any kind.

 

Collectively, the results from the human pursuit of knowledge form the sum of our perceived knowledge. This sum represents our collective attempt to explain and further our discoveries. There have always been issues of self-limitation related to the advancement of science. These issues are important to our proposed use of anticipatory technology. Automated and systemic processes are attempting to synthesize the advances developed by scholarship. As this process matures, we are faced with issues related to what might be called the 'rational model'. The question arises about the possibility of a 'theory of everything'.

 

During the period 1994-2009, my research proposals [16] discuss the phenomenon of coherence in the context of self-limitation and the human need to act rationally within some viewpoint. Scientists have asserted that the human sense of rational coherence and viewpoint is part of human discovery. This sense of rational coherence can, however, be the cause of barriers to understanding two viewpoints, with separate cultural groundings, at the same time. We assert that there is not, and cannot be, a single viewpoint that is fully universal. This assertion is consistent with the linguistic theory proposed by Benjamin Whorf. [17]

 

Various technical challenges, in the context of web ontology languages, are related to the issue of conceptual coherence and rationality. These challenges are seen as philosophical, and that perception is part of the barrier we have found. The challenge is real. Without addressing this challenge, the inclusion of probabilistic or stochastic models of knowledge, as in Laskey et al., will not completely resolve the knowledge acquisition problem. The challenges are seen in failures to define a well-specified web language for the merging of taxonomy and description ontology. These failures are also at the root of collective responses to fundamentalism, and examining them helps bring social awareness to those roots. The technical support for shifting viewpoint is seen, also, in everyday living.

 

In many current data management systems, taxonomy and description ontology are used to organize textual data. However, so organized, the data does not fit within a fractal or anticipatory framework. Current data systems have limitations because of fixed organization and because there are no substructural generative mechanisms. The most common challenges are seen in failures to achieve reconciliation of cultural and personal conflicts. The approach I am taking redefines these challenges and bypasses the merge and discovery concerns framed by the W3C standards for description logics and RDF. The Topic Map paradigm is used more fully, but in new ways. The back plate is to be realized.

 

The issue of rationality has many manifestations. However, perhaps nowhere are the positive and negative aspects of rationality seen more than in how we manage our cultural knowledge. Human discoveries address various sequences of past events and enable an anticipation of future events. This occurs both formally, expressed as mathematics, and informally. Up to now, mathematics has been used to model only those events that can be described in deterministic terms. Some of the classifications for these disciplines are engineering, chemistry, biology, economics, and the health sciences. But there are also other disciplines, such as religion, politics, and various belief systems, including music and literature. All of these classifications are organized into disciplines, each with a unique viewpoint in which limitations are intuitively understood. Can all of these be modeled using mathematics as classically understood? In the foundational work, in logic and in set theory, we find that this possibility has some constraints.


The new science

 

We are creating automated frameworks supporting our collective understanding of the complexities of 'total knowledge'. An excellent example is the key bioinformatics cell signal pathway and gene expression ontology. [18] In making this effort we 'differentiate' and classify. Often, if not always, this differentiation uses perspectives expressed with contextual nuance. Contextual nuance has always been critical, and will remain so in future automated synthesis of human knowledge.

 

Mechanisms need to be in place that account for contextual phenomena. Our proposal will deploy such mechanisms.

 

We pursue a deeper understanding within each respective field of specialized knowledge. This is an ontologically assisted extension of normal scholarly activity. The knowledge we seek also includes results from psychology and sociology. We seek a better comprehension of self, of our own selves and the selves of others. In particular, we are interested in clear knowledge regarding the fundamentals of human behavior. Again, we see contextualization as an essential part of the experience of knowledge, and even more so when we attempt to understand self. The contextualization seems to need to shift as one moves from one focus to another. By developing contextualization mechanisms we hope to properly focus our integrated work on mining emerging scholarship. The purpose of this work is to accelerate our ability to express positive collective activities.

 

The requirement for advanced methods arises because the emergence of new thought requires a sorting of sub-thematic structure into categories and context. A discussion of NLP methods will be developed in the context of a criticism of the science up to this point.

 

Various techniques are proposed that involve the further categorization and differentiation of self-similar components, and the evaluation of the extent of these components' causal relationships and perceived impacts upon individual and group behavior. In the glass bead game terminology, these components are the glass beads being put into play by the bead players. The bead game provides the social context for the development and deployment of very advanced collaborative technology in the presence of advanced knowledge management technology. These collaborative technologies are formative and agile, and they create NLP-like interfaces where individual action-perception cycles provide the formative energy. Thus the back plate supports individual self-directed discovery.

 

The development of individual knowledge may proceed based on process models. Models of this type are still subject to a high degree of controversy, and are not yet used as widely as one might expect to see in the near future. How is science to be advanced, if it is to advance beyond materialist roots? How does one develop science about the nature of human cognition and awareness? Agility is needed. Such an agile process model may be seen in scholarship on topological logic. [19] Other process models are being used in everyday enterprise management. I have some experience with all of these models.

 

How is self-directed discovery enabled?

 

Some of the modeling processes differentiate rather than integrate. Differentiation is required to further the understanding of the relative importance and influence of the various sub-components. The process of differentiation is in fact a process that produces what I have called 'categorical abstraction'. The formation of categorical abstraction [20] then results in a substructural ORB (ontological referential base). With ORB encoding we have a provably minimal data encoding, and thus one more level of innovation and utility. Other properties of my key-less hash table technology are available for use in the tri-level architecture, which has the selective attention and orientation mechanisms discussed in my 'Research Proposal'. When these mechanisms are commonly available, the individual may feel empowered by the play of the bead games.
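
A minimal sketch of categorical abstraction, under the assumption that raw measurements can be collapsed into category symbols and that the ORB stores only the resulting ordered pairs; the binning function and the set-based 'key-less' store are illustrative placeholders, not the author's ORB implementation.

```python
def categorize(value, bins=(0.0, 0.5, 1.0)):
    """Collapse a raw measurement into a coarse category symbol."""
    for i, edge in enumerate(bins):
        if value <= edge:
            return f"c{i}"
    return f"c{len(bins)}"

def build_orb(measurements):
    """Ontological referential base: the set of ordered category pairs
    observed in the data stream. Storing the pair itself, with no
    separate key, is the spirit of a key-less table."""
    cats = [categorize(v) for v in measurements]
    return set(zip(cats, cats[1:]))

stream = [0.1, 0.7, 0.4, 1.2, 0.3]
print(build_orb(stream))   # e.g. {('c1', 'c2'), ('c2', 'c1'), ('c1', 'c3'), ('c3', 'c1')}
```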

 

It has also long been recognized by sociologists that decision-making processes, especially when carried out in a climate of uncertainty, are not just products of rational judgment, but also reflect heuristic shortcuts that are susceptible to individual biases. Our group's proposals follow classic work on the levels-of-organization hypothesis and the epigenetic principle. [21]

 

Following Bertalanffy's work, Prueitt's stratification theory argues against the concept of reducing higher levels of complexity to lower levels. An interactive model developed within stratification theory may best capture and describe the decision-making process.

 

Comparison to other methods

 

Certain criticisms are made regarding numerical models of concepts, and regarding the logical entailments popular in web ontology languages and in the schools of artificial intelligence. This criticism suggests setting aside hard forms of knowledge engineering in favor of an alternative long advocated. The alternative is called the second school. [22]

 

Our methodology has an inherent potential to formulate and analyze logic-based problems and dilemmas, which exist in real life, in a more structured and complete fashion than other existing methodologies. Most of these methodologies are either number-based or heuristic in nature. Numerical and heuristic methods include most of the conventional engineering methods of artificial intelligence, methods which are also used in sociology, physiological medical treatments, interpersonal and inter-social conflicts, and so on. The methodology can address problems and issues so as to further enhance the nature and scope of social-stratification dimensions (including power, prestige, and wealth), the systematic treatment of group life, social institutions, social problems, social change, and social control. This methodology should be revealed in game form, so that scholars might shine light on constitutional issues.

 

Therefore it is important to review and compare current state-of-the-art modeling techniques, such as expert systems, fuzzy logic, and others, to illustrate the advancement of our methodology over these and other existing modeling techniques that are generally numerically based, or based on the limited nature of description logics and ontology web language. In complex problems, the findings of various disciplines can be categorized into tangible and intangible components and events. This categorization may not in fact be completely reducible to numerical models. Laskey and colleagues point this out in their paper on the n-ary representation of the structure of probabilistic reasoning. The case is made that concept-based methodology is essential to the kinds of Internet-based collective experiences we are envisioning.

 

Logical Proportional Analysis

 

Logical Proportional Analysis [23] is a self-contained, non-numerical process. It is designed to interact with humans. The processes supporting proportional analysis mimic certain specific aspects of the human logical thinking process. The analysis seeks to identify, and then measure occurrences of, proportional ratios between patterns that are expressed at different time scales. As such, proportional analysis generalizes the well-known fractal encoding and decoding of digital images. The analysis fits over 'raw' data that might be acquired from any real-time expression of any natural system, including an economic system, the expression of a single human in text, or the expression of a group of humans. Prueitt and Stephenson (2005) [24] and unpublished papers by Prueitt suggest one class of applications. These applications are to the measurement and analysis of patterns of cyber attacks, vulnerabilities, and response mechanisms. Evidence for the fractal composition of cyber security data is suggested in my private work with Stephenson.
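
A rough sketch of the cross-scale measurement, assuming that a pattern's expression at a longer time scale can be compared with its fine-scale form by block-averaging and correlating; the block-averaging and Pearson correlation below are stand-ins for whatever similarity measure the full analysis uses.

```python
import numpy as np

def scale_ratio_similarity(series, scale):
    """Compare the series with its own coarse-grained (block-averaged)
    version: high correlation across scales is crude evidence of the
    self-similar, 'proportional' structure discussed above."""
    n = (len(series) // scale) * scale
    coarse = series[:n].reshape(-1, scale).mean(axis=1)
    fine = series[:len(coarse)]
    return float(np.corrcoef(fine, coarse)[0, 1])

t = np.linspace(0, 1, 1024)
signal = sum(np.sin(2 ** k * np.pi * t) / 2 ** k for k in range(1, 6))
for s in (2, 4, 8):
    print(s, round(scale_ratio_similarity(signal, s), 3))
```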

 

A number of mechanisms involved in the biological response to stimulus have computational models. These models are reviewed in Levine's book. [25] A specific approach to modeling biological mechanism is found in my 1988 PhD thesis [26], and is derived from the traditions of A. R. Luria [27] and Karl Pribram [28]. Algorithmic implementations of these mechanism models can be used to reify ontology web language universals from particulars seen in measurements. These aspects include interaction, as well as the mapping and transformation of components and events in a causally entailed relationship. The processes, proposed in my 'Research Proposal', are bounded in a way similar to the limitations of human perception. They are also unlike any other existing state-of-the-art comprehensive modeling techniques, but have well-specified historical roots in certain disciplines known to the science communities.

 

In proportional analysis, the causal characteristics of human categorization of knowledge across various disciplines are integrated into class-subclass hierarchies of components and events. This process has been illustrated in many real-world examples. What is different here is both the biological response mechanisms, as discussed above, and the fractal analysis as expressed in a structured proportional analysis. The system is dynamic, and simple in its algorithmic implementation, and thus the hierarchically structured formation of events is an evolving causal transformation of the organization of data. These transformations can be used in many ways. Specialists trained in specific areas do not have to commit to an ontological structure determined by knowledge engineers. The system can run on real-time data and can immediately produce topic maps about the observed structure.

 

The transformation is a process that involves multi-resolution, lower-level hierarchical entities and events. The simple underlying architecture allows easy inspection of the formative processes involved in specifying these entities and events. The assumption of fractal entailment supports a logical fusing process and the formation of higher-level entities of increasing complexity. Human inspection of results in real time allows neural network-type reinforcement learning to occur. Individual humans can comprehend logically based causal relationships without necessarily knowing the detailed composition of the components involved. This is due to a separation of structural forms and an ongoing assignment of meaning via reinforcement learning mechanisms. A drill-down into the layers is easy and always possible. Only knowledge of structural relevance, and of the causal interactions of sub-components, is required in order to know the structural forms.

 

The fundamental numerical modeling and solution process involves a concept in the mind, which is transformed into words and expressions, and then into numbers; a numerical solution is then obtained, which is interpreted in words and finally mentally realized and interpreted. At each stage of the solution transformation process, there is a loss of 'something'. This loss impacts the accuracy of the final solution outcome. One sees mention of this loss in the classical discussions by A. N. Whitehead about the nature of induction. The ideal is that 'mind to mind' translations exist. However, the mechanisms underlying the computational support for this kind of process should be simple, as the Orb technology is, and architected using the stratified theory I have developed. There should be no mystery as to how this translation is achieved.

 

The suggested integration of methodologies may itself be considered a methodology: from mind, to words, to word-string solutions. This is because the human mind is seen as part of the loop. There is an interpretation of words and then the mental realization of the results. It is also an evolving concept, seen within the larger evolution of advances in our understanding of human perception and knowledge.

 

Advantages and Relationship to other Existing Techniques

 

Cognition-related methods may be confused with number-based techniques, specifically those from artificial intelligence that may appear similar: fuzzy logic, genetic algorithms, neural networks, and expert systems. This section attempts to illustrate the key differences.

 

In fuzzy logic, the user must quantify the input parameter to obtain the corresponding values of the 'membership function'. The processes of 'fuzzification' and 'defuzzification' are number-based and are so programmed. Two users with different 'perceptions' would arrive at different conclusions in fuzzy logic, even if they both employ the same membership functions. More realistically, two people with different 'expectations' may judge the same standard of living as "good" even though their incomes differ widely. The Union Rule Configuration (URC) in fuzzy logic largely eliminates the combinatorial rule 'explosion'; however, the process remains numerically based.
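
A minimal sketch of why fuzzification and defuzzification are number-based end to end; the membership function shapes and the income example are illustrative assumptions.

```python
def triangular(x, a, b, c):
    """Triangular membership function: 0 outside [a, c], 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Fuzzification: a number must go in before anything else can happen.
income = 45_000.0
mu_good = triangular(income, 30_000, 60_000, 90_000)   # membership in 'good'

# Defuzzification: centroid of the clipped output set, again pure numbers.
xs = [i * 1_000.0 for i in range(0, 101)]
mus = [min(mu_good, triangular(x, 20_000, 50_000, 80_000)) for x in xs]
centroid = sum(x * m for x, m in zip(xs, mus)) / max(sum(mus), 1e-9)
print(round(mu_good, 2), round(centroid))
```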

 

Similarly, genetic algorithms are essentially combinatorial evaluation and optimization techniques. The user must quantify the so-called 'fitness function', which measures the degree of fitness (favoritism) of a given population. All processes of 'mutation', 'crossover', and so on are essentially numerical assignments, which affect the formation of the resulting 'genomes'. No margin is given for logical interpretation and manipulation of the problem's input and output entities.
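
A sketch of one generation of a toy genetic algorithm; the real-valued genome and Gaussian mutation are assumptions, chosen only to show that selection, crossover, and mutation are numerical assignments throughout.

```python
import random

def ga_step(population, fitness, mutation=0.1):
    """One generation: rank by a numerical fitness, recombine the top
    half, and perturb genes numerically. Nothing in the loop admits a
    logical reading of what a gene 'means'."""
    parents = sorted(population, key=fitness, reverse=True)[:len(population) // 2]
    children = []
    while len(children) < len(population):
        a, b = random.sample(parents, 2)
        cut = random.randrange(1, len(a))                    # crossover point
        child = [g + random.gauss(0, mutation) for g in a[:cut] + b[cut:]]
        children.append(child)
    return children

pop = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(10)]
fit = lambda genome: -sum(g * g for g in genome)             # maximize toward zero
for _ in range(20):
    pop = ga_step(pop, fit)
print(max(fit(g) for g in pop))
```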

 

The artificial neural network (ANN) recognizes patterns and interrelationships in problem inputs. Defined outputs result from past knowledge and experience obtained during the training of the ANN on a number of training sets. Both the formulation and the processing of the ANN technique are number-driven, and the training is based on many input-output scenarios, all of which are pure numbers. In contrast, the n-ary formation of stochastic patterns requires only a definition of the logical structure of the entity that perceives the inputs and decides on the outputs. The patterns evolve to higher levels and eventually to a generalization of the same problem, one which may not have been originally comprehended by the developer. However, as the research proposal by Prueitt points out, ANN architectures are able to provide systems with certain orientation features useful to living systems.
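
For contrast, a single perceptron update, assuming the simplest possible architecture: every quantity in play, including the learned 'knowledge' in the weights, is a number.

```python
def perceptron_update(w, x, target, lr=0.1):
    """One supervised update: predict, compare to the target, and nudge
    the weights. Inputs, outputs, and what is learned are all numeric."""
    y = 1.0 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0.0
    return [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]

w = [0.0, 0.0, 0.0]
for _ in range(10):                       # learn a tiny two-case rule
    for x, t in [([1, 0, 1], 0), ([1, 1, 1], 1)]:
        w = perceptron_update(w, x, t)
print(w)
```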

 

Expert systems are tools in artificial intelligence (AI) and have a relatively straightforward formulation; however, there are limitations on the types of problems that can be handled. The 'IF/THEN/ELSE' clause structure of the 'rules' is used to define pairs of 'premises' and 'conclusions'. The significant difference is that expert systems are intrinsically passive, strictly rule-checking schemes, and can only reflect what the 'user' knows (the contents of the 'knowledge base'). Fractal pattern analysis, however, deals with the user's dilemma as he or she perceives it, and not as a mere 'pass'/'reject' verdict given to each rule in the solution process.
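
A minimal forward-chaining sketch of the point being made: an expert system can only check rules against its stored knowledge base, passively deriving what is already implicit there. The rules and facts below are invented placeholders.

```python
# (premises, conclusion) pairs: the whole 'knowledge base'.
RULES = [
    ({"fever", "cough"}, "flu"),
    ({"flu", "fatigue"}, "prescribe rest"),
]

def forward_chain(facts, rules):
    """Passive rule checking: each pass fires whatever rules match,
    and the system can never conclude beyond what the rules encode."""
    derived, changed = set(facts), True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain({"fever", "cough", "fatigue"}, RULES))
```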




[1] Costa, Paulo C. G.; Laskey, Kathryn B.; and Laskey, Kenneth J. (2005). PR-OWL: A Bayesian Framework for the Semantic Web. Proceedings of the First Workshop on Uncertainty Reasoning for the Semantic Web (URSW 2005), held at the Fourth International Semantic Web Conference (ISWC 2005), November 6-10, 2005, Galway, Ireland. URL:

http://ite.gmu.edu/~klaskey/papers/URSW05_PR-OWL.pdf

[2] OWL stands for Web Ontology Language and is a standard of the W3C.

[3] Prueitt, Paul S (on web) 'Foundation for Knowledge Science in the 21st Century' URL:

http://www.bcngroup.org/area3/pprueitt/book.htm

[4] Logical entailment is expressed as a fractal in Rimas Slavickas's work. A review of this work is to be made available to members of the governing body.

[5] Prueitt, Paul S (2005). Developing Anticipatory Responses from Thematic Analysis of Social Discourse. http://www.ontologystream.com/beads/nationalDebate/challengeProblem.htm

[6] Prueitt, Paul S (unpublished - 2008) 'A Research Project on Mechanisms, known to be involved in learning'.

[7] Penrose, Roger (1994). 'Shadows of the Mind'.

[8] Prueitt, Paul S, acting as Founder of the BCNGroup: URL:

http://www.bcngroup.org/site/beadgames/index.html see in particular

URL: http://www.ontologystream.com/area1/primarybeads/bead3.htm

[9] BCNGroup Charter: URL: http://www.bcngroup.org/site/aboutus.html

[10] Bead One is one of three foundational beads, posted around 1998. URL:

http://www.ontologystream.com/area1/primarybeads/bead1.htm

[11] Adi, Tom (2004). 'The Adi Ontology, Parts I-III'. URL:

http://www.bcngroup.org/beadgames/generativeMethodology/AdiStructuredOntology-PartI.htm

[12] Prueitt, Paul S. General Framework Theory is developed in a number of web pages and in unpublished documents.

[13] Prueitt, Paul S (2008). The Blank Slate Internet (Private document)

[14] Prueitt, Paul and Peter Stephenson. "Towards a Theory of Cyber Attack Mechanics." First IFIP 11.9 Digital Forensics Conference. Orlando, FL, 2005

[15] Prueitt, Paul S (2004) 'Notational Foundation to Future Semantic Science'.

Unpublished except on the web at URL:

http://www.bcngroup.org/area2/KSF/Notation/notation.htm

[16] Prueitt, Paul S (unpublished): 'A Research Project on Mechanisms, known to be involved in learning'

[17] Whorf, Benjamin Lee [1933] (1975). The Phonetic Value of Certain Characters in Maya Writing. Millwood, N.Y.: Kraus Reprint.

[18] This project is detailed at www.biopax.org

[19] See references to works by Victor Finn in Prueitt, Paul S, Chapter Six of 'Foundations' (URL in note 3).

[20] Prueitt, Paul S (2007) A Research Proposal (private document)

[21] See also Bertalanffy, 1933; Schneirla, 1957.

[22] Second School web site: URL: www.secondschool.net

[23] Slavickas, Rimas

[24] Prueitt, Paul and Peter Stephenson. "Towards a Theory of Cyber Attack Mechanics." First IFIP 11.9 Digital Forensics Conference. Orlando, FL, 2005

[25] Levine, Daniel (1991). 'Introduction to Neural and Cognitive Modeling' LEA.

[26] Prueitt, Paul S (1988). 'Mathematical Models of Biological Mechanisms Exhibiting Learning'. PhD dissertation, University of Texas at Arlington.

[27] Luria, A. R. (1973). 'The Working Brain'. Basic Books.

[28] Pribram, K. H. (1971). Languages of the Brain: Experimental Paradoxes and Principles in Neuropsychology. New York: Wadsworth.

Pribram, K. H. (1991). Brain and Perception: Holonomy and Structure in Figural Processing. Hillsdale, NJ: Lawrence Erlbaum Associates.

Pribram, K. (Ed.) (1993). Rethinking Neural Networks: Quantum Fields and Biological Data. Hillsdale, NJ: Lawrence Erlbaum Associates.

Pribram, K. (Ed.) (1994). Origins: Brain & Self Organization. Hillsdale, NJ: Lawrence Erlbaum Associates.

Pribram, K. & King, J. (Eds.) (1996). Learning as Self-Organization. Mahwah, NJ: Lawrence Erlbaum Associates.