The Blank Slate Internet


A paper for the purpose of discussion

Dr Paul S Prueitt

Draft: December 10, 2007
Revised slightly December 15th 2008



Section 1: Definition of back plate

Semantic Cover Generator

Structural Cover Generators

How will it work?

Section II: The second school position on computing environments

Implications of second school thought on human computer design

Second school computer-human interface design

Encapsulated Digital Objects and the back plate

On the usability of systems that have a back plate

Section III: History

Ontology emergence and merging

On universals and particulars

A model that corresponds to natural process

New technical means, the n-ary ontological model

Section IV: The Back-Plate and Digital Rights Management

The language of compression

The measurement of categorical invariance in data

Measurement is followed by encoding of data

A new retrieval and organizing principle based on Mill's logic

The fractal nature of information

The new model based on secure IP management

Section V. Bi-lateral Intellectual Property Management





The Blank Slate Internet (BSI) is a concept that has developed from our experience with the current Internet and with current software development. This experience suggests that certain key enhancements to our economic and social system might be developed with relative ease. There are social as well as technical enhancements to discuss. 


This discussion raises some very hard issues, to be sure. So most of the time we frame our discussion as conjecture, particularly when talking about human consciousness or about social system phenomena like economic realities. The framing as conjecture helps us be clear about our own limitations, limitations we experience as we attempt to talk about topics like collective intelligence, non-locality and even spiritual concepts. 

The world economic system is a marvelous system but may produce unsustainable pressures on our social systems and on the world's environmental systems. Supply side markets appear to have failed in significant ways, and a balance is needed between what might be demanded from "all" people and what is supplied, or more correctly stated, "what is designed to be supplied" by a few people. The issue is about the origin of control. To develop a demand side, clear and transparent communication seems necessary. An evolution of a framework and infrastructure for information exchange suggests changes that would seem positive with regard to modifying wasteful consumption patterns. So our discussion is about how this evolution might be aided.


Core to the current economic system is the information that flows in the Internet. The observed unsustainability of many essential systems is resilient because expectations are established within a specific type of business context. It seems not to be a closely held secret that social values are seen, in this context, as secondary to the specific type of private economic gain supported by current property law. The results of this context are everywhere evident. As such, a new system is sought that establishes a blank slate for computing and communication using microprocessors. If this system is sufficient to allow and enable a cultural shift, then positive benefits may be found in supporting precisely such a blank slate.


The second school may conceivably enable a new economic model that is resilient and sustainable. One key to the second school is based on a correspondence between design principles and natural science. Several elements are present, including a paradoxical principle of transparency and encapsulated informational security. This principle may be realized using the back plate as described in this paper.


Our current design work shows precisely how transparency and security may be mutually supportive. A social agreement over what is to be publicly transparent, and transparent within social units, may be made, and provable security over information content provided. Such an agreement has to be politically empowered and has to have a technology that is neutral to ownership issues. This means that the software itself has to be optimal and provisioned by our social institutions, and that the agreements have to have grounding in constitutional law.


Social agreements, of the type we envision, simply confirm constitutional law and are thus enforceable in cases of attempts to infer and gather private information. However, public information may become more open and transparent. In particular, real-time public information about the composition of all manufacturing and all commodity use must be seen as a matter of national security and public well-being, for without this clear information there cannot be a market in the sense of Adam Smith's theory of market forces. We as individuals cannot see the consequences of a purchase if the supply chain that produced a commodity is not available for inspection.


It is not just a question of the technology. The mechanism of the back plate could in fact provide provable security while algorithms derived from link analysis create a marketplace where the consequences of social or economic decisions are well understood. Our current experience with semantic technology, largely in the classified world, demonstrates the feasibility of such a transparent market. The argument against creating communication infrastructure having the property of translucence *<*> is often made by liberal elements of the political world. We make an argument about the nature of the physical world and the differences between the physical world and the abstractions involved in human communication. 

In short and in summary: whatever reality is, it should be seen by everyone, unless the reality is private, and then this privacy must be perfectly protected. The basis for this perfect protection is in the Constitution and in our moral traditions. Knowing the difference, and being able to enforce this difference, is essential to a conjectured future marketplace. There is no way around this. Our society must accept the modern world and the technology that empowers us to respond to crises and opportunities. 


Complete and perfect public clarity about the real cost of all goods is one element of the envisioned consequences of back plate systems. Carbon management, and the management of other elements, can be instrumented using a back plate. Such a back plate mechanism was designed for U.S. Customs, but not implemented. [1] [2] The question of private information, when this information is about commodity use by manufacturing processes, is not the same as the issue of private information about personal lives or individual human beings. The differences are not so easy to delineate. The problem is that knowledge of supply chains and of the origins of supply and demand is power. In one case, there is an agreement about the rules of economic interactions. In the other case, there are the constitutionally mandated rights of privacy. Power expressed privately may in fact have negative social consequences, and this possibility is to a large extent the cause of social law.


Section 1: Definition of back plate


What is a back plate? The concept can be applied to any type of finite state machine having a stratification of processing layers. Stratification in physical systems is seen as a consequence of the organization of physical systems. In the physical support of human thought, there are layers of organization in which interactions between elements lead to the emergence of coherent phenomena serving functions within other organizational levels. When a communication system infrastructure realizes this organization, we are able to establish an appropriate correspondence between natural intelligence and the communication architecture. *<*>


The Internet has the TCP/IP protocol stack, and this certainly is one way to implement stratification. However, the stratification potential has not been optimally used, in the way that the back plate concept suggests.


The back plate, as defined by Prueitt in 2007, is a system of compression/encryption dictionaries that communicate in the background. The background communication is minimally sufficient so that a generative capability comes to exist at a number of small computing nodes. The specification of these nodes is addressed in several design documents, and there is ongoing work on these designs. This work may be reviewed when proper agreements are made to protect some parts of the specification that are original and deserving of some degree of temporary ownership. We may, however, talk in generality. 

In essence one has a very small operating system that continually takes background information to maintain a generative cover, as discussed below, as well as an ability to enfold and express digital objects. The operating system is mobile in the sense that it can be activated by any microprocessor, such as those found in cell phones.


Internal to the back plate nodes are rather simple optimization algorithms. These algorithms are useful because the input/output relationship is handled in iterated action-perception cycles governed by what are called utility functions. Examples of this type of architecture are ubiquitous in mathematical models of neural function as well as in various types of automated control.
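The iterated action-perception cycle with a utility function can be sketched in a few lines. The target, step size, and utility below are our own illustrative assumptions, not part of any back plate specification; the point is only that perception (reading the state) and action (the utility-improving adjustment) alternate until utility is maximized.

```python
# A minimal sketch of an iterated action-perception cycle driven by a
# utility function. All values here are illustrative assumptions.

def utility(state, target=5.0):
    """Utility is highest when the perceived state matches the target."""
    return -(state - target) ** 2

def act(state, step=0.5):
    """Choose the action (a small adjustment) that most improves utility."""
    candidates = [-step, 0.0, step]
    return max(candidates, key=lambda a: utility(state + a))

state = 0.0
for _ in range(20):      # perceive -> act -> perceive again
    state += act(state)

print(round(state, 2))   # the state settles at the target, 5.0
```

The same loop shape appears in automated control: the controller never sees the "whole problem", only its current state and a utility gradient over nearby actions.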

A longer discussion is required here. A summary of this discussion must point out that computational systems have up to now only allowed the modeling of biological function using programming languages that separate the processing of data by the program from a physical measurement process. Human awareness, it should be pointed out, depends on the systems supporting awareness having measured, and being capable of measuring in real time, some part of physical reality. Computing machines do not have this same type of measurement. The issue of measurement is an open problem of extraordinary difficulty.

The issue is that current programming environments are designed for purposes other than simplification. Without a simplification of the computer science, the natural scientist cannot do the work that needs to get done. On the other hand, great strides in natural science seem just on the horizon.


The back plate nodes are designed to simplify the input / output relationship specifically supporting the use of utility functions over an aggregation of invariance. [3]


Several types of optimization process can be computed using modified steepest descent algorithms. Such algorithms are known and understood in the machine learning, artificial neural network, genetic algorithm, and numerical analysis disciplines. The category of all steepest descent algorithms [4] includes systems that extract meaningful patterns from unstructured input. In the back plate, as defined by Prueitt, semantic cover generation is maintained at each of many virtual machines. The generators are each equipped with one of the category of human-mediated algorithms, where human inspection of results is often required.
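A minimal instance of the steepest descent family may make the reference concrete. The quadratic function and the step size below are illustrative choices of ours, not taken from the back plate designs.

```python
# Steepest (gradient) descent on a simple quadratic: repeatedly step
# against the gradient until the minimizer is reached.

def grad(x):
    # gradient of f(x) = (x - 3)^2
    return 2.0 * (x - 3.0)

x = 0.0
for _ in range(100):
    x -= 0.1 * grad(x)   # the steepest-descent update rule

print(round(x, 4))       # approaches the minimizer x = 3
```

The "modified" variants mentioned above differ in how the step size is chosen and in what plays the role of the gradient, but the update shape is the same.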


The notion of semantic cover is itself a difficult one, but in essence the notion implies a type of minimal sufficiency. The conceptual work for specifying semantic cover generators is grounded by Prueitt in an area of formal systems theory called topological logic (Victor Finn, 1982-1994). Prueitt describes this conceptual work in chapter six of the book Foundations for the Knowledge Sciences. [5] This work relies on stratification and a simplification of the processing architecture so that the logic described in Foundations can be applied directly to input/output relationships.


Semantic Cover Generator


In practical terms, any specific semantic cover generator is defined in terms of sufficiency. Are the elements of the cover sufficient to cover the target? Sufficiency may be defined as the result of a specific type of utility function. In the evolution of the foundations of Hilbert's mathematics, the utility function is simply some subjective sense of independence and of completeness. This subjective sense was, however, refined over the centuries to produce the foundational elements of mathematics. 

A descriptive enumeration is discussed by Prueitt in his demand side theory. *<*>
Descriptive enumeration produces a cover over an area of investigation, for example the services that may be required within the electronic exchanges of a large corporation. The work that Prueitt proposes is that these enumerations may be used to produce transactional memory and an encryption regime. These resources may then be used to control, or to manage, the system under investigation. We should be clear that, in the demand side paradigm, the "system under investigation" is most likely to be an investigation of the self by the self, as in long-term educational work or one's own understanding of one's health. It is entirely possible to create an encoding of such personal information so that no one other than the creator will be able to understand the encoding, and any access to the decoding mechanism would signal the creator that an intrusion had been attempted. *<*>

In the back plate we are concerned with the materialization of objects at a distance. Can an object of a certain class be encoded at one node, the compression tokens sent to another node, and the object be generated there? If so, distributed as a function of the backplate, we have some form of generative cover over the class of objects. If the cover has a logic that predicts or anticipates the function or behavior of any generated object, then we have some type of semantic cover generator. If the semantic cover may be processed by a specific logic system, the one suggested is called quasi-axiomatic theory [6], then a minimization of the cover may be identified. In its simplest form, this architecture is easy to realize, and has been realized in any compressed transmission. What is being suggested merely takes an additional step. 
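The simplest form can be shown with an ordinary shared compression dictionary. In the sketch below, Python's standard zlib module stands in for the backplate dictionaries (an illustrative substitution, not the proposed design): both nodes hold the same dictionary, only a short token stream crosses between them, and the receiving node regenerates the whole object.

```python
import zlib

# Both nodes hold the same dictionary; its contents are illustrative.
shared_dict = b"the back plate maintains a generative cover over digital objects "

message = b"the back plate maintains a generative cover over digital objects"

# Sending node: encode the object against the shared dictionary.
enc = zlib.compressobj(zdict=shared_dict)
tokens = enc.compress(message) + enc.flush()

# Receiving node: regenerate the whole object from the tokens plus its
# own copy of the dictionary.
dec = zlib.decompressobj(zdict=shared_dict)
regenerated = dec.decompress(tokens)

assert regenerated == message
print(len(tokens), "<", len(message))  # far fewer bits crossed the wire
```

The "additional step" in the text is to organize such dictionaries so that what is regenerated is not merely bytes but an object with predictable function, which is what distinguishes a semantic cover from a structural one.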


Even in this simplistic form, there are features related to the back plate model that can be realized. Object ownership in the Second Life virtual community software system already has many of these features. Thus there are actual models of what back plates will produce. The issue, we repeat, is that current software design is far too complicated in nature to realize many of the benefits that will arise once processing architecture shifts to the second school perspective. Things that are done with great difficulty now will be done with great ease. Things that may not even be imagined will be developed, but not the strong forms of artificial intelligence. We give up on this mistaken goal *<*> and develop instead the demand side technology from first principles. *<*> 


Prueitt's architecture for managing all commodity transactions across all national borders represents an early effort at producing the back plate design. [8]


As we began to investigate back plate phenomena, we found many systems that might be considered an optimal architecture, where optimal is formally defined in the context of a utility function employed by a steepest descent algorithm. In all of these systems, data is not just moved about; rather each whole, e.g., an object, is treated as a unit and encoded using a dictionary. Transmission then is not of the whole object, but rather of a linear series of symbols that, when expressed in the presence of the dictionary, generates the whole object. The required bit transmission can in theory be as little as 1/700 of the original bit transmission, and would generally be about 1/40th the size under normal compression.

Structural Cover Generators


The difference between structural cover generators, such as compression dictionaries, and semantic cover generators is now seen clearly. The compression can in fact be encrypted, and the compression/encryption tokens may have some nature that allows a human knowledge management function. The encryption/compression regime may be linked to an ontological model, to a utility function and to situational logic of the type envisioned by Mill, extended by Pospelov and Finn, and completed by Kugler and Prueitt. *<*> The utility function, ontology and situational logic may be used to optimize a community's satisfaction with the meaning of generated symbols. Multiple communities, having different notions of coherence, may have mediated collective conversation. 

These human knowledge management functions are achieved with the same mechanism created for many-core and grid transactional memory. In current research at Intel Corp the paradigm of software transactional memory is seen as a possible solution to parallel task scheduling. This transactional-memory-based solution to grid processing becomes viable when knowledge management and the design of communication tools are united using structural covers. In other words, community-based, e.g. common people's, generativity is enabled by mechanisms supporting true many-core and grid parallel processing. 

It is important to acknowledge that this additional feature has not been traditionally associated with compression/encryption paradigms. It is also noted that the work by Prueitt and Adi on language generation from ontological primes may indicate a direct correspondence between topological covers and the everyday generation of the contents of mental awareness. [9]


Structural cover generators are commonplace in compression technology but are not organized to produce a functioning back plate. This next step provides an organization to the compression dictionaries so that the elements in a compression dictionary may be linked with mechanisms that evolve machine-based models and the human interaction mechanisms. The difference between semantic covers and structural covers is not so much a difference that cannot be overcome. Architecturally, we do make the case that a backplate with semantic generation capability is physically manifest in brain systems. *<*>


How will it work?


A backplate works by updating all virtual nodes on an ongoing basis. Backplate nodes are small virtual engines which MAY be similar in nature to virtual operating system engines. All transference of information, such as the information needed to provide an intuitive (topic map) interface, as well as all structural data, will NOT be sent from point to point; rather, the information will be generated at one point because at some other point there was some event that "caused" the generation. 

For those who know a little quantum mechanics, the metaphor to Bell's inequality is illustrative: "No physical theory of local hidden variables can ever reproduce all of the predictions of quantum mechanics." The backplate mechanism employs stratification theory, and non-locality, to overcome this prediction. *<*>

The theory of backplates suggests metaphors between computing systems and natural systems. For example, the quantum layer of physical reality is a theoretical construct. No one has seen this reality, but we leave this for another time. More could be said on this. Various structural issues are delineated, including a backplate model of quantum mechanical phenomena like Bell's inequality.

It is true (as of 2008) that we still need to demonstrate this in a small lab, but it is essentially something that has long been done with regular compression tables. We just need to show that this is a "backplate" that has generative capability. This generative capability is also to be shown to instrument a digital object management system. The management system will control property rights agreements, security - both local and global - computing processor tasking, as well as human community mechanisms for knowledge generation and information sharing. 

Please note that this architecture is not designed to gain venture funding; rather it is being specified as part of foundational science. So the reader should not judge our presentation based on the understanding demonstrated by the current practices of business innovators. Our audience is the community of scholars and individuals who see the necessities involved in shifting control of communication, entertainment and computing activities away from narrowly focused business processes and placing more control in the hands of individual consumers. In the new economic model, scholars may replace entrepreneurs in the IT development function. 

The consequence of federal building and control over a Blank Slate Internet infrastructure is the empowerment of entrepreneurial capitalism.  Public property will replace private property within the communications grid infrastructure, thus enabling new economic sectors based on new layers of private innovation. What is conjectured to be potential can be seen in an analogy.

Suppose that at the very beginning of the automobile age, venture capital focused on owning all paved roads. Suppose further that a monopoly was developed whereby if one wanted to use one's automobile, one had to pay the investor groups for the right to drive on a paved road. If the federal government wanted to build Interstate Highways, the federal government would have to pay excessive amounts of money, perhaps as much as 50% of total construction costs, to the initial investor groups. Suppose, further, that patent law has now established a non-ending ownership to the concept of a paved road.


Now, imagine the specific natures of industry and commerce that would have arisen. Imagine if the initial investors' group governed the political system itself. At one point, the desirability of doing things that could only be imagined, such as we experience every day in our real social reality, would come into conflict with the notion of private ownership of all paved roads. This is the conflict we now see with respect to the information highways. One group of wealthy and powerful individuals insists on the right to own the roads, and ties this right to a misplaced notion of capitalism as well as to the foundations of democratic governance.


The shift from first to second school, were it to occur, means a decentralization of control over private information spaces. A new economic model governing information infrastructure would arise. The new model will benefit everyone and democratize the control over commodity production and consumption patterns. A new level playing field will be created, and from this level playing field will arise new economic sectors based on a balanced sense of capital formation. 

The optimality we discuss comes with provable security over information, and thus new business innovation is possible as a consequence of backplate systems. Business does not suffer as a consequence, but is removed as the commanding authority over production and consumption. The demand side is enabled by the communications infrastructure in the same way as the roadways empowered the rise of modern travel and commerce. Anticipatory technology *<*> creates a replacement for manipulative advertising as a means to control production. Production becomes driven by the real-time aggregation of the expectation of consumer demands. If the public demands it, wastefulness is directly addressed by market forces rather than being driven by market forces.


In order that backplate information systems arise, we need to be objective about the sub-optimality of current information technology. The problems are well recognized. For example, an ultra-stable and provably secure distributed operating system cannot be built on XML, RDF and other current technologies, but can be realized using the backplate concept.

A number of benefits may be immediately pointed to. For example, the optimality to be gained from a pure backplate might be understood as a feature of distributed computing governed by utility functions. Part of this optimality has to do with an ability to use smaller pipes for digital data movement, or relocation. Small-pipe transfer of video transmission is to be realized in wireless environments.


A strong capitalization argument is present in an area where the producers of digital content in the entertainment markets feel great pain. The second school position is, however, that a shift in the markets is required as a consequence of the imbalances arising from industry-driven consumerism. The ubiquity of new application potentials requires that a backplate consisting of semantic cover generators be developed by the academic community and made available as un-owned infrastructure. This provision empowers new market strategies.


The optimality of the backplate technology might properly be seen as relative to the non-optimality of the first school technology. Optimality arguments seem possible, but require some reasoning based on category theory and foundational concepts in mathematics and computing theory (Harold Szu, unpublished and communicated to Prueitt 1998; Prueitt, unpublished). A simple demonstration of first principles may be seen in the physical organization of material reality, and thus the corresponding feature is seen as having a number of metaphors.

We are suggesting that underlying mechanisms of the current Internet might be slightly modified so that backplate theory would have a means to be implemented as part of a living interaction between computing environments and human communities. This outcome creates also the transparency that is required if the Internet is to become a safe place rather than full of darkness. 


Section II: The second school position on computing environments


Over the past decades, computer science has become grounded in a view of natural science that is reductionistic in nature. This grounding has created great economic value, but has been done in such a way as to also create a narrow viewpoint about information. This viewpoint is maintained in spite of adequate scholarship regarding the differences between data, information and human knowledge (as experienced).

We conjecture that hard limits experienced due to current information technology may be bypassed by properly aligning computer science with natural science. The capital investment required to establish proof of this bypass is not as significant as the investment that is required to control, e.g., inhibit, the evolution of a new ecosystem of markets. [11] It is thus possible to predict that at some point there will be a shift from first school paradigm implementations to second school implementations. It has been impossible to predict when this shift might occur. 


Some background is needed. Let us start with the difference between human knowledge, as experienced in real time, and "information" as it exists in a textbook. Information as defined by Shannon is data-oriented and digital in nature: how long does it take to transmit the bits that reproduce the textbook at some other point? We feel that this is actually not a correct formulation of the nature of information as commonly understood by an average human. An alternative view of information is oriented towards the interpretive process. This viewpoint is perhaps most associated with the work of Charles Sanders Peirce. [12] Consistent with the Peircean viewpoint, human interpretation transforms symbol systems into the contents of awareness.
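The Shannon, data-oriented formulation can be stated in a few lines: entropy gives the average number of bits per symbol needed to transmit a text, and says nothing at all about what the text means to a human interpreter. The sample string is an arbitrary illustration.

```python
import math
from collections import Counter

def entropy_bits_per_symbol(text: str) -> float:
    """Shannon entropy: average bits per symbol for an ideal code."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A repetitive string is cheap to transmit regardless of its meaning.
print(round(entropy_bits_per_symbol("aaab"), 3))  # 0.811 bits/symbol
```

Two texts with identical entropy can differ entirely in what a reader takes from them, which is the gap the Peircean, interpretive view of information addresses.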


Using backplate mechanisms, data processing will produce symbol systems that are then interpreted by humans, systems of humans, or systems of living systems. Collective intelligence *<*> will be instrumented using the demand side mechanisms, as discussed by the second school. *<*> For example, a fish pond may evolve environmental controls that optimize location- and time-specific production of fish stock, while also minimizing commodity use and pollution. *<*> These symbol systems may be designated to produce a context, and within that context a mechanism may be used to control other mechanisms. The specification may very well follow a service-oriented standard using principles as given, for example, by Thomas Erl. [13]


The second viewpoint makes other distinctions between the first and second schools. One of these distinctions concerns the meaning of the term "complexity". This distinction may be used to see the real and non-removable difference between artificial intelligence and natural intelligence, and between the first school and the second. [14] When this difference is seen, we are given a new and proper understanding of natural intelligence and computing. This understanding is essential for those of us trying to understand what is next.


We must review some foundational elements of computer science. The complete system of mechanisms involved in computing environments is always a system of finite state machines. On this the first and second schools agree. The disagreement is framed in the question "are natural systems finite state machines?" In the second school viewpoint these machines can be extremely complicated, but never complex, in the sense that Robert Rosen's work has defined. [15]


A simple reading of the literature may be used to make the case that Rosen's work in category theory takes off from where Peirce's work ends. Certain common abstract constructions are used by both scholars. In both Peirce's work and in Rosen's work, a core notion is that of a living interpretant. The interpretant acts on perception as a consequence of being aware of both symbol systems and a range of possible meanings for constructions from these systems. The act of interpretation is part of a chain of many embedded cycles, within cycles, of perception followed by action, as discussed by J.J. Gibson [16] and others. The first school claims that these cycles are ultimately mechanistic. The second school simply asserts that these cycles are not purely mechanistic. There are points of complexity, Rosen complexity, that must be acknowledged by science if we are to understand the nature of human awareness.


The second school position is absolutely clear and precise. Complexity is neither simple nor complicated; rather it is an indication of indeterminacy, a quality exhibited by natural systems. A finite state machine, by nature, can never be complex. This is the position that the first school seems not to be able to understand. To see why this understanding seems to be difficult for some people, we need to look at the notions of logical coherence and translatability.

Some things are not understandable unless one comes from the right viewpoint. Self-limitation is constructed from the mechanisms enforcing an experience of viewpoint. The problem is not that the second school positions are abstract and too difficult for average people to understand. The problem is that the current ways of thinking are often supply side driven and thus act to control what may or may not be thought. *<*> The required balance between supply and demand sides also requires a sense of multi-coherence. *<*>


Linguist Benjamin Whorf [17] developed the notion of non-translatability. The concept of complexity is illustrative of this notion. The issue of complexity is not translatable into first school thinking, due in part to what we would regard as the polemical definitions of words like intelligence, complexity, and free will by the first school. However, for those in the second school, the understanding that complexity entails causes other than Newtonian causes seems justified by empirical observation and a growing body of scientifically grounded theory.


Can the action-perception cycle be reduced to data and mechanisms? The question must be informed by a first principle based on examination of natural science. The first school is self-limiting, suggestive of an artificial notion of intelligence. Natural science can and should examine the mechanisms supporting human consciousness. A careful examination of words such as awareness, living, and interpretation is possible based on natural science. So the path is open to develop a second school of thought about the nature of computing environments.


If nature does not yield completely to reductionism then such an examination should produce computing interfaces that are not now anticipated by the current mainstream of information technology. When a back plate links these interfaces we have new types of social networks.


The first school asserts that one must always keep open the question of artificial intelligence, defined as a type of intelligence that is superior to human intelligence while having all the properties of human intelligence. The second school suggests that first principles derived from natural science provide increasing evidence that natural intelligence has specific properties that are not reducible to data and deterministic mechanism.


Implications of second school thought on human computer design


The human action-perception cycle involves an awareness of state in real time and the production of a cognitive map. The awareness is of particulars, and the production of the cognitive map is a production, via induction and abduction, of universals. The first school sees the particular as arising from universals. The second school sees the particular as always having some part of its essence not expressible as any set of universals, at least not in human language.


In modern information technology paradigms, such as knowledge management and service event analysis, the cognitive map is then used in ways that are reducible to discrete data structure and mechanism in the sense defined by Shannon. In the second school viewpoint, our designs follow an action-perception cycle in the production of universals from particulars. The mapping symbols may be encoded via the induction of specific symbols and the reification of meaning using a convolution over particulars to produce universals (Mill, Finn, Pospelov, Prueitt). However, the tail must not wag the dog: the particulars cannot be forced into some expression of a fixed set of pre-defined universals. To do so is to create a top-down hierarchical control system that cannot instantly acknowledge novelty.


Such systems are very utilitarian, but they are subject to unexpected, and often costly, failures when the natural world changes. These issues were addressed in Soviet era efforts to place control over complex systems such as cities and social systems, but such control systems were never achieved (Prueitt, Finn and Pospelov, private communications, 1997).

Second school computer-human interface design


We have developed a computing paradigm that is designed to achieve a number of features not now available. In particular, a bi-lateral protection of intellectual property is made available. The bi-lateral interface is between the system of a single human, family, social group, or other encapsulated system and another system of the same type.


The interface is complex in the Rosen sense, not because of a technology feature, but because of the natural intelligence of any living system. Talking about the interface involves concepts that do not exist in the first school, and thus non-translatability issues arise in any discussion between first school and second school proponents. This does not mean that the second school does not have language, only that this language is not understood by the first school.


Specific language from the general theory of complex systems literature will be used, and only very carefully, in the description of interfaces. The back plate interfaces are not reducible to data and mechanism; at least, no complete reduction process has been found and become widely known. Rather than using our language in a way that is consistent with the well-developed school of thought we have been calling the first school, we use instead the specific language of the second school of thought about the nature of information.


In the first school, we develop the notion of an encapsulated digital object. This can be done completely within the language of finite state machines and what is called by the information technology sector, object oriented programming.


The notion of an encapsulated digital object is further extended to treat the objects as services defined within a computing environment. Service Oriented Computing (Thomas Erl) is then born from the object oriented programming and design literatures and efforts. However, the object seen as a service is still first school in nature, because the distinction between data and information is not clear. Data works with mechanism, deterministic mechanism; and information works with an interpretive process that is not, according to second school first principles, reducible to deterministic mechanism.


Encapsulated Digital Objects and the back plate


The encapsulated digital object (EDO) is a step along a path defined by object oriented programming (see works by Brad Cox [18]) and now by service oriented computing. CoreSystem (developed by Sandy Klausner) provides a view down this path, but Klausner has already taken a different direction than those who are in the first school, a direction that leads directly to the first principles of the second school. The key word here is generative.


To build a market context for the scientific work I am proposing, we need to introduce a bit of jargon. This jargon only roughly approximates what might or might not be actually present in the marketplace, due to non-disclosure agreements and classified R&D. CoreSystem relies on the generation of a set of computational primitives and on the use of what are called frameworks to generate digital objects. John Sowa, Richard Ballard, and John Zachman all have versions of generative semantic primitives, although a general theory of semantic primitives has not been published. The development of a general theory is attempted in my private work, but the issues have to do with paradigmatic viewpoints and how multiple viewpoints might be represented in a single logical system.


The CoreSystem framework is called cubism and is related to the art movement called Cubism. [19] The same distinctions made by Thomas Erl regarding the transition from digital objects to digital services (defined within computing environments) can and do get made in the CoreSystem architecture. Other concepts behind XML, Topic Maps, and web ontology languages (OWL) are also present, for example namespaces. However, the implementation of these concepts is at a more advanced and simpler level. Differences in software architecture design and implementation from one system to the next may be compared with the existence of different geometries: each may represent a very different theory of computing science, with its own specific features, capabilities, and consequences.


The implementation in CoreSystem is second school in nature. Namespaces become contextual devices where the mechanistic data structures are given meaning based on pre-defined contextualization, measured by a specific implementation design for a back plate. Context computing, whatever that means, is then the primary bookkeeping task of CoreSystem.


So what might context computing properly mean? This bookkeeping is accomplished using an induction of form from the experience of structure. The bookkeeping provides the content for semantic cover generators and causes an induction of universals from particulars, in the second school; and a use of pre-defined universals to represent particulars, in the first school.


The second school's intent is to observe and to produce universals from direct observation. This is not an easy task unless first principles are actually aligned with the natures and the reality of natural systems. These first principles include the recognition of location and individuality. The computing theory that we are suggesting asserts the need for real time involvement by everyday users in the behaviors of the software code.


We assert that location has a local and a distributed nature, and that this nature can be accounted for using stratification. Of course this assertion is a core second school assertion about physical and cognitive reality. The first principles related to stratification can be understood as necessary if complexity is to be acknowledged. The stratification allows symbol systems to evolve within a number of layers, layers that are tied together by what the Soviets called Mill's logic and what Prueitt and Kugler extended (1996-1998) to produce a tri-level architecture for treating the particular-to-universal induction. [20]


A stratification of symbol systems also creates the means to generate service objects for provably optimal transmission using compression/encryption dictionaries. The mathematics for provably optimal security over the key encoding to a service object is given in private work, but an outline can be given here. The bottom, or substructural, layer roughly corresponds to physical atoms, which are then aggregated to produce compounds. [21] These atoms are found using stochastic means. The compounds are then the digital objects that de-materialize at one place and re-materialize at a different place. The generation process thus uses a table of atoms to compress [22] the signal, producing gains in effective transmission rates. The categories of data transmission compression patents all work on precisely this principle, but without any purposeful refinement of the atoms into what are in essence periodic tables for the expression of semantics. This refinement is done by CoreSystem and by the Mark III system [23], as well as by a number of classified systems.
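The atoms-and-compounds outline above can be made concrete with a minimal sketch. All names here are ours, chosen for illustration only; this is not the CoreSystem or Mark III mechanism. Atoms are found by stochastic sampling of a signal, a shared atom table compresses the message at one place, and the same table re-materializes it at another.

```python
import random
from collections import Counter

def find_atoms(corpus, n=3, samples=2000, top_k=32, seed=0):
    """Stochastically sample fixed-length substrings; frequent ones become 'atoms'."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(samples):
        i = rng.randrange(len(corpus) - n + 1)
        counts[corpus[i:i + n]] += 1
    return [a for a, _ in counts.most_common(top_k)]

def compress(message, atoms):
    """Rewrite the message as atom indices where possible, literal chars elsewhere."""
    out, i = [], 0
    while i < len(message):
        for idx, atom in enumerate(atoms):
            if message.startswith(atom, i):
                out.append(idx)          # one small index replaces len(atom) characters
                i += len(atom)
                break
        else:
            out.append(message[i])       # no atom fits: fall back to the raw character
            i += 1
    return out

def decompress(encoded, atoms):
    """Re-materialize the digital object at the receiving end from the shared atom table."""
    return "".join(atoms[e] if isinstance(e, int) else e for e in encoded)

corpus = "the cat sat on the mat " * 50
atoms = find_atoms(corpus)
encoded = compress("the cat sat on the mat", atoms)
```

The gain in transmission rate comes from indices being cheaper than the substrings they stand for; the "periodic table" refinement discussed above would replace the raw frequency criterion with a semantic one.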


A number of practical advantages are derived from second school first principles. The generation of services from service objects can be instrumented so that all generation events are communicated to an internal data structure (based on a key-less hash (Paul Prueitt)) that then must communicate to a service organization as a means to manage bi-lateral issues with respect to intellectual property. This communication and instrumentation is described in Brad Cox's book Superdistribution as occurring within a micro-banking system. The generated service is not communicated; only agreed-on data about the use of the generative service object is transmitted. The service fulfillment then uses the semantic cover generators.
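The instrumentation idea can be sketched as follows. This is an assumption-laden illustration, not Cox's MyBank code: the class name, template mechanism, and event fields are all hypothetical. Each generation appends a metadata event to an internal log, while the fulfilled service itself is never transmitted.

```python
import hashlib

class GenerativeServiceObject:
    """Illustrative sketch: a service object whose generation events are
    instrumented. Only agreed-on usage metadata is recorded for reporting;
    the generated content itself stays local."""

    def __init__(self, object_id, template):
        self.object_id = object_id
        self.template = template
        self.event_log = []              # stands in for the internal (key-less hash) structure

    def generate(self, **params):
        service = self.template.format(**params)   # the fulfilled service, never transmitted
        event = {
            "object_id": self.object_id,
            # a digest of the parameters, so usage is accountable without disclosure
            "params_digest": hashlib.sha256(
                repr(sorted(params.items())).encode()).hexdigest(),
        }
        self.event_log.append(event)     # reported to the service organization
        return service

edo = GenerativeServiceObject("edo-001", "Hello, {name}!")
out = edo.generate(name="world")
```

In a deployed system the log would flow to the micro-banking layer; here it simply accumulates, which is enough to show the bi-lateral accounting hook.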


The technology is then a generative technology having optimal compression and encryption, as well as an evolutionary architecture that creates stable substructural tables with great expressive capability. Individual communities, or processes, may evolve distinct substructural tables for which non-translatability becomes an issue. The reality of non-translatability therefore requires a supporting feature: terminological reconciliation technology. [24]


On the usability of systems that have a back plate


The picture of interacting gEDO (generative Encapsulated Digital Object) systems, having well defined interfaces to actual living systems, is a picture that can be understood, because it is in fact similar to how humans use natural language.


The argument is simple. We use our familiarity with life to understand what cannot be formalized into data and mechanism. We produce second school words and meanings that communicate this familiarity, not by placing all of the knowledge into a symbol set, but by evoking shared awareness.


The language is not reduced to data and mechanism, so the part of this picture that is hardest to understand is the computing environment. The computing environment is vastly simplified into provably optimal compression and encryption, together with localized and global gEDO management environments. These environments each have bi-lateral and uni-lateral capability, and these capabilities produce information security as well as intellectual property management.

Section IV: History


In November 2007, a group of information scientists observed that a back-plate to the Internet is emerging, and predicted that this phenomenon will foster a new economic model. A precursor Internet back plate is, in fact, evolving via a collective process having no central control, with strengths analogous to the wiki concept. As in other collective and distributed processes, there is adaptation, shaped by a certain set of principles, in order to meet anticipation.


This adaptation is not governed by centralized control; however, where the adaptation is going may now be visible. Economic decisions regarding approaches funded by government and private sources have shaped the development of the Internet. Some less than optimal work is to be expected; we are in a trial and error phase in the development of the Semantic Web. This sub-optimal work is now easily recognized. Optimality is defined with respect to some viewpoint, and viewpoints shift, sometimes suddenly. Because of the difficulties involved in software use, and the failures of the current system of software development, one can predict such a shift in the market in the near future. This prediction cannot be precise, but it is based on long-standing theory in the social and economic sciences: an established system that becomes disconnected from reality will produce an appearance of reality for only a certain period of time.


To see how the collective effort is progressing, we may focus on economic motivation and on how this motivation supports, or inhibits, inventions of precursor technology such as semantic extraction, generative encryption, ontological modeling and the like. Such a focus provides insight into how to capitalize on one or more elements of the emerging phenomenon, and in this way participate more fully in the rewards. Because there is a potential shift in the economic model, these precursor technologies may be understood by anyone wishing to invest successfully. Investment, however, creates its own reality, and thus success will likely be seen by those who are lucky and whose insights have suggested caution.


Many large-scale projects have anticipated the back-plate, including some in patent evaluation, pharmaceutical and medical literature identification, medical research, and drug design and manufacture. In each case, underlying precursor technology is used, and used in a way that shows similarity to how the technology is conceived and deployed in other projects. These projects have not produced the critical mass required to break down the old computer science paradigm. We are still, in January 2008, in a pre-shift era.


The reason that IP mapping (patent evaluation) would become important was clear sixteen years ago, in 1992, when the author gave a talk at a private conference at Georgetown University. The talk covered mapping IP evolution and potential technical means available to automate the communication of IP evolution between the university and the marketplace. These concepts led to the BCNGroup Charter mechanism (1994) for mapping IP and distributing the compensation for university based research. Later, in 1996, associated concepts combined with Brad Cox's concept of superdistribution as a means to provide transparency for the IP universe. The original architectures in 1993 involved neural models of cognitive behavior, like selective attention and orientation. Over the years, the concept embraced work derived from Soviet cybernetics and semiotics (1995-1998), and then ontological modeling (1999-2004). In spite of this history, and similar histories involving other innovators, IP mapping is still misused and not performed in any optimal fashion.


Evidence suggests that this evolution of concepts and related technology is similar to that of many other projects, some highly funded in private or classified settings, but none clearly visible. In the material presented below, the author attempts to suggest to the reader some of the core principles on which these concepts and technologies have depended. We leave aside the questions related to the critical mass and the shift.

Ontology emergence and merging


We observe that human interpretation of linguistic patterns is highly situational and based on context. There is nothing technical about this observation, but the underlying mechanisms have been a mystery until recently. The natural properties involved in memory, awareness and anticipation suggest a mediation of linguistic parsing by dynamic ontology. These properties have been lifted into an abstraction and realized as computational process, all the while fully understanding that the computational system is neither intelligent nor alive, but is merely an extension of natural processes involved in natural intelligence. The mediation fits each situation with relationships and associations that are semi-automatically constructed.


A new school of thought about human information exchanges has been born, and is called the second school. [25] A model is suggested that allows computational support for a natural process with which we are all quite familiar.


Our approach merges ontology from past analysis with a type of category theory that applies nuances of nouns, verbs and objects. Taxonomies, controlled vocabularies and web ontology are soon to provide easily understood situational analysis of particulars. The key is that the technology allows normal action-perception cycles as humans interact with the computer. Humans use selective attention and orientation to features as an interpretation and modification behavior. Modification engages the human and results in high quality learning. These principles have computing correlates.

On universals and particulars


The most critical feature of the new environments is getting an ontological model reified from particulars, i.e. universals from particulars. These universals are seen as "not being everywhere realized".


A short discussion about particulars and universals starts with a question. In the moment, what time scale are we in?


Is this even a reasonable question? For reasons that appear hidden, the nature of the particular and its composition from elements of the universal has been a subject of inquiry in all civilizations and in all times. The current time is not an exception.


The consequences of this investigation, in our times, appear in almost every type of belief system and in every form of our science, in whatever system of science one inspects.


As in string theory, there may be more than one conceptual system that accounts well for the phenomena that manifest at the various scales of physical, biological and social activity. Also as in string theory, the evidence about biological and social science may only now be starting to become available. The condition of non-translatability may be expected to separate any one of these systems of thought from the others. One can take the position that non-translatability has something to do with a failure to find the set of universals that apply to everything. On the other hand, one may take the position that human knowledge always has an illusionary nature, and then very timidly suggest that non-translatability between human conceptual systems has a non-removable truth. The paradox in this timid statement escapes no one's attention.


In the language of systems theory, we may say that the expression of ontology in time is, or appears to be, fractal in nature. What this means in pure mathematics is made precise in the works of scholars like Mandelbrot. What it means to me is that the particular is attempting to express itself in the patterns that have formed at slower time scales, while being required to make that expression with patterns expressed at the fast time scales. The particular is sandwiched between universals at two different scales of expression.


Lines of affordance form and create an event horizon with the present moment appearing to be in the center.


There is a contextual frame to abstracted ontology. Context can be managed using terminological reconciliation, and in fact a kind of terminological science that Fiona Citkin pioneered in the Soviet Union (late 1980s and early 1990s, mostly classified). It is quite natural to realize that terminological context is ultimately determined by a consensual relationship between individual humans and the collective agreement. In both the individual and the collective cases, explanatory coherence, i.e. rationality, is involved (Paul Thagard).


A model that corresponds to natural process


Our previous work integrates the above principles into a tri-level architecture designed to reify (create universals from particulars) in real time. The resulting ontological models are complex, in the sense that Rosen defined: Rosen defined as simple any system that has a formal nature similar to Hilbert mathematics. Many of the mechanisms of reality are well modeled by Hilbert mathematics, but a significant number of the mechanisms involved in human knowledge are not. For creating knowledge about these mechanisms we may need to use ontological modeling. The reason may be simple: the evolution of these systems involves the emergence of wholes whose function fits into larger ecosystems of processes. The function can be achieved in many ways, using many different groups of compositional elements. Recognition, processing of stimulus using some internal model, and intentionality seem to be involved even in metabolic processes.


The tri-level architecture is designed to create models of complex phenomena. However, the principles involved in the design of the tri-level have to be justified by some type of verification principle and by consistency with classical science, for example Pribram's work on neuro-architecture.


The three levels are each composed of a set of abstractions. The lower level is a set of semantic primitives, defined statistically and heuristically as semantic frameworks. The upper level is a set of categories, whose definition is a consequence of a prior description of how things evolve. Situational parsers measure the lower category structure and update semantic cover generators. The upper level has to be constructed by some means, and for this we suggest Mill's logic (Prueitt, 1996).
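A minimal data-structure sketch of the three levels may help fix the picture. Every name here is hypothetical, and this is an illustration of the idea rather than the Prueitt-Kugler architecture: the lower level holds primitives, the upper level holds categories, and a middle "situational parser" measures occurrences and updates the cover.

```python
from collections import Counter

class TriLevel:
    """Illustrative tri-level sketch: lower-level primitives, upper-level
    categories, and a middle parser that updates a semantic cover."""

    def __init__(self, category_of):
        self.category_of = category_of   # upper level: primitive -> category
        self.cover = Counter()           # evolving semantic cover (category counts)

    def parse(self, stream):
        """Middle level: measure lower-level primitives in a stream and
        update the cover with the categories they express."""
        for token in stream:
            if token in self.category_of:
                self.cover[self.category_of[token]] += 1
        return self.cover

tl = TriLevel({"run": "action", "jump": "action", "rock": "object"})
cover = tl.parse(["run", "rock", "jump", "run"])
```

In the architecture described above, the upper level would itself be induced by Mill's logic rather than supplied by hand as it is here.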


Once this tri-level architecture is seen, and it has been on the OntologyStream web site since the late 1990s, the programming is simple (four months). Within this period of time we would produce representational knowledge via topic maps over the IP space and display these as mind maps or cognitive graphs. This production capability can be demonstrated in a localized environment in which compression and a generative objects architecture are realized.


New technical means, the n-ary ontological model


The development unfolds in natural steps. These include the automation of linkage between well-specified n-ary ontological models and the process of measuring the categories of invariance in data comprising video, text or other types. We are speaking of categories of invariance, in video and structured data, that parallel linguistic categories. Private work indicates ways to place these categories of invariance into automated processes that produce representational systems about the underlying substructural natures and their possible linkages to behavior.

Section IV: The Back-Plate and Digital Rights Management


The proposed architecture creates a back-plate for the analysis of systems of information, these systems having a localization aspect and a distributed aspect. In a pure back-plate architecture, all information exchanges are restricted to compressed and encrypted objects. Such a system provides a 100% Digital Rights Management solution, simply as a by-product of the back-plate based management of informational objects. The objects, somewhat like waves and particles in physics, have specific features that may be used to settle some hard problems. A back-plate is also simpler in nature than current approaches to service oriented architecture.

The language of compression


There is an opportunity to do something unexpectedly simple with compression/encryption dictionaries.


Any dictionary is composed of a set of ordered pairs:


G = { (c,u) }


where c (the word) stands in for u (the definition), c and u are bit patterns, and statistically the transmission of a linear string composed of occurrences of c elements can be done with fewer bits than the corresponding occurrences of u elements. The trick that we achieve with "ontology mediated digital encryption" is that the compression token be a universal and that the uncompressed tokens be "parts" of particulars. We realize that this problem is the same as the problem of relating linguistic category to semantic category. We also see parallels to work done by Klausner (CoreSystem) and Ballard (Mark II design).
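A toy version of the dictionary G = { (c,u) } makes the bit-saving concrete. This sketch is ours, not the ontology-mediated scheme itself: short escape-delimited codes c stand in for recurring units u, so the transmitted string is shorter, and the receiver restores the original from the same dictionary.

```python
def make_dictionary(units):
    """Build G = { (c, u) }: assign each recurring unit u a short code c."""
    # codes are short tokens delimited by unused control characters
    return {u: f"\x00{i}\x01" for i, u in enumerate(units)}

def transmit(text, G):
    """Replace every occurrence of u by its code c before sending."""
    for u, c in sorted(G.items(), key=lambda kv: -len(kv[0])):  # longest units first
        text = text.replace(u, c)
    return text

def receive(text, G):
    """Restore each code c to its unit u on the receiving side."""
    for u, c in G.items():
        text = text.replace(c, u)
    return text

G = make_dictionary(["ontology", "compression"])
original = "ontology mediated compression of ontology"
sent = transmit(original, G)
```

In the scheme discussed above, the codes c would be universals with ontological content rather than arbitrary indices; the transmission economics are the same.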


My concept of a knowledge-operating environment has a back-plate with remarkable properties, including the ability to create very secure private information spaces. Once the dictionaries have an ontological nature, placing a small reporting loop within the code instruments the generation of the uncompressed/decrypted digital object. The reporting of use then provides a means to enforce property agreements.

The measurement of categorical invariance in data


The tri-level architecture needs input of a nature similar to what we find in cyber security, economic data or textual data. We are encouraged by the increasing success of parsers that perform text based semantic extraction. The author has long talked about the generalization of specific software into a general-purpose toolbox for building measurement devices that extract, and then manage, the invariance in data streams.


The general-purpose toolbox serves to create what are essentially compression tables that store the patterns of bit occurrences, zeros and ones, and then to discover a substructural-to-functional relationship between patterns and functions of composites of these patterns. Extracting patterns is precisely what compression algorithms do. Predicting functional behavior from substructural nature is what Mill's logic does (private work by Prueitt and Kugler, 1997). This predictive function is essential to back-plate architecture. The cubism framework is used by Klausner to give predictability and uniformity to CoreSystem's back-plate. The extensive specification of services within CoreSystem is perhaps the most highly evolved architecture with which a back-plate would be consistent. Other service oriented computing environments are being developed using design principles, but without the concept of back-plate generation and enfolding.
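The substructure-to-function prediction can be illustrated with a simple co-occurrence tally. This is only a Mill's-logic-flavored sketch under our own naming, not the Prueitt-Kugler formulation (which is described as private work): patterns that repeatedly agree with a behavior come to predict it.

```python
from collections import defaultdict

def learn_links(observations):
    """Tally how often each substructural pattern co-occurs with each
    observed function/behavior (a crude 'method of agreement')."""
    links = defaultdict(lambda: defaultdict(int))
    for patterns, behavior in observations:
        for p in patterns:
            links[p][behavior] += 1
    return links

def predict(patterns, links):
    """Score candidate behaviors by summing pattern-to-behavior link weights."""
    scores = defaultdict(int)
    for p in patterns:
        for behavior, weight in links.get(p, {}).items():
            scores[behavior] += weight
    return max(scores, key=scores.get) if scores else None

# each observation: (set of substructural patterns, observed behavior)
obs = [({"a", "b"}, "open"), ({"a", "c"}, "open"), ({"d"}, "close")]
links = learn_links(obs)
guess = predict({"a", "x"}, links)   # "x" is novel; "a" carries the evidence
```

The point of the sketch is only the direction of inference: from recurring substructural invariance toward functional behavior, the predictive step the back-plate depends on.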

Measurement is followed by encoding of data


A second order compression is possible if the elements of the extracted set of patterns are assigned categories. By analogy, words are categorized as nouns, subjects, verbs, objects, etc., and some type of logic is applied so that the information space is represented by a smaller set of categorical elements. The standard for doing this is the Topic Map standard, and the visual interface to Topic Maps is the now popular mind map graphical user interface.
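A minimal sketch of this second order step, with hypothetical names and data: each extracted pattern is replaced by its category, and a per-category inventory preserves the detail, so the stream is represented over a smaller categorical alphabet.

```python
def second_order(tokens, category_of):
    """Second-order compression sketch: replace each extracted pattern by its
    category, keeping a per-category inventory of the original patterns."""
    categories = [category_of.get(t, "other") for t in tokens]
    inventory = {}
    for t in tokens:
        inventory.setdefault(category_of.get(t, "other"), set()).add(t)
    return categories, inventory

category_of = {"dog": "noun", "runs": "verb", "cat": "noun"}
tokens = ["dog", "runs", "cat", "runs"]
cats, inventory = second_order(tokens, category_of)
```

The categorical stream plus the inventory is the kind of representation a Topic Map formalizes; the compression comes from the category alphabet being smaller than the pattern alphabet.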


The Pile theory of Peter Krieg [26] gives us a natural way to hang second order compression onto data lattices. The elements of the first order compression table are arranged into lattices that are self-referential, thus providing the additional compression.


This compression is not a compression of data only, but may become a compression of information under certain circumstances. These circumstances have been explored by several research efforts. We are now ready to create the virtual machine that does the kind of encryption we are talking about.


If the compression is in terms of elements that are present in the compression of data, then an information retrieval paradigm is possible. Again, the correspondence between second school concepts and natural science is illustrative. There is good evidence that the human brain stores the invariance of experience in precisely this way, and that human memory recall proceeds via a mechanism of the type we are proposing, and which has been proposed by others. The underlying index may have the form of the key-less hash table (Gruenwald, 1998; Prueitt, 2003). Karl Pribram's work has suggested a provably optimal operating system for digital environments, one consistent with our back-plate apparatus. The key-less hash is a natural and simple means to implement an architecture corresponding to Pribram's holonomic theory of brain function. [27]
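One reading of the key-less idea can be sketched as a content-addressed store: the invariant structure of the datum itself determines where it lives, so no externally assigned key is needed. This is a hedged illustration of that reading only, not the Gruenwald/Prueitt construction, and every name in it is ours.

```python
class KeylessStore:
    """Sketch of a content-addressed ('key-less') store: the datum's own
    structure is its address, so recall needs no separately assigned key."""

    def __init__(self, n_slots=64):
        self.slots = [[] for _ in range(n_slots)]
        self.n_slots = n_slots

    def _address(self, datum):
        # a crude structural invariant of the datum serves as its address
        return sum(ord(ch) for ch in datum) % self.n_slots

    def put(self, datum):
        slot = self.slots[self._address(datum)]
        if datum not in slot:
            slot.append(datum)

    def contains(self, datum):
        # recall re-derives the address from the datum's structure
        return datum in self.slots[self._address(datum)]

store = KeylessStore()
store.put("pattern-alpha")
```

The character-sum address is a placeholder; the paragraph above suggests the real invariant would be the stored compression-table elements themselves.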

A new retrieval and organizing principle based on Mill's logic


It is not necessary that the representation of information be smaller than the original information. In fact, search starts with a small representation of information and produces a larger representation. Once this is understood, it is possible to have an apparatus for retrieving information in which internal instrumentation records the consequences of each use.


Mill's logic was extended by Soviet era cybernetics to establish plausible evidence that behavioral elements in compressed information are composed of, or can be composed of, specific sets and arrangements of structural invariance, i.e. the elements in compression or encryption dictionaries. Stated in a different way, the set of compression/encryption tokens may be found so that a digital object can be expressed as a string composed from the set { c }. We are fortunate here that the digital object has a structural solution, and that certain features of our program will not depend on a semantic alignment between the compression dictionary and meaning that might be given based on category in the form of ontology.
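Expressing a digital object as a string over { c } can be sketched as a greedy factorization over the token set. The function name and token set are hypothetical; the sketch only shows what "has a structural solution" means operationally.

```python
def enfold(data, tokens):
    """Express a digital object as a sequence drawn from the token set {c},
    matching the longest token at each position. Returns None when the object
    has no structural solution over this substructural alphabet."""
    ordered = sorted(tokens, key=len, reverse=True)
    out, i = [], 0
    while i < len(data):
        for c in ordered:
            if data.startswith(c, i):
                out.append(c)
                i += len(c)
                break
        else:
            return None   # no token covers this position
    return out

seq = enfold("abcabd", ["ab", "c", "d", "abc"])
```

A greedy pass can fail on token sets where a backtracking factorization would succeed, so a full system would need the more careful search; the structural point, that the object is recoverable as a concatenation over { c }, is unchanged.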


Several innovators envision co-evolution of behavior annotation, with compression tokens, in slightly different ways. There are bookkeeping and inferential aspects to applying the Mill's logic to predicting behavior by identifying substructural invariance.


The fractal nature of information


Fractal compression/encryption differs in some respects, but shares a categorical correspondence. This categorical similarity suggests ways to automate semantic linkage within a system that manages compression dictionaries.


A modeling language is needed at each processing node so that any model can be expressed as a composition of the elements of the language.  This concept is the essence of CoreSystem, for example.


The relationship between fractals and natural expression has only started to be explored. One is struck, however, with the idea that if biological expression is subject to self-similarity at different time scales, then one should be able to develop a type of anticipatory algorithm with real time measurement input. This might be applied to anticipating what products a market will want.


The new model based on secure IP management


The MyBank mini-transaction accounting system that Brad Cox developed (1994-1999) using J2EE is a back-plate composed from many individual apparati. We generalize this back-plate concept so that the apparati become generative measurement devices. Information comes to exist on three levels: a substructure and an ultra-structure level shape the emergence of interpretation by the humans involved.


The generalized apparati can "unfold" and "express" in precisely the same way as David Bohm talks about implicate order. The universal is unfolded through the use of a compression table. Then an apparatus produces a manifestation of the object at some other place. The transmission pipes may be very small.


The manifestation is instrumented and can be measured by stakeholders in the digital properties involved. Reading this measurement requires some understanding of intellectual property issues. Certain trends in aggregating intellectual property are seen in the Creative Commons standards, and we recommend the Creative Commons standard. The enfolding process can also be instrumented, and used to provide instrumental means over demand oriented consumer markets.


The issues are practical and simple. The language that we need references an "enfolding" of digital objects into some subset of substructural elements { (c, u) }, where { c } is a subset of a "generation apparatus". The generative apparati are all part of the back-plate, a mechanism that records every use of any information so "enfolded", thus providing the 100% solution to the Digital Rights Management concern.
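A toy rendering of an enfolded object whose every expression is recorded by the back-plate. The class and its ledger are our own illustrative assumptions about what such record-keeping might look like.

```python
class EnfoldedObject:
    """Digital object whose every use is recorded, as the text requires."""

    def __init__(self, payload, owner):
        self.payload = payload
        self.owner = owner
        self.ledger = []            # stands in for the back-plate record

    def express(self, user):
        self.ledger.append((user, "express"))   # every use is logged
        return self.payload
```

Because expression is only possible through the apparatus, no use of the enfolded information escapes the ledger.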

Section V. Bi-lateral Intellectual Property Management


Systems theory is the proper foundation for the management of digital object exchanges. For example, a service may be seen as a type of exchange between systems. Services can then be defined within an ecosystem of interacting systems. What is needed is a ubiquitous, neutral infrastructure with properties that provide optimal transmission and provable security. These properties will serve many purposes, but perhaps none as valuable as the bi-lateral management of intellectual property. Bi-lateral management is between separate entities, and thus the underlying mechanism supporting this management system will reflect natural, and social, reality.


The generation of substructural patterns in data exchanges can be seen to produce a means for universal expression similar to the universal expressive power of human phonetics. With a small set of sounds, spoken language can express almost any type of human communication. Given any one of several methods, a small set of substructural patterns is generated and expressed in the compression/encryption dictionaries (as seen in CoreTalk, Mark III, and other clean slate Internet system designs). The generation process itself involves convolution [28] over many instances of events that occur as systems interact with other systems. Storing the results from convolution mechanisms produces a genealogy over symbol systems (seen in both the Mark III and CoreSystem prototypes).
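A minimal sketch of "convolution over many instances of events": slide a short window over many event streams, aggregate the counts, and keep the most frequent patterns as candidate substructural symbols. This simple counting scheme is an assumption standing in for whatever convolution the cited systems actually use.

```python
from collections import Counter

def substructural_patterns(event_streams, n=2, top=5):
    """Slide an n-wide window over every stream (a discrete,
    convolution-style pass) and return the most frequent patterns."""
    counts = Counter()
    for stream in event_streams:
        for i in range(len(stream) - n + 1):
            counts[tuple(stream[i:i + n])] += 1
    return [pattern for pattern, _ in counts.most_common(top)]
```

Patterns that recur across many interactions survive; rerunning the aggregation as new streams arrive is one way the pattern set could evolve with the community, giving the genealogy the paragraph describes.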


As in the genealogy that likely produced phonetic expression, similarity of parts over many instances is a key mechanism. Also as in the genealogy of phonetic expression, the genealogy of these sets of substructural patterns can evolve to accommodate shifts in the intentionality of the communities involved. The development of generative information technology having a genealogy has application to the scientific understanding of cell and gene expression. This fact exposes an additional market sector where back-plate systems might empower new types of markets.


RosettaNet is an early example of how this might work. CoreSystem is an advanced example, as is the Mark III developed by Richard Ballard's group. A fixed framework such as the Zachman framework has a non-evolutionary set of generative capabilities. The evolution of the generative set, i.e., a substructure, and the generation of expression are the subject of many works, some of which we are familiar with.


The generic back-plate mechanism is itself simple. A convolution is the mechanism by which particulars generate universals. The issue is that the nature of induction, the generation of meaning and the assignment of meaning to symbol sets, has some subtle qualities. How one treats these qualities ends up affecting the agility and usefulness of a service-oriented environment.


So again we reflect on the systems theory approach to the provision of a new infrastructure for service definition in the Internet. Service definition is then seen as an orchestration of a generative process involving universals. The notion of convolution may be used to create an induction of symbol systems at three levels of organization, the middle being the event space of services.


Systems theory can then be seen as a stratification theory (Prueitt). These three levels roughly correspond to human memory, awareness and anticipation. It is true that systems theory of this type is considered beyond the average person's ability to understand; however, a system properly based on a deep understanding of systems theory will behave in a way that is familiar to any human. Thus the theoretical language need not be understood by the market. We only need products that do new kinds of things.


Intellectual property management is often seen as solely the concern of producers of entertainment media designed for mass markets. The generative encapsulated digital object provides a complete solution to the current set of problems for owners of mass-distributed intellectual product such as movies or audio files. The solution is provided through the use of a generative object backed by a back-office banking system for micro-transactions (Brad Cox). The concept is called SuperDistribution and is described in Dr Cox's book of the same name.
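A toy model of the micro-transaction idea: the object may be copied freely, but each use debits a small fee against an account held by the back-office system. The fee, the account shape, and all names are illustrative assumptions, not a description of Cox's actual design.

```python
class MeteredObject:
    """Superdistribution sketch: free to copy, pay-per-use."""

    FEE = 1  # hypothetical micro-fee per use

    def __init__(self, payload):
        self.payload = payload

    def use(self, account):
        if account["balance"] < self.FEE:
            raise PermissionError("insufficient balance")
        account["balance"] -= self.FEE           # micro-transaction
        account["uses"] = account.get("uses", 0) + 1
        return self.payload
```

The point of the design is that distribution and payment are decoupled: revenue accrues from use, so copying the object costs the owner nothing.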


However, there is another side to the intellectual property management concern. Individuals wish to be allowed to actually use the objects for which payment has occurred. Microsoft's bundling acted against this concern by requiring the purchase of many bundled products even when one wanted to use only a few. We also see that purchasing the right to listen to an audio file does not automatically mean that the file will play on the device of choice. A new generation of wireless high-definition devices will require a common transmission standard, and such a standard means device independence.


The need to control one's intellectual property might also extend to controlling with whom the sender communicates by e-mail. E-mail can have the property that any attempt to read it by persons not authorized results in the destruction of the e-mail.
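The destroy-on-unauthorized-read property can be sketched as follows. A real system would enforce this cryptographically rather than in application logic, so this class is purely illustrative.

```python
class ProtectedMessage:
    """E-mail whose body is destroyed on any unauthorized read attempt."""

    def __init__(self, body, recipient):
        self._body = body
        self._recipient = recipient

    def read(self, reader):
        if reader != self._recipient:
            self._body = None               # destroy on unauthorized access
            raise PermissionError("unauthorized read: message destroyed")
        if self._body is None:
            raise PermissionError("message no longer exists")
        return self._body
```

Once an unauthorized reader has touched the message, even the intended recipient can no longer recover it.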


The control over one's information space is even more interesting. In the current markets one has very little control over what kinds of information one may encounter. For example, the Internet is filled with material that is objectionable from any of a number of viewpoints. Consumers want selective attention to some things and not to others. For example, a scholar may wish to have a stream of objects carrying information about certain fields of study. We see this type of service being developed with RSS feeds.


The clean slate Internet will support point-to-point transmission of gEDOs (generative Encapsulated Digital Objects), and nothing else. These objects will each have a high degree of encryption and compression as well as shared substructural (encryption/compression) dictionaries. The shared dictionaries will be composed of sets of data patterns associated with iconic forms that are viewable by humans and to which humans can assign behavioral properties.
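The pairing of dictionary patterns with viewable icons and human-assigned behaviors might look like the following sketch; every name and data value here is hypothetical.

```python
# Shared substructural dictionary: each entry pairs a data pattern with
# a human-viewable icon. Behaviors are assigned separately by the user.
dictionary = {
    0: {"pattern": b"\x01\x02", "icon": "envelope"},
    1: {"pattern": b"\x03\x04", "icon": "lock"},
}
behaviors = {}

def assign_behavior(entry_id, action):
    """A human attaches a behavior to an iconic dictionary entry."""
    behaviors[entry_id] = action

def on_receive(entry_id):
    """Run the assigned behavior when a matching pattern arrives."""
    return behaviors.get(entry_id, lambda: "no behavior assigned")()
```

For instance, after `assign_behavior(1, lambda: "quarantine")`, any arriving object matching the "lock" entry triggers the quarantine behavior.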


The bi-lateral nature of protection for information generated by a single human, or an organization, may have vulnerabilities. This is a question that Stephenson and Prueitt have been working on for some time [29]. The question is left open for now. However, legal protection exists that should overlay the clean slate Internet. Discussions about how this protection might be provided will have to deal with the types of collective intelligence seen in organizations.


There are national security issues, as well as issues related to the Constitutional protection of basic rights to privacy and liberty. The coming paradigm is one where digital objects always have owners, and owners always have the ability to control well-defined and agreed-upon licenses. The concept of ownership is made simpler by treating all of the issues that arise from the nature of agreements about rights. The right of ownership is checked by judicial review in cases where some violation of social agreement is reasonably conjectured.



[1] Prueitt, Paul S (2005) Global Information Framework and Knowledge Management. URL:

[2] Prueitt, Paul S (2007) White Paper Resilience Project. URL:

[3] Prueitt, Paul S. (December 2007) Private document: Research Program.

[4] Gradient descent and steepest descent definition using wiki: URL

[5] Prueitt, Paul S (web publication) URL:

[6] Finn, Victor (1996a). Plausible Reasoning of JSM-type for Open Domains. In the proceedings of the Workshop on Control Mechanisms for Complex Systems: Issues of Measurement and Semiotic Analysis: 8-12 Dec. 1996

[7] CoreTalk URL :

[8] Prueitt, Paul S (2004)  Global Information Framework and Knowledge Management, URL:

[9] Prueitt, Paul S (Oct 2004) See citations in Developing Anticipatory Responses from Thematic Analysis of Social Discourse, proposal to ARDA.


[10] Physicist John Bell: URL :

[11] Approximately 20 M, and 18 to 24 months, is needed for the technology development and test deployment. The first round of about 2 M in expenditures is being sought. There are several potential test environments. After the first successful test deployment, a figure of around 200 M is envisioned to capitalize the intellectual property and branding language. This is a business activity, which the author will not address. His role is as senior architect and senior scientific consultant. His purpose is to clearly define the paradigm, using his own work and his working knowledge of around 20 others, all of whom are also consulting scientists, reporting to him.

[12] Peirce, C. S. wiki definition: URL:

[13] Erl, Thomas (2005) Service-Oriented Architecture. Prentice Hall

[14] The Second School web site: URL

[15] Rosen, Robert Wiki definition URL:

[16] Gibson J. J. URL:

[17] Benjamin Whorf: Wiki:

[18] Cox, Brad (1991) Object Oriented Programming: An Evolutionary Approach. Addison Wesley

[19] Cubism: Wiki definition: URL

[20] Prueitt, Paul S (1999): Interpretation of the logic of J. S. Mill, in Foundations for Knowledge Science in the Twenty-first Century on line book by Prueitt. URL:

[21] Unified Logical Vision of C. S. Peirce

[22] As in so many other cases, the language is actually misleading. The digital object produces a seed that is then moved to a new location, and the object is grown in the new location. In biology, the reproduction of the phenotype from seeds depends on there being a genotype and an environment; the phenotype is the particular expression of the genotype in a specific environment and place.

[23] Ballard, Richard. The design of the Mark III is only partially public. URL:

[24] Schema logic as seen in the SchemaLogic software suite developed by Brianna Anderson URL:




[25] Second School URL:

[26] Krieg, Peter. The Pile System architecture.

[27] Pribram, K. H. (1991). Brain and Perception: Holonomy and Structure in Figural Processing. Hillsdale, NJ: Lawrence Erlbaum Associates

[28] Convolution: wiki definition: URL

[29] Prueitt, Paul and Peter Stephenson. "Towards a Theory of Cyber Attack Mechanics." First IFIP 11.9 Digital Forensics Conference. Orlando, FL, 2005.