Stratification and the Measurement of Non-locality
In spite of the title, this paper addresses the issue of real-time measurement of events without a technical discussion. The focus is historical and appeals to concepts that are well known. From our school days we are all aware that for every action there is an equal and opposite reaction. There are other essential concepts from classical science that every adult knows. These concepts are shared as part of what the philosopher of science Paul Churchland refers to as “folk psychology”. Churchland points out that our folk psychology influences the nature of science as well as the direction of public funding.
Since the early 1900s there have been developments in physics, the foundations of mathematics, biomathematics, and theories of interacting systems. These developments have been paradoxical. On the one hand, twentieth-century science and engineering reaffirm Newtonian laws and the universals of Hilbert mathematics. On the other hand, we find that electromagnetic and quantum field expression creates effects that are provably non-Newtonian in nature. Science also finds that the academic fields of social science and psychology have difficulty reducing observations about human behavior to Newtonian and Hilbert formalisms. In the early 1960s a school of biomathematics was formed at the University of Chicago. Isolated work on field mechanics suggested models of reality that are both locally and non-locally focused. The dynamics of complex interacting systems entered some academic circles; however, funding was hard to obtain. We conjecture that the funding problems have been due to the commonly held notions that Professor Churchland describes.
Social and cultural events have undergone radical upheavals since the early 1960s. One consequence of the American national response to the Cold War, and later to the War on Terrorism, was to focus national intelligence technology on “what works” programmatically. There was excitement about cybernetics in the 1960s, then expert systems in the 1970s and 1980s, and then full artificial intelligence. In AI the assertion is made that a computer program could replace human intelligence almost completely. Smart bombs and distributed intelligent agents and agencies appeared to confirm a Newtonian architecture for information science.
What became apparent only over several decades was that tools grounding artificial intelligence in Newtonian concepts were not improving in proportion either to the passage of time or to government expenditure.
The claim is that non-locality is a real part of the physical world but cannot be completely modeled using Newtonian concepts. The claim was made strongly by Sir Roger Penrose in his 1989 and 1994 books on the brain and computation. History shows that Penrose’s argument was strongly belittled by a certain group of powerful academics, and most of the program managers in military agencies followed those academics’ demands not to fund alternatives. The rejection of any form of non-locality has been very strong. However, the issues raised by biomathematics and by the new ontological models of social processes, cell signal pathway expression, and gene expression continue to suggest that any modeling based solely on the localized nature of Newtonian concepts will miss critical aspects. Social expression, such as the desire to conduct terrorism, seems beyond what is commonly funded by program managers at the military research agencies, In-Q-Tel, NIST, NSF, and DARPA.
My work was proposed, and nearly accepted, in seven major proposals between 1999 and 2004. The work is grounded in published scientific work that treats causation as having non-local effects as well as localized mechanics. The term I developed is “stratified ontology”. In a stratified ontology it is necessary to have knowledge of the characteristic behaviors of the expressions occurring at several organizational levels (of local interactions). The idea is simple. Reaction mechanics, Newtonian in nature, is always part of the set of causes of any expression. However, during special times, such as when human decisions play a strong role, new causes emerge from outside the action-equals-reaction model.
Stratified ontology involves the use of a field of formal investigation called quasi-axiomatic theory, developed within the cybernetics school of the former USSR, as well as certain machine algorithms referred to as semantic extraction. Although objective review will show that stratified ontology has a well-worked-out formal framework and is well grounded in the published scientific literature, this work has never received any funding.
Your consideration of this matter is appreciated.
Paul S Prueitt
PhD (pure and applied mathematics; quantum cognitive neuroscience)