Report at the Summer University Complex Systems course 2001, Central European University, Budapest, organized by the Santa Fe Institute, Santa Fe, New Mexico, USA.
Evolution, Cybernetics and Philosophical Theory of Knowledge
"A cybernetic system makes predictions in order to achieve certain goals, ultimately survival and proliferation. True knowledge is an instrument of survival. Knowledge is power."
From its very beginning, the evolutionary theory of Charles Darwin (1809-1882), as a major scientific achievement, had an impact on the philosophical investigation of human cognition. Perhaps the first to use evolutionary ideas in a philosophical framework was Herbert Spencer (1820-1903). Attempts to apply evolutionary approaches in philosophy can also be seen in the contemporaneous works of Simmel and Baldwin. In the 20th century, significant investigations into the application of the evolutionary approach to the problems of epistemology began. This methodology is reflected in the works of Konrad Lorenz, Jean Piaget, Karl Popper, Donald Campbell and Stephen Toulmin.
The initial provisions of the evolutionary approach to epistemology rest on the idea that the gaining of knowledge is a continuation of, and therefore analogous to, the process of biological evolution. Thus the main criterion of true knowledge is its capability to solve the problems faced, just as the adaptation of a particular species is measured against its environment. From these provisions it follows that absolutely true knowledge does not exist: just as no species is ideally adapted to its environment, no knowledge is perfect enough to solve a given problem entirely. The use of evolutionary ideas therefore makes epistemology more pragmatic and, in a definite sense, fallibilistic. Objectivity is achieved by selecting adaptive knowledge, which ensures the solution of the problem and is chosen in interaction with the environment.
On the other hand, the appearance of such universal scientific approaches as General Systems Theory and Cybernetics, associated with the names of Ludwig von Bertalanffy, Kenneth Boulding and Norbert Wiener, gave rise to attempts to apply systems and cybernetic approaches to the philosophy of knowledge. Owing to their ability to analyze phenomena of any nature, their high level of abstraction and their capacity for formalization, the tools of these sciences look rather promising.
From the cybernetic viewpoint the world appears to be a collection of interacting control systems, each of which has a goal it is directed to attain. The subject of knowledge is itself one such system and comprehends the surrounding reality in the light of its own purposes and its internal model of the world.
What is knowledge?
The history of the notion of knowledge from the Greeks to our time shows that static theories have gradually been replaced by dynamic, evolving ones. Plato assumed that knowledge is represented as a world of absolute and universal ideas and forms existing separately from the cognizing subject. Aristotle held a similar view and worked out empirical and logical methods of gaining knowledge. Starting from the Renaissance we can find two main epistemological ideas: empiricism, which sees knowledge as the result of sensory perception, and realism, which sees knowledge as a reflection of the surrounding world.
Nowadays, in almost every experimental science, the obtaining of results is viewed from positions similar to empiricism. Most people in everyday life also think in this style, whether aware of it or not. This approach to cognition is called the reflection-correspondence theory. According to this theory, knowledge is a passive reflection of the world. The mind is seen as a storage of images of external objects, obtained through sensory input. Thus knowledge does not exist like Plato's ideas but is received by the subject through the observation of certain phenomena. It is believed that the world teaches us, and that the mind is like a camera that simply records the objects and events of the external world. It is also assumed that the images gained through this process are identical, in some sense, to the really existing objects. In the ideal case our knowledge of a phenomenon would be full or absolute, but in practice this cannot be achieved, due to measurement errors.
The next step after the naive reflection-correspondence theory was the Kantian synthesis of empiricism and realism. According to Kant, we have definite a priori categories in our minds, which form the basis for the cognition of surrounding reality. These categories cannot be derived from any experience and are given to us from the beginning. The subject gains new knowledge using such categories as space, time, objectivity and causality.
At the turn of the 20th century a pragmatic view of epistemology appeared in philosophy. Features of pragmatism can be found in logical positivism, conventionalism, and the "Copenhagen interpretation" of quantum mechanics. Pragmatic epistemology sees knowledge as a set of theories or models, each representing a description of some phenomenon or class of phenomena and intended for solving certain problems. Classes of phenomena described by different theories may overlap, so the theories may contradict one another. We can also propose a number of ways to solve a given problem. The criterion of knowledge quality is then how easily we can obtain a successful answer to the problem using the given knowledge. Pragmatic theories (models) have to meet the following criteria:
- they should return correct (verifiable) predictions of the behavior of the observable phenomenon, since otherwise it will be impossible to solve the tasks coupled with it;
- they should be as simple as possible.
Absolute reality is nonsense from the point of view of pragmatic philosophy; we can only have particular models of the world.
The pragmatic approach gives no answer as to where the models come from. Usually, pragmatic theories assume that new knowledge is built by combining pieces of already existing knowledge through trial and error, complemented by some heuristic or intuition.
One way to deal with this problem is offered by constructivism, which is based on the Kantian synthesis of empiricism and realism. Its main notions, especially in its radical form, may be expressed as follows.
Constructivism, especially in its radical forms, maintains that knowledge is built by the subject. Therefore there are no universal categories, objects or structures in our mind a priori, and no objective empirical experience or facts exist either. The notion of a reflection of reality in the mind is impossible in the framework of constructivism. Knowledge is only weakly coupled with reality and strongly dependent on the subject. All this leads to relativism, in other words, to the equivalence, from the subject's point of view, of different models of a given phenomenon. There are no criteria to distinguish true or adequate knowledge from false knowledge, or to compare a number of alternative theories of a phenomenon.
Two main approaches can be found in constructivism for resolving the problem of knowledge relativity. The first suggests using the coherence of new knowledge with the knowledge already owned by the subject as the criterion of knowledge value. If it is impossible to include the new knowledge in the existing worldview, such knowledge is ignored; if, on the other hand, we can build the new knowledge upon our previous knowledge, we always do. The second approach, which we can call social constructivism, takes society as the criterion for knowledge selection. It proposes that knowledge is developed through communication within a community on the basis of the notions that exist in society. Since the construction of knowledge in our society obviously takes place on both the individual and the social level, it is reasonable to use both criteria at the same time. In addition, a number of further knowledge selection criteria can be proposed that help measure its reliability.
Constructivism focuses on knowledge in its relation to subject and reality, but it says nothing clear about how new knowledge originates. To deal with this problem we can use the method suggested by evolutionary epistemology, which analyzes the rise of novel knowledge on the basis of the Darwinian theory of evolution. It holds that the subject (or a group of subjects) is forced to construct new knowledge in order to adapt to the environment in the broad sense. In the evolutionary-epistemology framework, knowledge is built up through blind variation and then undergoes selection under the pressure of internal (e.g. coherence) or external (e.g. experimental data) factors. It is also remarkable that from this viewpoint, processes of knowledge development can be treated in a similar way at different levels of organization, from the organismic (the formation of new organs in biological speciation) to the social (the forming of new scientific theories). The external world thus returns into play, and knowledge again becomes grounded in surrounding reality.
Currently the most recent and rather interesting branch of evolutionary epistemology is memetics [6, 9]. In memetics, pieces of knowledge are represented as autonomous entities detached from the subject. Such entities are called memes. We can thus interpret scientific theories, religions or fashions as kinds of memes. If we start exploring our mind, we find that it is a medium filled with particular memes. When we communicate with each other we exchange copies of memes (meme replication), but during communication there is always some probability of error, which can be seen as meme mutation. Naturally, the memes (or ideas) with the highest replication rate and the greatest tolerance to mutations (or meaning distortions) will dominate in society. Memetics allows us to study the properties of knowledge in abstraction from the cognizing subject and gives us the opportunity to find some general traits of knowledge development.
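The replication-with-mutation process described above can be sketched in a few lines of Python. This is only a toy illustration, not a model from the memetics literature: the meme is a character string, the alphabet and error rate are invented, and "mutation" is a random character substitution during copying.

```python
import random

def transmit(meme, error_rate, rng):
    """Copy a meme character by character; each character may mutate
    (a 'meaning distortion') with probability error_rate."""
    alphabet = "abcdefghijklmnopqrstuvwxyz "
    return "".join(
        rng.choice(alphabet) if rng.random() < error_rate else ch
        for ch in meme
    )

rng = random.Random(42)  # seeded for reproducibility
meme = "knowledge is power"
copies = [transmit(meme, error_rate=0.05, rng=rng) for _ in range(5)]
# Most copies are faithful; an occasional character differs (mutation).
```

A meme whose copies mostly survive such distortions, and which listeners are eager to pass on, is exactly the kind of meme the text predicts will come to dominate.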
Let us take a deeper look at the topic of evolutionary epistemology.
Karl Popper, one of the most famous philosophers of the 20th century, made the first deep and systematic application of evolutionary ideas to the philosophical theory of knowledge.
What is the basis of evolutionary epistemology according to Popper? It consists of two main provisions. The first is that the specifically human ability to cognize, as well as to produce scientific theories, is the result of natural selection and is tightly coupled with the evolution of specifically human language. The second is that the evolution of scientific theories moves towards ever more successful and precise ones. The selection of scientific theories is analogous to the selection of organisms: both are based on the criterion of fitness. All living beings are also problem solvers; problems arise together with the origin of life.
Popper proposes the following sketch of the evolution of theories. When a particular problem arises, it leads to a number of attempts to solve it. Many tentative theories are generated, and each of them is then critically examined, checked for errors, and so on. This process is analogous to Darwinian selection. A given theory is considered true as long as no errors have been found in it. This is the essence of the Popperian critical method. As we see, this method requires a theory to be falsifiable; in other words, there must at least be the possibility of finding an error in the theory, otherwise it is nonsensical.
When one problem has been solved, the solution gives rise to a number of new questions, and the whole sequence has to be repeated. The evolution of theories can thus be represented in the following symbolic form:
P1 -> TT -> EE -> P2,
where P1 is the initial problem, TT the tentative theories, EE error elimination, and P2 the new problems.
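The schema above can be read as a selection loop. The sketch below is a toy Python rendering, not Popper's own formalism: the function names, the sample problem (finding divisors) and the refutation test are all hypothetical stand-ins chosen for illustration.

```python
def popper_cycle(problem, conjectures, refuted):
    """One pass of P1 -> TT -> EE -> P2: tentative theories (TT)
    face error elimination (EE); the survivors are retained, and
    each survivor raises new problems (P2)."""
    survivors = [t for t in conjectures(problem) if not refuted(problem, t)]
    new_problems = [f"why does {t!r} work for {problem!r}?" for t in survivors]
    return survivors, new_problems

# Toy run: the problem is to find divisors of 12; the conjectures are
# blind guesses 1..10, and a conjecture is refuted (eliminated) if it
# fails the division test.
survivors, p2 = popper_cycle(
    12,
    conjectures=lambda n: range(1, 11),
    refuted=lambda n, t: n % t != 0,
)
# survivors -> [1, 2, 3, 4, 6]
```

Note that the surviving conjectures are only "true so far": a later, sharper refutation test could still eliminate them, which is exactly the fallibilism the critical method demands.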
Popper gives a striking example to illustrate evolutionary epistemology. He says that it is only one step from the amoeba to Einstein. Both the amoeba, which has to survive in given conditions, and Einstein, who develops physical theories, solve their problems (P) by the method of trial (TT) and error elimination (EE). What, then, is the difference between them? Popper answers that the difference lies in the manner of error elimination. The amoeba does not carry out the elimination process itself; the amoeba's errors are eliminated through the elimination of the amoeba itself; this is natural selection. Einstein, on the other hand, has a language and can use it to find errors in his own theories. Popper emphasizes that the main difference between humans and other species is the existence of language.
Language is an instrument for abstracting knowledge from reality. We use language to create a medium where tentative theories can be tested. People create linguistic models of external reality in their minds, which helps them to verify theories without direct interaction with the world.
It is important to note that, from the viewpoint of evolutionary epistemology, not only humans can have knowledge. Knowledge is viewed in a broader sense, so that any adaptation can be interpreted as knowledge. This is a very fruitful idea, which allows the construction of a unified epistemological theory for all living creatures.
Another philosopher who made an important contribution to the evolutionary theory of knowledge was Donald Campbell; he also introduced the term evolutionary epistemology. Campbell proposed the following three main notions:
1. the principle of blind-variation-and-selective-retention, which notes that at the lowest level, the processes that generate potential new knowledge are "blind", i.e. they do not have foresight or foreknowledge about what they will find; out of these blind trials, however, the bad ones will be eliminated while the good ones are retained;
2. the concept of a vicarious selector: once "fit" knowledge has been retained in memory, new trials do not need to be blind anymore, since now they will be selected internally by comparison with that knowledge, before they can undergo selection by the environment; thus, knowledge functions as a selector, vicariously anticipating the selection by the environment;
3. the organization of vicarious selectors as a "nested hierarchy": a retained selector itself can undergo variation and selection by another selector, at a higher hierarchical level. This allows the development of multilevel cognitive organization, leading to ever more intelligent and adaptive systems. The emergence of a higher-level vicarious selector can be seen as a metasystem transition.
The vicarious selector is an instrument for knowledge development, which emerged through the adaptation of organisms to the environment during evolution. This conception is a significant step in understanding the nature of knowledge. Animal instincts, Kantian a priori ideas, individual living experience and human culture can all be considered examples of vicarious selectors. Vicarious selectors are produced at different levels of development of cognizing systems (from primitive organisms to human society), forming a hierarchy in which some selectors have to fit others. A tentative hypothesis has to pass through the whole hierarchy before the subject uses it in interaction with the environment. The whole nested hierarchy of vicarious selectors can be seen as the model of the world in the mind of the subject.
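The idea that a tentative hypothesis must pass a whole hierarchy of internal selectors before facing the environment can be sketched as a simple filter chain. The Python below is a toy illustration under invented assumptions: hypotheses are integers, and the three selectors are arbitrary predicates standing in for innate categories, learned experience and cultural norms.

```python
def passes_hierarchy(trial, selectors):
    """A tentative hypothesis survives only if every vicarious selector
    in the nested hierarchy accepts it; only then would it be tried
    against the environment itself."""
    return all(select(trial) for select in selectors)

# Hypothetical selectors, ordered from innate to cultural.
selectors = [
    lambda t: isinstance(t, int),  # e.g. an innate category: expect a number
    lambda t: t > 0,               # e.g. learned experience: must be positive
    lambda t: t % 2 == 0,          # e.g. a cultural norm: must be even
]
candidates = range(-3, 7)
vetted = [t for t in candidates if passes_hierarchy(t, selectors)]
# vetted -> [2, 4, 6]
```

The point of the hierarchy is economy: most blind trials are discarded cheaply inside the subject, so that only a few vetted hypotheses ever incur the cost, or the danger, of an environmental test.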
We have begun to talk about models, and it is time to turn to the cybernetic interpretation of the evolutionary theory of knowledge.
Studying the theory of knowledge in an evolutionary framework, we deal with the problem of how the external world and knowledge interact, and of how new knowledge is built. Having touched upon the organization of knowledge, we have mentioned that knowledge can be represented as a nested hierarchy of vicarious selectors. Let us now look into the question: what is the structure of knowledge?
What is the essence of knowledge for a living organism? How can knowledge be used by it? The distinguishing feature of life is purposefulness. From the teleological point of view, knowledge is an instrument used to achieve a certain goal. It is an instrument needed to control one's own state and the state of some part of the surrounding world, and this control is performed in order to reach the target state. It is natural to consider the evolution of living systems as the development of hierarchical systems of control. This was done by Valentin Turchin in his book The Phenomenon of Science [5]. A similar idea can be found in Popper's fundamental work The Logic of Scientific Discovery [4].
In the middle of the last century, cybernetics arose out of the necessity to investigate problems of communication and control. What is the core idea of the cybernetic approach to the theory of knowledge? Here we quote Turchin:
We refrain from identification of our theories with reality. We see theories as no more than certain ways of organizing and partially predicting the flow of our perception. We have found that indeterminacy is in the nature of things. With the advent of cybernetics and computers, we started creating models of human perception and systematically exploring various ways of the organization of sensory data. Thus the emphasis shifted from matter to organization, from hardware to software. We now regard organization as more fundamental and primary aspect of things than their material content, which, after all, is nothing else but one of the notions we use to organize sensory data. Therefore, we call our basic philosophy cybernetic.
From the viewpoint of cybernetic epistemology, knowledge is a dynamic model of the environment in the mind of the subject (a cybernetic system). The system needs to produce a definite action to reach the goal state, and knowledge is required to choose the right action. Knowledge must give the ability to predict the result of a certain action under certain environmental conditions before this action is produced. Therefore the system necessarily needs a model to generate predictions.
Fig. 1. Modeling scheme.
The modeling scheme shown in the figure represents the relation between the dynamics of the world and internal modeling in the system. The first representation, some internal state of the cybernetic system, is used to generate a prediction. This computed prediction is a new representation, which is expected to correlate with the future state of the world. Modeling is used to choose the action which, when fed into the model as an input, returns the representation closest to the desired state. Such a modeling scheme is rather universal and can be applied to almost every cognition process. Turchin describes this universality in the following way:
This definition describes equally well the case of a dog catching in flight a piece of sugar, and an astronomer who computes the position of a planet in the sky. In the first case, the model is built in the material of nervous cells of the dog's brain, in the second case from the signs that the astronomer writes on paper when he makes computations.
In some cases the action chosen through modeling may be empty: the system performs no action and just observes events in the outside world.
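The modeling scheme just described, including the empty action, can be rendered as a minimal Python sketch. Everything concrete here is an invented assumption: the world state is a single number, candidate actions are shifts of that number, the internal model predicts perfectly, and None stands for the empty (observe-only) action.

```python
def choose_action(state, actions, model, goal, distance):
    """Cybernetic modeling scheme: run each candidate action through the
    internal model and pick the one whose predicted representation lies
    closest to the desired (goal) state. The empty action None means the
    system merely observes."""
    return min(actions, key=lambda a: distance(model(state, a), goal))

# Toy world: the state is a number and actions shift it.
actions = [None, -1, +1, +2]
model = lambda s, a: s if a is None else s + a  # prediction of the next state
action = choose_action(state=3, actions=actions, model=model,
                       goal=5, distance=lambda x, y: abs(x - y))
# action -> 2 (its predicted state, 5, matches the goal exactly)
```

Had the goal been the current state itself, the empty action None would have won the comparison, which is precisely the observe-only case mentioned above.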
This modeling scheme cannot be applied, however, when the observation or modeling processes themselves affect the dynamics of the world. The assumption that they do not, called non-interference, holds easily for macrosystems, but for quantum, social and psychological phenomena the situation is more complicated. In these cases the model must include a description of the interrelation between the environment and the system itself, i.e. it must be recursive.
Generated predictions do not necessarily have to be confirmed by direct observations. Predictions which cannot be verified directly may be used in the generation of other predictions. We can thus see knowledge as a hierarchical generator of predictions.
In the hierarchical generation of knowledge (of models), models on the lower levels are produced from models on the higher levels; the model higher in the hierarchy of generators is the more abstract one. We can evaluate abstraction in two ways: by its scope and by its level. The wider the area of applicability of the predictions generated with the aid of a given model, the more abstract it is. The level of abstraction can be defined in the following way: "The level of an abstraction is the number of metasystem transitions involved. In the modeling scheme, the subject-of-knowledge system S is a metasystem with respect to the world. Indeed, S controls the world: it takes information from the world as input to create a representation, processes it and chooses a certain action a, and executes it at the output, changing thereby the state of the world. The brain of S, as the carrier of world representations, is on the meta-level, a 'meta-world', so to say. Assigning to the world level 0, we define the abstractions implemented in the brain as abstractions of the first level." Then the next metasystem emerges, for which the first level of abstraction is represented as the world, so abstractions of the second level appear, and so on, following the complexification of the cognizing system.
Knowledge can be seen as factual or theoretical according to its level of abstraction. "If the path from a statement to verifiable predictions is short and uncontroversial, we call it factual. A theory is a statement which can generate a wide scope of predictions, but only through some intermediate steps, such as reasoning, computation, the use of other statements. Thus the path from a theory to predictions may not be unique and often becomes debatable. Between the extreme cases of statements that are clearly facts and those which are clearly theories there is a whole spectrum of intermediate cases." From this Turchin derives a criterion for the evaluation of theories which is equivalent to Popper's critical method: "Since theories usually produce an infinite number of predictions, they cannot be directly verified. But they can be refuted. For this end it suffices to find one false prediction. And we should accept critically both facts and theories, and re-examine them whenever necessary."
There is no unified view of the problem of knowledge in modern philosophy. It is fragmented and often inadequate to the up-to-date scientific worldview. Evolutionary-cybernetic epistemology proposes its own approach to the construction of an integrated theory. In its framework man, as a cognizing entity, is seen as part of the continuing process of the evolution of nature. Through evolution, living organisms gain ever more sophisticated instruments for modeling processes in the environment: single receptors in the beginning, then neural networks, the brain, and finally human language. People use all these instruments in their purposeful activity. The main criterion of the truth of knowledge is its utility for the achievement of a certain goal. Knowledge is life.
1. Campbell D.T. Evolutionary Epistemology. In: Schilpp P.A. (ed.) The Philosophy of Karl Popper. Open Court, La Salle, IL, 1974, pp. 413-463.
2. Popper K.R. Evolutionary Epistemology. In: Pollard J.W. (ed.) Evolutionary Theory: Paths into the Future. Wiley, Chichester, 1984, pp. 239-254.
3. Popper K.R. Natural Selection and the Emergence of Mind. In: Radnitzky G., Bartley W.W. III (eds.) Evolutionary Epistemology, Theory of Rationality, and the Sociology of Knowledge. Open Court, La Salle, IL, 1987, pp. 137-155.
4. Popper K.R. The Logic of Scientific Discovery. Basic Books, New York, 1959.
5. Turchin V.F. The Phenomenon of Science. Columbia University Press, New York, 1977. http://pespmc1.vub.ac.be/POSBOOK.html
6. Journal of Memetics - Evolutionary Models of Information Transmission. http://www.cpm.mmu.ac.uk/jom-emit
7. Heylighen F. Epistemological Constructivism. In: Heylighen F., Joslyn C., Turchin V. (eds.) Principia Cybernetica Web. Principia Cybernetica, Brussels, 1997. http://pespmc1.vub.ac.be/CONSTRUC.html
8. Heylighen F. Objective, Subjective and Intersubjective Selectors of Knowledge. Evolution and Cognition, Vol. 3, No. 1, 1997, pp. 63-67. http://pespmc1.vub.ac.be/papers/knowledgeselectors.html
9. Heylighen F. Evolution of Memes on the Network: From Chain-Letters to the Global Brain. In: Stocker G., Schopf C. (eds.) Ars Electronica Festival 96. Memesis: The Future of Evolution. Springer, Vienna/New York, 1996, pp. 48-57.
10. Heylighen F. Evolutionary Epistemology. In: Heylighen F., Joslyn C., Turchin V. (eds.) Principia Cybernetica Web. Principia Cybernetica, Brussels, 1995. http://pespmc1.vub.ac.be/EVOLEPIST.html
11. Principia Cybernetica Project. http://pespmc1.vub.ac.be
12. Turchin V. On Cybernetic Epistemology. Systems Research, Vol. 10, No. 1, 1993, pp. 3-28.