"Cybernetics is the theory of command, control and
communication in the animal and in the machine" (Norbert Wiener).
"The basic ideas of cybernetics can be treated without any reference to electronics and are
without any reference to electronics and are fundamentally simple" (William Ross Ashby).
"The diversity of nature can be explained in the cybernetic framework" (Heikki Hyötyniemi).
The Cybernetic Theory
The origin of a new science
The word "cybernetics" comes from the Greek word "kybernetes", which means "art of steering or steering a ship" as a pilot or helmsman. From the Greek term comes the Latin "gubernator", with a similar meaning. Plato used this term in The Republic with the meaning of "art of governing", and also as "art of leading men" and, in general, "art of guiding".
The word "cybernétique" was also used in 1834 by the physicist André-Marie Ampère to refer to the sciences of government (or art of governing in politics) in his famous "Essay on the Philosophy of the Sciences," where he established a system for classifying human knowledge. "The future science of government should be called 'la cybernétique'."
In 1938, the mathematician Louis Couffignal published a pioneering article in the magazine Europe in which he speculated on the possibilities of machines capable of making their own decisions and even of replacing man. He also drew analogies between the human nervous system and the internal structure of machines.
Cybernetics as a science began in 1948 with the publication of the mathematician Norbert Wiener's work "Cybernetics: or Control and Communication in the Animal and the Machine". This work became a best seller and initiated one of the most spectacular and influential scientific movements. Wiener is considered the father of cybernetics.
Wiener's cybernetics is an abstraction inspired by practical problems raised during World War II, specifically the need to develop automatic control mechanisms for military air defense equipment. Wiener constructed a general theory of organization and control systems, with the concept of feedback (associated with the achievement of a goal) as its most important foundation, a general concept also present in nature, in living beings and in social systems.
The definitions
There are many definitions of cybernetics:
"The whole field of the theory of command and communication, both in the machine and in the animal." This was Wiener's original definition, which he later extended to other fields, especially to that of the human sciences.
The interdisciplinary study of self-regulating systems based on feedback and interaction with the environment.
The study of communication and control systems in machines, organizations and living organisms.
The study of teleological systems (from the Greek "telos", goal or objective).
"The branch of mathematics that deals with problems of control, recursion, and information" (Gregory Bateson).
"The study of the organizational relationships that the components of a system must have in order to exist as an autonomous entity" (Heinz von Foester).
"The science of effective organization" (Anthony Stafford Beer, expert in organization theory).
"The art of ensuring the effectiveness of action" (Louis Couffignal).
"The study of systems and processes that interact with themselves and produce themselves from themselves" (Louis Kauffman).
"The science that studies the abstract principles of organizations in complex systems" (Francis Heylighen & Cliff Joslyn).
"A family of approaches to studying complex systems" (Heikki Hyötyniemi).
The general science of organisms, whether natural or artificial.
An abstract model of a living thing, or the art of imitating a living thing.
From all these definitions we can extract three fundamental concepts, which correspond to universal characteristics of many systems, including natural ones:
Feedback: the reinsertion of a system's output as input, based on the interaction of the system with its environment.
Teleology: the orientation of a system to the achievement of a certain goal.
Self-regulation: the ability of a system to control itself autonomously in order to achieve its goal.
Feedback
When the environment of a system changes, the system changes to adapt to the new conditions of that environment, always in a goal-oriented way.
In feedback, the output (or part of it) produced by a system becomes input. What is called a "feedback loop" occurs. According to Norbert Wiener, "Feedback is a method of controlling a system by which the results of its past performance are reinserted into it." Feedback is considered "the secret" of nature.
It is necessary to differentiate between a linear process and a process with a feedback loop:
In the linear process, the system acts in a fixed way, with no possibility of adapting to changing circumstances in the environment. Examples: 1) a computer program that acts based solely on input data; 2) a dictatorship, which is a mechanism without a feedback loop.
In the process with feedback loop (or cyclic), the system takes into account the environment and modifies its behavior. Examples: 1) democracy is a system with a feedback loop constituted by the practice of voting and the expression of public opinion; 2) interpersonal systems, which can be understood as feedback loops, since each person's behavior affects the others and in turn his behavior is affected by the others.
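The contrast can be made concrete with a minimal sketch (an illustrative Python fragment added here, with toy dynamics that are assumptions, not part of the original text):

# Open-loop (linear) vs. closed-loop (feedback) control of a room temperature.

def open_loop(temp, heater_power, steps=5):
    """Linear process: a fixed action, blind to the environment."""
    for _ in range(steps):
        temp += heater_power - 0.5        # 0.5 = constant heat loss
    return temp

def closed_loop(temp, goal, steps=5):
    """Feedback loop: the measured output is reinserted as input."""
    for _ in range(steps):
        error = goal - temp               # deviation from the goal
        temp += 0.5 * error - 0.5         # act on the deviation, minus heat loss
    return temp

print(open_loop(15.0, 1.0))       # ends wherever the fixed action leads (17.5)
print(closed_loop(15.0, 21.0))    # approaches the goal of 21 degrees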
There are several types of feedback. The most prominent are negative feedback, positive feedback, and compensated feedback:
In negative feedback, a variation in the output quantity causes, in the next cycle of the process, a variation of that output quantity in the opposite direction. This feedback produces convergence, bringing the system closer to its goal, a stable final state. For example, a projectile that rectifies its own trajectory according to location information about a moving target in order to achieve its goal.
In positive feedback, a variation in the output quantity causes, in the next cycle of the process, a further variation of that output quantity in the same direction. This feedback produces divergence, moving the system away from any stable final state. Examples in nature: forest fires, bacterial growth in a favorable environment, a snowball rolling down a slope, nuclear chain reactions, etc. A psychological example: Pepe smokes when he is nervous; he worries about the effects of smoking on his health, which makes him more nervous and smoke more; he enters a "vicious spiral". A physical example is acoustic coupling (the Larsen effect): the sound reproduced by the loudspeakers is picked up again by the microphone and amplified again, producing the well-known squeal. A political example is the arms race. Spiral development is another example of positive feedback [see Appendix - Spiral Development].
Positive feedback does not grow or diverge indefinitely. Eventually a constraint is imposed by the environment (the available resources) and negative feedback appears. For example, rabbits that breed and eat grass: the more they breed, the less grass per head there is; in the end the limited supply of grass causes them to breed less.
In compensated feedback, the controller applies positive or negative feedback, depending on the circumstances, to keep the system stable. For example, a thermostat, a refrigerator or an air-conditioning system, all of which must maintain a stable temperature within limits.
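Numerically, the difference between the two signs of feedback can be seen in a small sketch (an illustrative Python fragment; the gain values are assumptions chosen only to show the behavior):

# Negative vs. positive feedback acting on the same deviation (toy model).
def iterate(x, gain, steps=6):
    """x is the deviation from the goal; each cycle feeds x back with a gain."""
    history = [round(x, 3)]
    for _ in range(steps):
        x = x + gain * x      # gain < 0: negative feedback; gain > 0: positive
        history.append(round(x, 3))
    return history

print(iterate(1.0, -0.5))     # [1, 0.5, 0.25, ...]: convergence, stability
print(iterate(1.0, +0.5))     # [1, 1.5, 2.25, ...]: divergence, "vicious spiral"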
Teleology
Deterministic systems are defined from initial conditions. Teleological systems are conditioned on the achievement of a goal or the attainment of a final result. These conditions (initial or final, respectively) influence the intermediate states.
Physical systems tend to be deterministic.
Biological systems tend to be teleological.
At the psychological level (according to Paul Watzlawick), Freudian psychoanalysis belongs to the deterministic school, while Jung's analytical psychology is largely based on the assumption of a teleological principle immanent in the human being.
When the goal is defined, there is "equifinality," i.e., different initial states lead to the same final state. The final state is also called "attractor".
When a system is in equilibrium, its objective is to maintain it, so the system returns to that state after an internal or external perturbation.
In complex cybernetic systems (such as organisms or organizations) the objectives are presented as a hierarchy. There is a primary or essential goal (such as survival) and subsidiary or secondary goals, where one goal at one level controls others at the next lower level. A simple example is that of a room regulated by a thermostat with an infrared sensor that detects the presence or absence of people, its objective being to save energy.
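A minimal sketch of that example follows (illustrative Python; the thresholds and function names are assumptions, not part of the original):

# Goal hierarchy: the energy-saving goal controls the comfort goal below it.
def thermostat_action(indoor_temp, occupied):
    if occupied:                          # primary goal at this level: comfort
        goal, band = 21.0, 0.5
        if indoor_temp < goal - band:
            return "heat"
        if indoor_temp > goal + band:
            return "cool"
        return "off"
    # unoccupied: the higher goal is saving energy; only guard against frost
    return "heat" if indoor_temp < 5.0 else "off"

print(thermostat_action(18.0, occupied=True))    # 'heat': comfort goal active
print(thermostat_action(18.0, occupied=False))   # 'off': energy-saving goal active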
Self-regulation
The state of a cybernetic system is defined by a number of essential parameters. To achieve its goal, a cybernetic system is self-regulating, i.e. it uses an internal control mechanism (also called regulator, controller, regulatory system or control system) to adapt to environmental conditions and perturbations by changing its essential parameters. Self-regulation implies autonomy. Nature is the most extraordinary cybernetic system, as it is always self-regulating.
There are three control mechanisms:
Buffering (damping).
Buffering is the passive absorption of disturbances. For example, in a room controlled by a thermostat, a wall acts as a buffer against outside temperature changes.
Feedback.
In feedback there is a regulator that handles disturbances after they affect the essential parameters of the system. It has the advantage of being easier to implement, but the disadvantage that it must allow a deviation to occur before taking action.
Feedforward.
In feedforward there is a controller that handles disturbances before they affect the essential parameters of the system. For example, a room controlled by a thermostat, with a sensor outside the room alerting the thermostat to variations in the outside temperature so that it can act before the disturbance affects the inside of the room. This control system has the advantage of anticipation, but the disadvantage that it is often difficult to implement in practice. However, it becomes necessary when disturbances occur so quickly that a feedback-type reaction would come too late.
The presence of control in a cybernetic system means that the system can be conceived of as two interacting units: the controller and the controlled. In a feedforward system, the controller receives the input signal from the environment and passes a control signal to the controlled system. In a feedback system, the controlled system receives the signal from the environment and passes the same signal to the controller, which returns a control signal to the controlled system.
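The difference in signal routing can be sketched as follows (illustrative Python with assumed toy dynamics; the leak and gain coefficients are inventions for the example):

# Feedback vs. feedforward against the same outside disturbance (toy model).
GOAL = 21.0

def feedback_step(indoor, outdoor):
    """The controller sees the deviation only AFTER the disturbance acts."""
    indoor += 0.1 * (outdoor - indoor)      # disturbance leaks in first
    indoor += 0.5 * (GOAL - indoor)         # then the correction is applied
    return indoor

def feedforward_step(indoor, outdoor):
    """An outside sensor lets the controller act BEFORE a deviation appears."""
    leak = 0.1 * (outdoor - indoor)         # disturbance about to enter
    compensation = -leak                    # computed from the outside sensor
    indoor += leak + compensation           # net deviation never appears
    return indoor

indoor_fb = indoor_ff = GOAL
for outdoor in (5.0, 5.0, 5.0):             # three cycles of cold weather
    indoor_fb = feedback_step(indoor_fb, outdoor)
    indoor_ff = feedforward_step(indoor_ff, outdoor)
print(round(indoor_fb, 2), round(indoor_ff, 2))  # 19.68 21.0: feedback must deviate first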
For Wiener, every being (biological, artificial or mechanical) can be defined by the nature of its information exchanges with the environment. The purpose of cybernetics is to understand informational behavior. Communication is conceived as an exchange of information with the environment, and the true nature of all beings is found in their communication relationships.
Homeostasis
To Walter Cannon we owe the concept of homeostasis (from the Greek "homoios", similar, and "stasis", standing still or stability), introduced well before Wiener's work. It is the characteristic of a system that regulates itself internally to maintain its stability (within certain limits) in the face of external (environmental) or internal perturbations. This concept is generic and applies not only to cybernetic systems but also to biology (living organisms), psychology, sociology, etc.
Cybernetic homeostasis. In 1948, William Ross Ashby built a device that he called, precisely, a "homeostat" or "homeostatic machine", which showed ultra-stable behavior against perturbations of its essential parameters. His ideas were set out in his work "Design for a Brain" (1952).
Biological homeostasis. Living things tend toward equilibrium. The components of a biological system are constantly changing in order to keep the outcome of the whole within certain limits. A disease is a state that requires an extra input of energy to regain equilibrium.
Social homeostasis. Societies tend to stability thanks to norms, customs, traditions and habits.
Psychological homeostasis. At the individual level, with our behavior, we seek a balance between our desires and needs and their attainment.
Entropy. The thermodynamic view of cybernetics
The most universal framework governing all physical systems is thermodynamics, and this science has two laws. The first law of thermodynamics is the principle of conservation of energy in an isolated system. The second law states that entropy always increases. The traditional conception of entropy is that it is a measure of the level of disorder in a system: the greater the disorder, the greater the entropy. The inverse concept of entropy is negentropy, a measure of order. Therefore, according to the second law, every system tends towards greater disorder and less order: entropy increases and negentropy decreases.
Entropy is a concept introduced by Rudolf Clausius in the 1850s and later formalized at the mathematical level by Ludwig Boltzmann, relating it to probability. The word "entropy" comes from Greek and means "transformation".
All systems, even cybernetic ones, are governed by the law of entropy: every system always "seeks" to increase its entropy, and moreover does so at the maximum possible speed.
There are different interpretations of the entropy law, depending on the system under consideration:
Thermodynamic systems evolve towards the most probable states, which are the most homogeneous.
Information systems evolve toward states with less information.
Cybernetic systems, including biological ones, are "fed" with information and produce less information.
The goal of a cybernetic system is an equilibrium state, which is a state of maximum probability and minimum information. When the system deviates from the equilibrium state, the deviation produced is information. The better and more efficient the control system (which brings the system back to equilibrium), the greater its entropy variation per unit of time.
It is often said that entropy (disorder) always increases in the physical world and that negentropy (order) always increases in biological systems. But this is not true: in both cases the equilibrium state is pursued, the most probable state, the one with the least information. Therefore, entropy should not be associated with disorder, but with maximum probability and minimum information. We can say that entropy is the opposite of information: the higher the entropy, the lower the information.
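The chain "most probable state = maximum entropy = minimum information" can be checked numerically with Shannon's formula H = -Σ p·log2(p) (an illustrative Python fragment; the comments read entropy in the sense used above, where a deviation from equilibrium is what carries information):

# Shannon entropy: the homogeneous (most probable) distribution maximizes H.
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits: equilibrium, minimum information
print(entropy([0.7, 0.1, 0.1, 0.1]))       # ~1.36 bits: a deviation carries information
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0 bits: a fully determined state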
Other characteristics of cybernetic systems
Dimension.
The dimension of a cybernetic system is the number of essential and independent parameters that characterize its internal state. They are its degrees of freedom.
State space.
It is the set of possible states of a system (determined by the values that each essential parameter can take).
Signals.
Signals are communications between the elements of a system (internal signals) or between the system and the environment (external signals). These signals can be of analog or digital type. The latter are information or messages. If all signals are continuous, then the system is called "continuous". If they are all discrete, the system is called "discrete". There can be mixed systems. An example of an analog signal is the intensity of light or sound. A digital signal is a command to turn a light on or off. When the increment between discrete values is very small, the signal can be considered continuous. With respect to time, a signal can be continuous or discrete.
Open and closed systems.
A closed system is a system that exchanges signals only internally, without communication with the outside. It usually has well-defined boundaries that prevent interaction with the environment.
An open system is one that communicates with the environment by means of: a) receivers (sensitive devices) that receive signals from the environment and transmit them to the system; b) effectors that transmit signals from the system to the environment. A human being is an open system, with sense organs as receptors and speech, facial expressions, body movement, etc., as effectors.
Level of complexity.
The complexity of a cybernetic system is determined by the number of relationships between its elements.
Memory.
A cybernetic system has memory when it can internally accumulate information in some form. A simple example of a system without memory is a buzzer: when the button is pressed, the buzzer sounds; the input signal becomes an output signal, and the system does not "remember" anything. There are two ways of memorizing information: 1) by a change of state; 2) by a change of the system's structure. There may be a mixed variant.
Types of change.
There are two types of change in the behavior of a system, depending on the way of "memorizing" information: 1) self-adjustment: the system changes the values of its essential parameters; 2) self-organization: the system changes its own structure.
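Schematically, the two types of change can be contrasted as follows (an illustrative Python sketch; the system, its parameter and its rule are hypothetical):

# Self-adjustment changes parameter values; self-organization changes structure.
class System:
    def __init__(self):
        self.gain = 0.5                      # essential parameter (state)
        self.rule = lambda x, g: g * x       # structure: the rule itself

    def self_adjust(self, new_gain):
        self.gain = new_gain                 # memorizing by a change of state

    def self_organize(self, new_rule):
        self.rule = new_rule                 # memorizing by a change of structure

    def step(self, x):
        return self.rule(x, self.gain)

s = System()
print(s.step(10))                            # 5.0: original behavior
s.self_adjust(0.8)
print(s.step(10))                            # 8.0: same structure, new parameter
s.self_organize(lambda x, g: g * x + 1)
print(s.step(10))                            # 9.0: new structure altogether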
Models.
A cybernetic model is an abstraction, a simplified representation of a system in which we select certain features and ignore others, depending on the purpose of the model. There may be different possible models that achieve the same objective. Therefore, we must distinguish between the cybernetic model and the cybernetic system. We can then draw a Korzybski-type analogy: the map is not the territory, and the model is not the system. It is often said that all models are false because the essence of the real world cannot be captured. "Essentially, all models are wrong, but some are useful" (George Box).
Second-order cybernetics
In 1972, the anthropologist Margaret Mead, president of the American Society for Cybernetics (ASC), devoted one of her speeches to the topic "Cybernetics of Cybernetics". Heinz von Foerster [1974] suggested calling this "second-order cybernetics": the cybernetics of observing systems, as distinguished from first-order cybernetics, the cybernetics of observed systems (Wiener's cybernetics).
Traditional (first-order) cybernetics considers that reality is objective and exists independently of the observer. Second-order cybernetics is of a reflexive type, it includes the observer, who can be considered a meta-system that observes, controls or interacts with the base cybernetic system. An example of this type is a social system.
Second-order cybernetics is primarily inspired by advances in modern physics. At the quantum level, the observer influences or affects the observed. And at the relativistic level, the observer perceives according to his reference system.
First-order cybernetic systems do not change their goals as long as they are not given new instructions to do so, from the outside. For example, a thermostat (the external command is to set a new temperature). Second-order cybernetic systems can dynamically modify their objective, autonomously.
Wittgenstein (philosopher and logician), Warren McCulloch (neurophysiologist), Gordon Pask (psychologist), as well as Humberto Maturana and Francisco Varela (biologists), also contributed to cementing this concept.
For von Foerster [1974]:
Of all the cybernetic principles, one occupies the central place: circularity. Circularity is an essential concept, not only for cybernetics, but for science in general, society, human relations, learning, therapeutic processes, the management of organizations, etc. It represents a fundamental epistemological contribution.
Circularity apparently violates the basic principle of scientific discourse that demands a separation between opposite or dual concepts: cause and effect, observer and observed, etc. In the case of the observer-observed pair, circularity breaks the principle of objectivity (the observer is something independent of the observed).
The traditional defense of objectivity is based on the fear of paradoxes arising.
An "ultimate science", descriptive and objective of the world, without considering the subjects, leads to contradictions. To eliminate these contradictions, one must take into account the observer, who participates in what is observed.
Observations are not absolute (objective), but relative to the observer (subjective). Without an observer there is no observation. The observer and the observed are inseparable.
In cybernetics it is impossible to access a level higher than the second, because "when reflection is reflected upon, the circuit of argumentation is closed and the organizational closure is produced, which can only transcend itself within itself". That is, when the observation mechanism is included, the cycle is completed and the observation mechanism can be applied recursively.
Third-order cybernetics
Despite von Foerster's above statement, some authors speak of a third-order cybernetics, with two characteristics:
There is mutual interrelation between the base level and the target level, thus forming an interactive circularity and a true higher unity. The observer and the system co-evolve. The observer must change his behavior in order to be able to recognize the observed system; if he does not change, he will cease to recognize it.
The system is aware of its environment and recognizes how it self-regulates to adapt to that environment.
In second-order cybernetics, there is causality from the goal level to the base level. With third-order cybernetics, the causal cycle is closed. This implies that the observer not only affects or influences the observed, but that the observed changes the observer.
An example of third-order cybernetics is an orchestra, in which each musician not only listens to his own sound but also to that of all his fellow musicians, and adapts his own sound according to what he hears from his fellow musicians.
The cybernetics of living beings: autopoiesis
Living beings can be considered cybernetic systems whose goal is stability and evolution, within an environment with which they interact.
Magoroh Maruyama [1963] (before the emergence of the concept of second-order cybernetics) claimed that living things exhibit a "second cybernetics". All living things depend for their survival on two processes: 1) morphostasis, the negative feedback that tends to stabilize them; 2) morphogenesis, the positive or amplifying feedback. These two processes balance each other.
The term "autopoiesis" is a neologism created in 1971 by Humberto Maturana and Francisco Varela (described in their work "The Tree of Knowledge"). It comes from the Greek: "auto" (self) and "poiesis" (creation or production).
Autopoiesis explains the organization of biological systems, the distinctive capacity of living beings to maintain and evolve their own internal organization, their structure, to self-produce, to self-regenerate by means of a circular, self-referential or re-entrant organization. An autopoietic system produces itself continuously using resources from the environment, in such a way that producer and product, doing and being, subject and object, are the same thing.
Living beings are self-referent autonomous beings. Not every autonomous entity is a living entity. Self-reference is a type of autonomy and is what characterizes living beings. Living systems are simultaneously autonomous systems and dependent on the environment.
The theory of autopoiesis, while relying on cybernetic theory, contributes two important concepts:
Structural coupling.
It refers to the capacity of a living being to evolve, to restructure, to constantly change its structure in a flexible and congruent way with the modifications of the environment. Its structural dynamics, its possible structural changes are predetermined. There is structural determinism: what happens to the living being depends on its structure. This circular structural coupling, of constant dialogue being-environment, occurs at multiple levels. "Autopoietic systems have neither inputs nor outputs. They can be perturbed by independent events and undergo internal structural changes that compensate for these perturbations" [Maturana & Varela, 1980].
Operational closure.
Living beings are closed systems from an operational or functional point of view. For life to be possible, it is necessary for the living being to be closed to the environment, in such a way that, in the face of the dynamics of the environment, its functionality, its identity, its autonomy, its totality remain unchanged. Operational closure is due precisely to its self-referential quality. The nervous system of the living being has operational closure. "The circularity of living and social systems is indeed Ariadne's thread that allows us to understand their capacity for autonomy" (Francisco Varela).
That is, living beings are structurally open and functionally closed.
Organization is the essence of life. It is the set of possible relationships of the autopoietic processes of an organism. It is its autopoietic "space", its space of freedom.
The structure is the accidental of life. It is the selection, at each moment, of organized wholes to sustain functionality.
When a perturbation occurs in the environment, a "structural coupling" takes place which, maintaining the organization, leads to structural changes that reestablish the homeostatic equilibrium.
There are three types or orders of autopoietic systems: 1) cells; 2) organisms (cellular aggregates), which have nervous systems; 3) aggregates of organisms (families, societies, colonies, beehives, etc.), whose main characteristic is not their components (organisms) but the relationships between them. According to Maturana and Varela [1980], the establishment of an autopoietic system is not a gradual process: a system is either autopoietic or it is not.
At a fundamental level, the goal of an autonomous or autopoietic system is survival, i.e., the maintenance of its essential organization. And there are subsidiary goals such as: maintaining its temperature, eating, etc. that contribute to its survival.
Artificial systems, such as a thermostat or an autopilot, are apparently autonomous, but they are not really autonomous because their primary purpose is implemented by their designers. These systems are said to be "allopoietic": their function is to produce something other than themselves.
Self-reproduction can be considered as a special case of autopoiesis, where the self-produced components are not used to regenerate the system, but to form a copy of the system.
The concept of autopoiesis has overflowed the boundaries of biology and is applied in other domains such as sociology, anthropology, psychotherapy, etc., having become a worldview, an important concept for investigating reality and for modeling many types of systems. For example, the sociologist Niklas Luhmann [1996] applied it to the study of societies in contexts of contingency and risk. Luhmann's aim was to create a super-theory applicable to all social phenomena, and his work is considered one of the most important theoretical productions in the field of sociology. Its highlights are:
Society is an autopoietic system, that is, it is autonomous and functions thanks to the production of its own components. It is autonomous, not only at the structural level, but also at the level of control of the organization of its structures.
Socialization is possible because a new order of reality emerges: a closed (autopoietic) network of communication. Communications take place within the system. Society is an operationally closed and self-referential network of communication.
Communication is the structural coupling and makes us maintain the social organization. Civilization is the consequence of communication.
Social systems are structurally determined systems.
Human beings are beings dependent on the emergent network of a higher type, which is society.
However, Varela rejected that autopoietic concepts could be applied to social systems, restricting it exclusively to biological systems.
The mind, a cybernetic system
According to Gregory Bateson [1993, 2002], the mind is a cybernetic system: an aggregate of interacting parts that exhibit a feedback structure. According to this author, where there is feedback there is mind. The complexity of systems with feedback (and, therefore, with mind) ranges from a simple thermostat to the human mind and beyond: the universal mind. The individual mind is only a subsystem of the universal mind. There is also a mind in the social system and in the planetary ecology.
Bateson speaks of an "ecology of mind," where there is a "connecting pattern" of mind and nature, inner and outer. This connecting pattern is a meta-pattern, a pattern of patterns that eliminates the dichotomy between mind and nature.
The Valuation of Cybernetics
Cybernetics, a universal science?
Many authors consider cybernetics to be a universal science, or an interdisciplinary or transdisciplinary one, since cybernetic systems are found in all areas of knowledge. Wiener himself stated in his seminal 1948 work that "the most fruitful areas for the development of the sciences are those which have been neglected as a no man's land between the various established fields."
Cybernetics is considered a crucial science related to a great multitude of fields, a science with fuzzy boundaries, closely related to others such as: Mathematics, Computer Science, Logic, Information Theory, Automata Theory, Control Theory, Formal Language Theory, Artificial Intelligence, General Systems Theory (GST), Operations Research, Complexity Theory, System Dynamics, Decision Theory, Process Optimization Theory, Pattern Recognition and Machine Learning Theory, Organization Theory, Artificial Neural Network Theory, and Chaos Theory. Many authors consider that some of these disciplines are actually branches of cybernetics. Of all these disciplines, General Systems Theory is the one most closely linked to cybernetics, to such an extent that many authors consider the two disciplines inseparable.
Cybernetics has had a great influence on the birth of some of these sciences, especially Computer Science, Artificial Intelligence, Control Theory, Automata Theory, Complexity Theory, System Dynamics, Robotics and Bionics.
Many central concepts of these sciences were first explored by cyberneticists, such as: complexity, self-organization, autonomy, self-replication, adaptation, etc.
Other disciplines that can be contemplated under the cybernetic paradigm are: Neuroscience, Biology, Psychology, Sociology, Cognitive Science, Anthropology, Game Theory, Philosophy and Organization Theory.
Wiener was a strong advocate of the application of cybernetics to the social sciences. He was convinced that the behavior of humans, animals and machines could be explained by applying the principles of cybernetics. He popularized the social implications of cybernetics by drawing analogies between automatic systems and human institutions in his second work, "The Human Use of Human Beings: Cybernetics and Society".
Other authors see cybernetics, not as a science, but as a set of general ideas that try to break the rigid boundaries of different disciplines. In fact, despite its historical pioneering role in many subjects, cybernetics was never established as an autonomous discipline. However, it is considered a sort of "mother science".
Because of its generality, cybernetics has been regarded as a kind of panacea, a privileged way of contemplating the world. It was even considered a motivation for communism: how to govern the world optimally. It has also been considered a new philosophy.
The emergence of cybernetics had a major impact. During the 1950s and 1960s many sciences rethought their own foundations in the light of cybernetic principles. Many books popularizing cybernetics depicted the word "cybernetics" in the center of a circle and the other disciplines around it, symbolizing the central role of cybernetics in the sciences. These books showed that, despite their apparent differences, everything (breathing, cooking, painting, education, etc.) could be explained by cybernetic principles, which were universal principles. Cybernetics seemed to be able to achieve the utopian project of relating all the dispersed knowledge in a universal, interdisciplinary or transdisciplinary science.
In the mid-1970s, the great wave of cybernetic publications experienced a sharp decline due to the great success of Computer Science, which did not fulfill the theoretical dream of the unity of knowledge but provided a practical universal tool for information processing: the digital computer, based on Turing's universal machine. Informatics substituted information for knowledge, a concept at a lower level of abstraction but more manageable and practical.
During the 1950s many cyberneticists joined General Systems Theory (GST), founded around the same time by Ludwig von Bertalanffy as an attempt to build a unified science by establishing (or discovering) general or universal principles governing all systems, natural and artificial. While cybernetics studies self-regulating, goal-oriented systems, GST studies systems of all kinds. Both sciences (cybernetics and GST) can be seen as attempts to create a transdisciplinary science based on the generic concept of system.
What can be stated about cybernetics is the following:
The principles of cybernetics are so general that the science has potentially universal application, and it has contributed greatly to improving our understanding of reality. Its principles can be applied to understand, model and design systems of any kind: animate (living beings) and inanimate (automatic systems), economic, psychological, physical and social systems, etc. Cybernetics can be applied to nature, since everything in it consists of self-regulating processes; even in the mineral kingdom processes of this type have been discovered.
These general concepts are of a theoretical type and are not conditioned by the form of physical implementation (electronics, mechanics, etc.) or of their material components. It is this characteristic that makes it possible to model very different systems. "The basic ideas of cybernetics can be approached without any reference to electronics and are fundamentally simple" (William Ross Ashby). In this theoretical sense, the analogy can be drawn: cybernetics is to systems as geometry is to material objects. Cybernetics is a general and abstract science of systems.
Cybernetics demonstrated that the barriers between different disciplines were surmountable because there were many analogies and similarities at the abstract level. Seemingly distant sciences such as mathematics, biology and electronics could be viewed together from the superior perspective of cybernetics.
Cybernetics is a simple science for dealing with the complex (at the structural and functional level), and its scope of study is all possible systems. It makes it possible to deal with old problems in a simpler way, since it has generic and powerful conceptual resources. The cybernetic paradigm offers new intuitions and new perspectives for investigating reality and modeling systems.
Philosophy of cybernetics
Apparently, cybernetics, by adopting a scientific and mechanistic approach in all domains (including social ones), could be interpreted as an attempt to eliminate philosophy, as happened with positivism from Comte onwards, which based knowledge on empiricism and scientific verificationism.
Heidegger in "The End of Philosophy and the Task of Thinking" (1964) attacks cybernetics as the consummation of the technical and superficial thinking of modernity, with the consequent oblivion of being. He calls for a deeper kind of thinking and the search for the being that is hidden behind all entities. And he ironizes: "While classical philosophy fades away, cybernetics becomes a philosophy for the twentieth century".
Wiener, who studied philosophy, insisted on the need for a cybernetic philosophy. And indeed, cybernetics raises a number of philosophical problems, including the following:
The principles.
Cybernetic principles are very general or universal. Philosophy is interested in the universal, in the fundamental principles of reality.
Epistemology.
Cybernetics provides an epistemology, a theory of knowledge, based on the substitution of the abstract for the "real". Philosophy aims to apprehend the true essence of reality and thus achieve the unification of knowledge.
The man-machine relation.
Cybernetics deals with autonomous systems, which make their own decisions and even appear to "think". Philosophy is speculative, so we may ask: Can machines think? Is man a machine, or one of the possible implementations of a cybernetic system? Can cybernetics help us understand ourselves better? Will machines come to dominate man? And so on.
Cybernetics and consciousness
Cybernetics is a remarkable approach to the subject of consciousness by bringing together a whole series of opposites:
Input and output. In feedback, output (or part of it) becomes input.
The system and its environment, the internal and the external, the active (the system) and the passive (the environment), the subjective and the objective, the linear and the circular, the observer and the observed, the being and the doing, the producer and the product.
Analysis and synthesis, reductionism and holism.
The initial state and the final objective-state. A cybernetic system seeks from its beginning the final state, the state of equilibrium, thus uniting the beginning and the end. The final state (the purpose) is associated with the future (the synthetic consciousness) and the initial state is associated with the past (the analytical consciousness).
The general and the particular. The cybernetic concepts are so general that they can be applied to many particular systems.
The animate and the inanimate. Cybernetic principles attempt to capture what is common to both realms.
The controller and the controlled.
The receiver and the effector (perception and action).
Positive and negative feedback in living things and in many systems seeking equilibrium.
Autonomy and dependence on the environment.
Conservation and evolution.
Theory and practice.
Freedom and determinism.
The simple and the complex.
The mind and the body.
The individual and the social.
Idealism and realism.
The temporal and the timeless.
The digital and the analogical.
It is a paradox, not conceivable with traditional dichotomous thinking (true/false, yes/no, etc.), but with a model of thought in which analytical and synthetic thinking, linear and circular, etc., are compatible. In short, with the unity of opposites. This dialectic of opposites is at the basis of consciousness and life. Circularity, although an essential concept, is only a particular case of the general concept of "union of opposites".
Gregory Bateson [1993] considered cybernetics a key discipline of the greatest importance: "I think that cybernetics is the biggest bite out of the fruit of the Tree of Knowledge that mankind has taken in the last 2000 years". But now we can better understand its importance because of its close relationship with the subject of consciousness.
The symbol that best represents the union of opposites is the Ouroboros, the mythical serpent that engulfs its own tail, thus forming a circle. The circle, having neither beginning nor end, is the symbol of God, of the soul, of consciousness, of the absolute, of eternity, of perfection, of indivisible totality, of unity, of the immutable, of the indestructible essence and of the permanence of all things. And the serpent symbolizes:
Consciousness, represented by the union of opposites to form a unity of a higher order.
Feedback, self-reference, endless identity.
The cyclical nature of things, which never disappear but change perpetually in an eternal cycle of creation and destruction. It is the eternal return and continuity of life, self-regeneration, renewal and evolution.
In some representations the serpent is depicted with a light and a dark half, at the same time referring to the dichotomy of cyclical processes (day-night, etc.).
The problematic of cybernetics
Cybernetics has been criticized and even questioned for different reasons:
For lacking perfectly clear and coherent principles. The generality and level of abstraction of its concepts are often ambiguous. Hence the variety of definitions mentioned above. But, according to von Foerster [1995], this is a positive aspect: "This is due to the richness of its conceptual basis, and I think this is a good thing, otherwise cybernetics would become a boring exercise".
Because of its fuzzy border with many disciplines.
For borrowing many ideas from mathematics, biology and electronics.
Because there is a certain confusion and indefiniteness of cybernetic levels.
Because many disciplines that cybernetics inspired (such as Computer Science and Artificial Intelligence) underwent a rapid evolution, declared themselves independent and forgot their origin, which has contributed to emptying cybernetics of content. Even Robotics is considered an independent science.
Because it has remained confined to specialists and has never become popular.
Because the prefix "cyb" or "cyber" applied to a hybrid between robot and living being (cyborgs) and to the Internet (cyberspace) has diluted its meaning. The term "cyberspace" was introduced by William Gibson in his science fiction novel "Neuromancer".
Finally, and most importantly, for lacking a formal language. Wiener himself stated that "It is the purpose of cybernetics to develop a language and techniques that will enable us to really attack the problem of control and communication in general." Bateson claimed that it was essential to create a new language to express the concepts of cybernetics.
The cybernetic language must make it possible to represent all knowledge: communication, control, feedback, information, etc.
Language is the unifying element of everything, not only of cybernetics, but of all formal sciences, including Computer Science and Artificial Intelligence.
But today there is no such standard language capable of modeling cybernetic systems. Mathematics, programming languages, simulation languages, etc. are often used.
Cybernetics vs. Artificial Intelligence
Cybernetics and Artificial Intelligence have been in dispute over the construction of intelligent systems. The cybernetics movement started first (1948, with Wiener), with its golden age being the 1950s. Artificial Intelligence (AI) was officially born in 1956 at the famous Dartmouth College Conference (Hanover, New Hampshire) and dominated mainly between 1960 and 1985.
AI focused mainly on the brain-hardware and mind-software analogies or metaphors, i.e., the mind/brain as a universal information processor: hardware based on digital electronics, and software based on the concept of information.
Cybernetics put more emphasis on the analogy between biological systems and autonomous machines, speculating on whether they would one day dominate man.
However, the boundary between cybernetics and AI is fuzzy. For example, when a system reacts appropriately to its environment, the illusion of intelligence emerges.
Moreover, because of the problems described above, many authors and researchers have proposed that AI should be the science devoted to the theory and development of systems that behave intelligently. In fact, cybernetics conferences have in practice become AI conferences. However, around the year 2000 there was a movement back to cybernetics, mainly for three reasons:
Because of the need for a general, integrating or unifying conceptual framework for a whole series of more or less dispersed disciplines.
Because of the repeated failures of AI, which has disappointed the great expectations placed in it.
Because a deep AI system requires interaction with its environment, so we need a "cybernetic AI".
Today, however, the goals of AI and cybernetics are increasingly close, to such an extent that they tend to be confused. Sometimes even the acronym AI (Artificial Intelligence) is interpreted as "Agent Intelligence".
Cybernetics and Computer
Today, computer science has become the most useful technical resource for researching and developing cybernetic models, both real and simulated. In fact, the evolution of cybernetics (such as Artificial Intelligence) is due, in large part, to computers and their simulation programs. The computer itself is a cybernetic system, and is therefore the object of research by cybernetics. It is paradoxical that a cybernetic system such as the computer is used to investigate cybernetic systems. But the computer is really a meta-machine, a machine capable of being configured as any particular machine.
Simulation programs (such as Simula, Simscript, Slen, Nedis, etc.) each use a different language, and a computer system also requires a language. Here we have a semantic gap between the two languages. In any case, there are translators that convert the description of a cybernetic system into computer code.
MENTAL, a Language for Cybernetics
Cybernetics has had a great impact precisely because of its close relationship with the subject of consciousness, because of the union of opposites. But it lacks a formal language capable of expressing its fundamental concepts.
MENTAL is a universal language based on consciousness, on the union of opposites, that can be applied to cybernetics because it makes it easy to express and implement feedback, communication with the environment and self-regulation, including the modification of a system's own structure. MENTAL is at a higher level of abstraction than cybernetics, so cybernetics is nothing more than an application of MENTAL. Moreover, MENTAL, because of its universality, can also be applied to Computer Science and Artificial Intelligence.
Generic expressions.
With generic expressions it is possible to express a network of relationships (of dependence and interdependence) between the variables (or expressions in general) of a system, to share elements of different systems, etc. The cybernetic system itself is represented by a generic expression, which is evaluated indefinitely, time after time: the philosophical and mythical "eternal return". When there are no variations in the parameters of the generic expression, it evaluates to itself and the cybernetic system reaches stability.
Generic expressions have the advantage that they do not require explicit self-feedback, since continuous repetition is part of their semantics.
Environment.
In MENTAL, the environment is a storage system, a communication system, a system where interactions take place. Everything is within the environment. System and environment are united, they constitute a unit.
Communication.
A system can communicate and interact with itself, with the environment or with other systems. Communication is something of a higher semantic level than classical linear messages. It is an exchange of meanings realized through the semantics of primitives. The content of communication is given by the types of expressions.
Memory.
With MENTAL, the two ways of accumulating information (changes of state and structure) are essentially the same.
Open or closed systems.
From MENTAL's point of view, a cybernetic system is composed of code (which forms its structure) and data (internal or external from the environment). A system can be open or closed with respect to code or data:
An open system with respect to data considers data from the environment (internal data is always considered); this is what typical cybernetic systems do. If the system is closed with respect to external data, it considers only internal data.
An open system with respect to code means that the system can change its own code to achieve its goal. If it is closed, the code does not change, its structure is fixed.
Simplicity.
MENTAL makes it possible to express complexity through the simplicity of its primitives. With MENTAL, developments are simpler, and complex systems with interactions and interrelationships of all kinds can be specified.
Unified cybernetics.
There is no first-order, second-order or third-order cybernetics. Cybernetics must be self-reflexive. There is only one cybernetics, which always allows the observer to be included, and the observer of the observer, and so on. There is no limit to the number of cybernetic levels.
The model of the observer is included in the model of the observed because they share the same principles.
The observer can be a human agent, another system or the system itself (observing itself), which can act and not just be a simple observer. It is a situation analogous to that of expression and meta-expression, language and metalanguage, programming and meta-programming, etc., features that are included in the language.
MENTAL makes it possible to develop applications that observe or monitor themselves and that consider different aspects of the systems. The generic expression acts as consciousness, as an observer or controller of the action.
The system is able to modify its own structure depending on the circumstances of the environment in order to achieve its goal. This implies the existence of meta-rules that adjust the operational rules, which implies a second feedback loop, this time at the structural level. There may even be meta-meta-rules, which would imply a third feedback loop, and so on. This approach corresponds to higher-order systems (meta-systems, meta-meta-systems, etc.).
Learning.
With MENTAL one can develop cybernetic systems that learn, which would imply the existence of meta-rules capable of updating the base level rules and generating new rules. There can even be learning from learning, etc. The greater the number of levels of meta-rules, the more abstraction and the more consciousness (at the mental level), because the union of opposites increases.
Degrees of freedom.
Many authors wonder about the limits or degrees of freedom available in the modeling of systems and, ultimately, about the ultimate limits of knowledge and of what is expressible at the transdisciplinary level, about science in general, and about the limitations of the formalization of mathematical and computational models.
Underlying this question is the search for a universal language that defines universal concepts and the limits of formal expressiveness. A language that provides us with a general framework that allows us to understand the world and integrate all the dispersed sciences.
In this sense, MENTAL defines the limits of the expressible (through the degrees of freedom that are its primitives) and of the possible systems.
The union Cybernetics - Artificial Intelligence.
The solution to the cybernetics - Artificial Intelligence dichotomy is to turn to the archetypes of consciousness, where the common and deep foundation of both disciplines resides.
The union Cybernetics - Computer Science.
With MENTAL the same language is used in cybernetics and computer science. There is no need to translate the model into a programming language. There is no semantic gap, because MENTAL is a high-level semantic abstraction language. The model of the cybernetic system is implemented directly in the computer as a simulation model or as a production model. MENTAL can be considered a universal modeling language.
The model is the reality.
A model of a cybernetic system is an abstraction. When this model is realized with resources of the highest level of abstraction, we are capturing its essence. The essence of a system is not found in its physical building blocks but in its abstract constructs. There is then a union between internal knowledge (mind) and external knowledge (nature), between epistemology (what we know) and ontology (the object of knowledge). Both kinds of knowledge are the same thing because they share the same archetypes of consciousness. The model thus becomes a real representation of the system: precise, ideal, perfect. At the deep level reality is abstract. Therefore, the model of reality is reality itself. We can then affirm that the map is the territory or, as Bateson says, there is union between mind and nature. The "pattern that connects" mind and nature can be identified as the archetypes of consciousness.
The MENTAL models are not false, they are true because they refer to the deep reality.
But as the mental world is wider and more flexible (it has more degrees of freedom) than the physical world, we can build models only at the mental level in which we only use the primary archetypes.
Universal meta-model.
Cybernetics can be seen as an attempt to build a universal meta-model to build concrete models. MENTAL is a universal meta-model, representing all possible models (cybernetic and non-cybernetic) that can be constructed. Abstraction produces unity and MENTAL is a unifying framework of the highest level of abstraction that transcends cybernetics.
Beyond "mind as a cybernetic system".
The mind has cybernetic aspects, but the qualification "cybernetic system" does not capture its essence. MENTAL goes further, for it is a model of the mind. Feedback is not a principle of the mind, but a property that emerges from the mind's degrees of freedom.
A simple example: Air conditioning system.
tu = ...   // temperature set by the user
te = ...   // outside (ambient) temperature
ti = ...   // indoor temperature
d  = ...   // maximum allowable deviation of the indoor temperature
The generic expression is evaluated indefinitely, and in it:
Action(Cold) is the action of invoking the energy system to lower the temperature.
Action(Heat) is the action of calling the energy system to raise the temperature.
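Since the generic expression itself is not reproduced here, the following sketch gives an equivalent reading of the control logic in Python (the endless loop stands in for the indefinitely re-evaluated generic expression; the function names are illustrative assumptions):

# Equivalent control logic; the loop plays the role of the generic expression.
def air_conditioning(tu, read_indoor_temp, action, d=0.5):
    """tu: user set-point; d: maximum allowable deviation of the indoor temp."""
    while True:
        ti = read_indoor_temp()              # current indoor temperature
        if ti > tu + d:
            action("Cold")                   # invoke the energy system to cool
        elif ti < tu - d:
            action("Heat")                   # invoke the energy system to heat
        # otherwise ti is within [tu - d, tu + d]: stability, nothing to do
        # te, the outside temperature, could feed a feedforward variant (see above)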
Addenda
Ashby's law of requisite variety
This law is perhaps the best-known cybernetic principle, because it is simple and intuitive: "The greater the variety of actions available in a control system, the greater the variety of disturbances it is able to compensate", or "The greater the internal variety of a control system, the greater its adaptation to external variety", or "Only an increase in internal variety can absorb external variety".
That is, a control system can control something only if it has sufficient internal variety to represent it. For example, if a control system has to choose between two alternatives, it must be able to represent at least those two possibilities in order to make a distinction or selection.
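A toy numeric illustration of the law (the names of disturbances and actions are assumptions made up for the example):

# Requisite variety: only variety in the controller absorbs variety outside.
disturbances = ["cold_snap", "heat_wave"]
fixes = {"cold_snap": "heat", "heat_wave": "cool"}   # action each one requires

def compensable(disturbances, actions):
    """A disturbance is absorbed only if some available action handles it."""
    return [d for d in disturbances if fixes[d] in actions]

print(compensable(disturbances, ["heat"]))           # variety 1: ['cold_snap'] only
print(compensable(disturbances, ["heat", "cool"]))   # variety 2: both absorbed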
Organizational Cybernetics
It is the application of the principles of cybernetics to organizations. Its main promoter is Stafford Beer [1981, 1995]. It mainly studies the role of the design of information systems and communication channels in achieving the objectives of organizations. Beer's Viable System Model (VSM) describes the necessary and sufficient conditions for an organization to be viable and to behave flexibly in the face of changing and complex environments, on the basis of its structure, activities, interrelationships and information flows. It also makes it possible to detect "pathologies" from the cybernetic point of view (structural, functional and those related to information and communication systems). A viable system must have five key functions if it is to operate effectively in its environment: operations, coordination, control, intelligence and policy.
Social cybernetics and systemic therapy
Social cybernetics is an example of second-order cybernetics. Global social changes bring about changes in individual consciousnesses, which in turn bring about social changes.
With social cybernetics, systemic therapies emerge, based on feedback and human communication in a common environment. The patient is considered in his or her primary social context, the family, which is a system of communication and interrelationships.
In general, social systems may have disturbances but tend to equilibrium, as there is a "social order" based on rules, norms and customs associated with their culture.
The traditional scheme of analysis of internal processes, of a psychic type, is thus abandoned and replaced by the analysis of the processes of communicational interaction: the so-called "systemic analysis".
Gregory Bateson was one of the first to see analogies between a family group and a cybernetic system. He brought about an epistemological turn, a new way of looking at reality in a more global and systemic way, opening the field towards the transpersonal and the trans-psychic.
Neocybernetics
Although the term "Neocybernetics" has been applied to the new cybernetics that emerged with Maturana and Varela's concept of autopoiesis, Heikki Hyötyniemi [2006] uses it to refer to a new theoretical framework for modeling and simulating complex systems, such as chaotic systems, natural ecosystems, economic systems and cognitive systems (which are also cybernetic systems). These types of systems have three essential characteristics: 1) they are not centralized; 2) they are not linear; 3) they produce "emergent" phenomena, i.e. global functionalities arise from simple low-level processes.
It is often claimed that complex systems defy all attempts at modeling, since they act in unexpected ways not predicted by models. Neocybernetics, however, is a new way of viewing and modeling complex cybernetic systems (a minimal numerical sketch follows this list):
It recognizes that dealing with complex systems requires concepts and tools at a high level of abstraction.
It seeks a universal model, the universal laws that govern cybernetic systems despite their great diversity. This model has to be simple, following the principle of Occam's razor: the simplest models are the most correct and adequate. Simplicity is interpreted mathematically as linearity.
It places the emphasis on the final equilibrium state (the emergent state), rather than on the processes leading to that state, processes that are nonlinear iterations. This final state is one of dynamic equilibrium, i.e., it is a state of equilibrium between external perturbations and the negative feedbacks that compensate for them. These systems that have a state of dynamic equilibrium (e.g., between order and chaos in the Mandelbrot fractal) are the most interesting.
It aims to formalize the somewhat diffuse concept of "emergence". Emergent (high-level) states are studied directly, rather than the low-level generating principles in detail. Self-regulation and self-organization emerge from local interactions.
Even in linear systems there can be complexity and emergent phenomena, since complex systems can be the result of sequential processes carried out by agents (the basic constructors) acting iteratively.
It is explicitly oriented towards the system's environment, attempting to capture its properties from the system's responses. This is justified because a system is a reflection of its environment.
The final (equilibrium) state can be characterized as the result of an optimization process, of maximum entropy. The image of the environment is an optimized model of the properties of the environment. Since there is no centralized control, the optimization strategies of nature are distributed over local agents.
Degrees of freedom, rather than constraints, are modeled. The more complex a system is, the greater its degrees of freedom.
The emergent states of dynamic equilibrium, which the system is targeting, are represented by deep patterns, where the coupling with the environment is strongest. In contrast, traditional complexity theory is oriented towards patterns as surface manifestations (such as the Mandelbrot fractal, Wolfram's seashell patterns, and Kohonen's maps) that do not reflect the essence of the system.
To capture the emerging patterns, the time axis is removed in order to represent the final state in dynamic equilibrium.
It is a balanced model of equilibria (a higher-order equilibrium): a map or spectrum of patterns of the system's relevant behaviors against its environment, a collection of invariants. These patterns are attractors. And it is a model based on local minima rather than the global optimum.
In the dynamic equilibrium state all nodes in the network are both causes and effects.
The "universal life" is a higher order dynamic equilibrium (dynamic equilibrium of dynamic equilibria). The goal of life is to find the degrees of freedom and exploit that variation through experiences and observations.
Fractals, self-referential systems
Fractals possess a cybernetic property: they are self-referential systems, since they refer to themselves by containing themselves. Fractals are also related to consciousness by uniting the opposites of "part" and "whole". The Mandelbrot fractal has the property that its points lie on the boundary between order and chaos: they are the points c of the complex plane for which the orbit of the recursive transformation remains bounded.
z → z² + c
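A minimal sketch of this bounded/unbounded distinction, in Python (the escape radius of 2 and the iteration budget are the usual conventions, not part of the original text):

# c belongs to the Mandelbrot set if the orbit of z = 0 under
# z -> z**2 + c remains bounded; conventionally, |z| > 2 is used
# as the escape test, with a fixed iteration budget.
def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2:
            return False   # orbit escapes: c is outside the set
    return True            # orbit stayed bounded: c is (probably) inside

print(in_mandelbrot(0 + 0j))    # True: the orbit stays at 0
print(in_mandelbrot(1 + 0j))    # False: 0, 1, 2, 5, 26, ... escapes
print(in_mandelbrot(-1 + 0j))   # True: 0, -1, 0, -1, ... oscillates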
The golden ratio (Φ) is a fractal expression because it is self-contained, self-referential. In MENTAL it is expressed as
(Φ = 1 + 1/Φ)
Its computation is directed towards an equilibrium point that is never reached exactly, because Φ is an irrational number; but from the descriptive point of view it is perfectly defined.
Another self-referential number is the square root of 2:
√2 = 1 + 1/(1 + √2)
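A minimal sketch of both self-referential computations as fixed-point iterations, in Python (the iteration count of 50 is arbitrary; each recurrence converges to its equilibrium point):

import math

# Golden ratio: iterate x -> 1 + 1/x; the fixed point satisfies
# x = 1 + 1/x, i.e. x = Phi.
x = 1.0
for _ in range(50):
    x = 1 + 1 / x
print(x, (1 + math.sqrt(5)) / 2)   # both ~ 1.6180339887

# Square root of 2: iterate y -> 1 + 1/(1 + y); the fixed point
# satisfies y = 1 + 1/(1 + y), i.e. y**2 = 2.
y = 1.0
for _ in range(50):
    y = 1 + 1 / (1 + y)
print(y, math.sqrt(2))             # both ~ 1.4142135624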
Bibliography
Bateson, Gregory. Una unidad sagrada. Pasos hacia una ecología de la mente. Gedisa, 1993.
Bateson, Gregory. Mind and Nature. A Necessary Unity. Hampton Press, 2002.
Couffignal, Louis. La Cybernétique. Presses Universitaires de France, Paris, 1963.
Guillaumaud, Jacques. Cibernética y Lógica Dialéctica. Artiach editorial, 1971.
Heylighen, Francis; Joslyn, Cliff. Cybernetics and Second-Order Cybernetics. Available online.
Hyötyniemi, Heikki. Neocybernetics in Biological Systems. Helsinki University of Technology, Control Engineering Laboratory, Report 151, August 2006. Available online.
Luhmann, Niklas. La ciencia de la sociedad. Anthropos / Universidad Iberoamericana / ITESO, 1996.
Maturana, Humberto; Varela, Francisco J. El árbol del conocimiento. Debate, 1996.
Maturana, Humberto; Varela, Francisco J. Autopoiesis and Cognition. The Realization of the Living. Reidel, 1980.
Maruyama, Magoroh. The Second Cybernetics: Deviation-Amplifying Mutual Causal Processes. American Scientist 51:2, pp. 164-179, 1963. Available online.
Ross Ashby, William. Introducción a la cibernética. Nueva Visión, Buenos Aires, 1960.
Ross Ashby, William. Proyecto para un cerebro. El origen del comportamiento adaptativo. Tecnos, 1965.
Ross Anderson, Alan (editor). Controversia sobre mentes y máquinas. Tusquets, 1984.
Sapárina, Yelena. El hombre, animal cibernético. Planeta, 1972.
Schrödinger, Erwin. ¿Qué es la vida? Tusquets, colección Metatemas, 2008.
Shannon, Claude E.; Weaver, Warren. The Mathematical Theory of Communication. University of Illinois Press, 1998.
Simon, Herbert Alexander. Las ciencias de lo artificial. Comares, 2006.
Singh, Jagjit. Teoría de la información, del lenguaje y la cibernética. Alianza Universidad, 1972.
Stafford Beer, Anthony. Brain of the Firm (The Managerial cybernetics of organization). John Wiley & Sons, 1981.
Stafford Beer, Anthony. Decision and Control. The Meaning of Operational Research and Management Cybernetics. John Wiley & Sons, 1995.
Varela, Francisco. Principles of Biological Autonomy. Elsevier, 1979.
von Foerster, Heinz. Cybernetics of Cybernetics. Or the Control and the Communication of Communication. Biological Computer Laboratory, 1974.
von Foerster, Heinz. Las semillas de la cibernética. Obras escogidas. Gedisa, 1991.
von Foerster, Heinz. Understanding Understanding. Essays on Cybernetics and Cognition. Springer, 2010.
von Foerster, Heinz. Ethics and second-order cybernetics. SEHR, vol. 4, issue 2: Constructions of the Mind, 1995. Available online.
Watzlawick, Paul. Teoría de la comunicación humana. Interacciones, patologías y paradojas. Herder, 2008.
Wiener, Norbert. Cibernética. Guadiana de Publicaciones, 1960. Spanish translation of "Cybernetics: Or Control and Communication in the Animal and the Machine".
Wiener, Norbert. The Human Use of Human Beings. Cybernetics and Society. Da Capo Press, 1988.
Wiener, Norbert. God and Golem, Inc. A Comment on Certain Points where Cybernetics Impinges on Religion. The MIT Press, 1966.
Wiener, Norbert; Schadé, J.P. Sobre modelos de los nervios, el cerebro y la memoria. Tecnos, 1969.