CYBERNETICS

"Cybernetics is the theory of command, control and communication in the animal and in the machine" (Norbert Wiener).

"The basic ideas of cybernetics can be treated without any reference to electronics and are fundamentally simple" (William Ross Ashby).

"The diversity of nature can be explained in the cybernetic framework" (Heikki Hyötyniemi).



The Cybernetic Theory

The origin of a new science

The word "cybernetics" comes from the Greek "kybernetes", meaning the art of steering a ship, as a pilot or helmsman does. From the Greek term comes the Latin "gubernator", with a similar meaning. Plato used the term in The Republic in the sense of the "art of governing", and also the "art of leading men" and, in general, the "art of guiding".

The word "cybernétique" was also used in 1834 by the physicist André-Marie Ampère to refer to the sciences of government (or art of governing in politics) in his famous "Essay on the Philosophy of the Sciences," where he established a system for classifying human knowledge. "The future science of government should be called 'la cybernétique'."

In 1938, mathematician Louis Couffignal published a pioneering article in the magazine Europe in which he speculated on the possibilities of machines capable of making their own decisions and even replacing man. He also drew analogies between the human nervous system and the internal structure of machines.

Cybernetics as a science began in 1948 with the publication of the mathematician Norbert Wiener's work "Cybernetics: or Control and Communications in the Animal and the Machine". This work became a best seller and initiated one of the most spectacular and influential scientific movements. Wiener is considered the father of cybernetics.

Wiener's cybernetics is an abstraction inspired by practical problems raised during World War II, specifically the need to develop automatic control mechanisms for military air defense equipment. Wiener constructed a general theory of organization and control systems, with the concept of feedback (associated with the achievement of a goal) as its most important foundation, a general concept also present in nature, in living beings and in social systems.


The definitions

There are many definitions of cybernetics. From them we can extract three fundamental concepts, which correspond to universal characteristics of many systems, including natural ones:
  1. Feedback, based on the interaction of a system with its environment.

  2. Teleology: the orientation of a system to the achievement of a certain goal.

  3. Self-regulation: the ability of a system to control itself autonomously in order to achieve its goal.

Feedback

When the environment of a system changes, the system changes to adapt to the new conditions of that environment, always goal-oriented.

In feedback, the output (or part of it) produced by a system becomes input again, forming what is called a "feedback loop". According to Norbert Wiener, "Feedback is a method of controlling a system by which the results of its past performance are reinserted into it." Feedback is considered "the secret" of nature.
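Wiener's definition can be sketched as a minimal loop in which past output is reinserted as input. This is a toy illustration; the names, the gain and the goal value are all invented for the example:

```python
def feedback_loop(goal, state, gain=0.5, steps=30):
    """Reinsert each output as the next input, correcting toward the goal."""
    for _ in range(steps):
        error = goal - state          # compare past performance with the goal
        state = state + gain * error  # reinsert the correction into the system
    return state

# Starting far from the goal, the loop settles onto it.
print(round(feedback_loop(goal=20.0, state=5.0), 3))  # → 20.0
```

Each pass through the loop halves the remaining error, so the state converges geometrically on the goal: the past result, fed back in, steers the next one.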

It is necessary to differentiate between a linear process and a process with a feedback loop. There are several types of feedback; the most prominent are negative feedback, positive feedback, and compensated feedback.
Teleology

Deterministic systems are defined from initial conditions. Teleological systems are conditioned on the achievement of a goal or the attainment of a final result. These conditions (initial or final, respectively) influence the intermediate states. When the goal is defined, there is "equifinality," i.e., different initial states lead to the same final state. The final state is also called "attractor".
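Equifinality can be illustrated with a toy iterated map (purely illustrative, not from the source): whatever the initial state, repeated application drives the system to the same final state, the attractor.

```python
def attractor(x, steps=60):
    """Iterate a contracting map; its fixed point (x = 0.5*x + 1, i.e. x = 2) is the attractor."""
    for _ in range(steps):
        x = 0.5 * x + 1.0
    return x

# Different initial states lead to the same final state (equifinality).
for x0 in (-10.0, 0.0, 7.5):
    print(round(attractor(x0), 6))  # all three print 2.0
```

The deviation from the attractor is halved at every step, so the "memory" of the initial condition vanishes: only the final condition matters, which is the teleological point of view.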

When a system is in equilibrium, its objective is to maintain it, so the system returns to that state after an internal or external perturbation.

In complex cybernetic systems (such as organisms or organizations) the objectives are presented as a hierarchy. There is a primary or essential goal (such as survival) and subsidiary or secondary goals, where one goal at one level controls others at the next lower level. A simple example is that of a room regulated by a thermostat with an infrared sensor that detects the presence or absence of people, its objective being to save energy.


Self-regulation

The state of a cybernetic system is defined by a number of essential parameters. To achieve its goal, a cybernetic system is self-regulating, i.e. it uses an internal control mechanism (also called regulator, controller, regulatory system or control system) to adapt to environmental conditions and perturbations by changing its essential parameters. Self-regulation implies autonomy. Nature is the most extraordinary cybernetic system, as it is always self-regulating.

There are 3 control mechanisms:
  1. Buffering (damping).
    Buffering is the passive absorption of disturbances. For example, in a room controlled by a thermostat, a wall acts as a buffer against outside temperature changes.

  2. Feedback.
    In feedback there is a regulator that handles disturbances after they affect the essential parameters of the system. It has the advantage of being easier to implement, but the disadvantage that it must allow a deviation to occur before taking action.

  3. Feedforward.
    In feedforward there is a controller that handles disturbances before they affect the essential parameters of the system. For example, in a room controlled by a thermostat, a sensor outside the room alerts the thermostat to variations in the outside temperature, so that it can act before they affect the inside of the room. This control system has the advantage of anticipation, but the disadvantage that it is often difficult to implement in practice. However, it becomes necessary when disturbances occur so quickly that a feedback-type reaction would come too late.
The presence of control in a cybernetic system means that the system can be conceived of as two interacting units: the controller and the controlled. In a feedforward system, the controller receives the input signal from the environment and passes a control signal to the controlled system. In a feedback system, the controlled system receives the signal from the environment and passes the same signal to the controller, which returns a control signal to the controlled system.
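The difference between the two control schemes can be sketched with a toy room model. Everything here is illustrative (the leak coefficient, the gains, the temperatures): the feedback controller reacts to deviations of the inside temperature, while the feedforward controller reacts to the measured outside disturbance before it shows up inside.

```python
def room_step(inside, outside, heat, leak=0.2):
    """Toy room model: heat input minus leakage toward the outside temperature."""
    return inside + heat - leak * (inside - outside)

def feedback_control(inside, setpoint, gain=0.5):
    """Acts only after a deviation of the essential parameter has appeared."""
    return gain * (setpoint - inside)

def feedforward_control(outside, setpoint, leak=0.2):
    """Acts on the measured disturbance before it affects the room."""
    return leak * (setpoint - outside)

setpoint, outside = 20.0, 5.0
inside_fb = inside_ff = 20.0
for _ in range(50):
    inside_fb = room_step(inside_fb, outside, feedback_control(inside_fb, setpoint))
    inside_ff = room_step(inside_ff, outside, feedforward_control(outside, setpoint))

print(round(inside_fb, 2))  # settles near 15.71: a residual deviation remains
print(round(inside_ff, 2))  # stays at 20.0: the disturbance is cancelled in advance
```

The purely proportional feedback controller settles with a residual deviation from the setpoint, illustrating that feedback must allow a deviation before acting, whereas feedforward holds the setpoint exactly because it compensates the measured disturbance before it reaches the essential parameter.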

For Wiener, every being (biological, artificial or mechanical) can be defined by the nature of its information exchanges with the environment. The purpose of cybernetics is to understand these information behaviors. Communication is conceived as an exchange of information with the environment; the true nature of all beings is found in their communication relationships.


Homeostasis

To Walter Cannon we owe the concept of homeostasis (from the Greek "homoios", similar, and "stasis", standing still). The concept dates from 1926, well before Wiener's work. It is the characteristic of a system that regulates itself internally to maintain its stability (within certain limits) in the face of external (environmental) or internal perturbations. The concept is generic and applies not only to cybernetic systems, but also to biology (living organisms), psychology, sociology, etc.
Entropy. The thermodynamic view of cybernetics

The most universal framework governing all physical systems is thermodynamics, and this science has two laws. The first law of thermodynamics is the principle of conservation of energy in an isolated system. The second law states that entropy always increases. The traditional conception of entropy is that it is a measure of the level of disorder in a system: the greater the disorder, the greater the entropy. The inverse concept is negentropy, a measure of order. Therefore, according to the second law, every system tends towards greater disorder and less order: entropy increases and negentropy decreases.

Entropy is a concept introduced by Rudolf Clausius in the 1850s and later formalized at the mathematical level by Ludwig Boltzmann, relating it to probability. The word "entropy" comes from Greek and means "transformation".

All systems, even cybernetic ones, are governed by the law of entropy: every system always tends to increase its entropy, and moreover does so at the maximum possible speed.

There are different interpretations of the entropy law, depending on the system under consideration. The goal of a cybernetic system is an equilibrium state, which is a state of maximum probability and minimum information. When the system deviates from equilibrium, the deviation produced is information. The better and more efficient the control system (which brings the system back to equilibrium), the greater its entropy variation per unit of time.

It is often said that entropy (disorder) always increases in the physical world and that negentropy (order) always increases in biological systems. But this is not true: in both cases the equilibrium state is pursued, the most probable state, the one with the least information. Therefore, entropy should not be associated with disorder, but with maximum probability and minimum information. We can say that entropy is the opposite of information: the higher the entropy, the lower the information.
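The association of entropy with probability rather than disorder can be checked with Shannon's formula, used here as a discrete, purely illustrative analogue: the uniform distribution (the maximum-probability, equilibrium state) maximizes entropy, and any deviation from it lowers the entropy, i.e. carries information.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i * log2(p_i), in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

equilibrium = [0.25, 0.25, 0.25, 0.25]  # uniform: the maximum-probability state
deviation   = [0.70, 0.10, 0.10, 0.10]  # a perturbation away from equilibrium

print(shannon_entropy(equilibrium))         # 2.0 bits, the maximum for 4 states
print(round(shannon_entropy(deviation), 3)) # ≈ 1.357 bits: less entropy, more information
```

The equilibrium distribution has maximum entropy and tells us nothing; the perturbed one has lower entropy, and that lowering is exactly the information produced by the deviation.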


Other characteristics of cybernetic systems
Second-order cybernetics

In 1972, anthropologist Margaret Mead, president of the American Society for Cybernetics (ASC), devoted one of her speeches to the topic "Cybernetics of Cybernetics." Heinz von Foerster [1974] suggested renaming this "second-order cybernetics". It refers to the cybernetics of observing systems, as distinguished from first-order cybernetics, the cybernetics of observed systems (Wiener's cybernetics).

Traditional (first-order) cybernetics considers that reality is objective and exists independently of the observer. Second-order cybernetics is of a reflexive type, it includes the observer, who can be considered a meta-system that observes, controls or interacts with the base cybernetic system. An example of this type is a social system.

Second-order cybernetics is primarily inspired by advances in modern physics. At the quantum level, the observer influences or affects the observed; at the relativistic level, the observer perceives according to his frame of reference.

First-order cybernetic systems do not change their goals as long as they are not given new instructions to do so, from the outside. For example, a thermostat (the external command is to set a new temperature). Second-order cybernetic systems can dynamically modify their objective, autonomously.
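The difference can be caricatured in code. This is entirely illustrative (the classes, the occupancy rule and the 5-degree setback are invented for the example): a first-order thermostat keeps a goal given from outside, while a second-order one revises its own goal autonomously.

```python
class FirstOrderThermostat:
    """First-order: the goal is fixed from outside; the system only pursues it."""
    def __init__(self, setpoint):
        self.setpoint = setpoint

    def act(self, temperature):
        return self.setpoint - temperature  # positive: heat; negative: cool

class SecondOrderThermostat(FirstOrderThermostat):
    """Second-order: the system modifies its own objective, autonomously."""
    def act(self, temperature, room_occupied=True):
        if not room_occupied:
            self.setpoint -= 5.0  # the system changes its own goal to save energy
        return super().act(temperature)

t = SecondOrderThermostat(21.0)
t.act(18.0, room_occupied=False)
print(t.setpoint)  # → 16.0: the goal itself has changed, with no external command
```

The first-order device would keep chasing 21.0 until someone reset it from outside; the second-order device has rewritten its own objective in response to what it observes.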

Wittgenstein (philosopher and logician), Warren McCulloch (neurophysiologist), Gordon Pask (psychologist), as well as Humberto Maturana and Francisco Varela (biologists), also contributed to cementing this concept.

For von Foerster [1974]:
Third-order cybernetics

Despite von Foerster's above statement, some authors speak of a third-order cybernetics, with two characteristics:
  1. There is mutual interrelation between the base level and the target level, thus forming an interactive circularity and a true higher unity. The observer and the system co-evolve. The observer must change his behavior in order to be able to recognize the observed system; if he does not change, he will cease to recognize it.

  2. The system is aware of its environment and recognizes how it self-regulates to adapt to that environment.
In second-order cybernetics, there is causality from the goal level to the base level. With third-order cybernetics, the causal cycle is closed. This implies that the observer not only affects or influences the observed, but that the observed changes the observer.

An example of third-order cybernetics is an orchestra, in which each musician listens not only to his own sound but also to that of all his fellow musicians, and adapts his own playing according to what he hears.


The cybernetics of living beings: autopoiesis

Living beings can be considered cybernetic systems whose goal is stability and evolution, within an environment with which they interact.

Magoroh Maruyama [1963] (before the emergence of the concept of second-order cybernetics) claimed that living things behave like a "second cybernetics". All living things depend for their survival on two processes: 1) morphostasis, the negative feedback that tends to stabilize them; 2) morphogenesis, the positive or amplifying feedback. These two processes balance each other.
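Maruyama's two processes can be sketched as the sign of a feedback term in a toy iteration (a model invented for illustration, not from the source): a negative gain damps deviations (morphostasis), a positive gain amplifies them (morphogenesis).

```python
def iterate(x, gain, goal=1.0, steps=20):
    """Feed the deviation from the goal back with the given gain.
    Negative gain damps the deviation; positive gain amplifies it."""
    for _ in range(steps):
        x = x + gain * (x - goal)
    return x

print(round(iterate(1.1, gain=-0.5), 6))  # morphostasis: the deviation shrinks back toward 1.0
print(round(iterate(1.1, gain=+0.5), 2))  # morphogenesis: the same small deviation blows up
```

With the negative gain the deviation is multiplied by 0.5 at every step and dies out; with the positive gain it is multiplied by 1.5 and grows explosively. A living system needs both, balanced: stability plus the capacity for change.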

The term "autopoiesis" is a neologism created in 1971 by Humberto Maturana and Francisco Varela (and later popularized in their book "The Tree of Knowledge"). It comes from the Greek "auto" (self) and "poiesis" (creation or production).

Autopoiesis explains the organization of biological systems, the distinctive capacity of living beings to maintain and evolve their own internal organization, their structure, to self-produce, to self-regenerate by means of a circular, self-referential or re-entrant organization. An autopoietic system produces itself continuously using resources from the environment, in such a way that producer and product, doing and being, subject and object, are the same thing.

Living beings are self-referent autonomous beings. Not every autonomous entity is a living entity. Self-reference is a type of autonomy and is what characterizes living beings. Living systems are simultaneously autonomous systems and dependent on the environment.

The theory of autopoiesis, while relying on cybernetic theory, contributes two important concepts:
  1. Structural coupling.
    It refers to the capacity of a living being to evolve, to restructure, to constantly change its structure in a flexible and congruent way with the modifications of the environment. Its structural dynamics, its possible structural changes are predetermined. There is structural determinism: what happens to the living being depends on its structure. This circular structural coupling, of constant dialogue being-environment, occurs at multiple levels. "Autopoietic systems have neither inputs nor outputs. They can be perturbed by independent events and undergo internal structural changes that compensate for these perturbations" [Maturana & Varela, 1980].

  2. Operational closure.
    Living beings are closed systems from an operational or functional point of view. For life to be possible, it is necessary for the living being to be closed to the environment, in such a way that, in the face of the dynamics of the environment, its functionality, its identity, its autonomy, its totality remain unchanged. Operational closure is due precisely to its self-referential quality. The nervous system of the living being has operational closure. "The circularity of living and social systems is indeed Ariadne's thread that allows us to understand their capacity for autonomy" (Francisco Varela).
That is, living beings are structurally open and functionally closed. There are 3 types or orders of autopoietic systems: 1) cells; 2) organisms (cellular aggregates), which have nervous systems; 3) aggregates of organisms (families, societies, colonies, beehives, etc.), whose main characteristic is not their components (organisms), but the relationships between them. According to Maturana and Varela [1980], the establishment of an autopoietic system is not a gradual process; a system is either autopoietic or it is not.

At a fundamental level, the goal of an autonomous or autopoietic system is survival, i.e., the maintenance of its essential organization. And there are subsidiary goals such as: maintaining its temperature, eating, etc. that contribute to its survival.

Artificial systems, such as a thermostat or an autopilot, are apparently autonomous, but they are not really autonomous, because their primary purpose is implemented by their designers. These systems are said to be "allopoietic": their function is to produce something other than themselves.

Self-reproduction can be considered as a special case of autopoiesis, where the self-produced components are not used to regenerate the system, but to form a copy of the system.

The concept of autopoiesis has overflowed the boundaries of biology and is applied in other domains such as sociology, anthropology, psychotherapy, etc., having become a worldview, an important concept for investigating reality and for modeling many types of systems. For example, the sociologist Niklas Luhmann [1996] has applied it to the study of societies in contexts of contingency and risk. Luhmann's aim is to create a super-theory applicable to all social phenomena, and his work is considered one of the most important theoretical contributions in the field of sociology. However, Varela rejected the application of autopoietic concepts to social systems, restricting them exclusively to biological systems.


The mind, a cybernetic system

According to Gregory Bateson [1993, 2002], the mind is a cybernetic system: an aggregate of interacting parts that exhibit a feedback structure. According to this author, where there is feedback there is mind. The complexity of systems with feedback (and, therefore, with mind) ranges from a simple thermostat to the human mind and beyond: the universal mind. The individual mind is only a subsystem of the universal mind. There is also a mind in the social system and in the planetary ecology.

Bateson speaks of an "ecology of mind," where there is a "connecting pattern" of mind and nature, inner and outer. This connecting pattern is a meta-pattern, a pattern of patterns that eliminates the dichotomy between mind and nature.


The Valuation of Cybernetics

Cybernetics, a universal science?

Many authors consider cybernetics to be a universal science or an interdisciplinary or transdisciplinary science, since cybernetic systems are found in all areas of knowledge. Wiener himself stated in his seminal 1948 work that "the most fruitful areas for the development of the sciences were those that had been forgotten as no man's land between various established fields." What can be stated about cybernetics is the following:
Philosophy of cybernetics

Apparently, cybernetics, by adopting a scientific and mechanistic approach in all domains (including social ones), could be interpreted as an attempt to eliminate philosophy, as happened with positivism from Comte onwards, which based knowledge on empiricism and scientific verificationism.

Heidegger, in "The End of Philosophy and the Task of Thinking" (1964), attacks cybernetics as the consummation of the technical and superficial thinking of modernity, with the consequent oblivion of being. He calls for a deeper kind of thinking and for the search for the being hidden behind all entities. And he remarks ironically: "While classical philosophy fades away, cybernetics becomes a philosophy for the twentieth century".

Wiener, who studied philosophy, insisted on the need for a cybernetic philosophy. And indeed, Cybernetics raises a number of philosophical problems, including the following:
  1. The principles.
    Cybernetic principles are very general or universal. Philosophy is interested in the universal, in the fundamental principles of reality.

  2. Epistemology.
    Cybernetics provides an epistemology, a theory of knowledge, based on the substitution of the abstract for the "real". Philosophy aims to apprehend the true essence of reality and thus achieve the unification of knowledge.

  3. The man-machine relation.
    Cybernetics deals with autonomous systems, which make their own decisions and even appear to "think". Philosophy is speculative, so we may ask: Can machines think? Is man a machine, or one of the possible implementations of a cybernetic system? Can cybernetics help us understand ourselves better? Will machines come to dominate man?

Cybernetics and consciousness

Cybernetics is a remarkable approach to the subject of consciousness because it brings together a whole series of opposites. It is a paradox, not conceivable with traditional dichotomous thinking (true/false, yes/no, etc.), but with a model of thought in which analytical and synthetic thinking, linear and circular, etc., are compatible; in short, with the unity of opposites. This dialectic of opposites is at the basis of consciousness and life. Circularity, although an essential concept, is only a particular case of the general concept of "union of opposites".

Gregory Bateson [1993] considered cybernetics a key discipline, very important: "I think that cybernetics is the greatest fruit of the tree of knowledge that mankind has borne in the last 2000 years". But now we can better understand its importance because of its close relationship with the subject of consciousness.

The symbol that best represents the union of opposites is the Ouroboros, the mythical serpent that engulfs its own tail, thus forming a circle. The circle, having neither beginning nor end, is the symbol of God, of the soul, of consciousness, of the absolute, of eternity, of perfection, of indivisible totality, of unity, of the immutable, of the indestructible essence and of the permanence of all things. And the serpent symbolizes:
Ouroboros

In some representations the serpent is depicted with a light and a dark half, at the same time referring to the dichotomy of cyclical processes (day-night, etc.).


The problematic of cybernetics

Cybernetics has been criticized and even questioned for different reasons:
Cybernetics vs. Artificial Intelligence

Cybernetics and Artificial Intelligence have been in dispute over the construction of intelligent systems. The cybernetics movement started first (1948, with Wiener), with its golden age in the 1950s. Artificial Intelligence (AI) was officially born in 1956 at the famous Dartmouth College conference (Hanover, New Hampshire) and dominated mainly between 1960 and 1985. However, the boundary between cybernetics and AI is fuzzy. For example, when a system reacts appropriately to its environment, the illusion of intelligence emerges.

Moreover, because of the problems described above, many authors and researchers have proposed that AI should be the science devoted to the theory and development of systems that behave intelligently. In fact, cybernetics conferences have in practice become AI conferences. However, around the year 2000, there was a movement back to cybernetics, mainly for 3 reasons:
  1. Because of the need for a general, integrating or unifying conceptual framework for a whole series of more or less dispersed disciplines.

  2. Because of the repeated failures of AI, which has disappointed the great expectations placed in it.

  3. Because a deep AI system requires interaction with its environment, so we need a "cybernetic AI".
Today, however, the goals of AI and cybernetics are increasingly close, to such an extent that they tend to be confused. Sometimes even the acronym AI (Artificial Intelligence) is interpreted as "Agent Intelligence".


Cybernetics and the Computer

Today, computer science has become the most useful technical resource for researching and developing cybernetic models, both real and simulated. In fact, the evolution of cybernetics (like that of Artificial Intelligence) is due in large part to computers and their simulation programs. The computer itself is a cybernetic system, and is therefore an object of research for cybernetics. It is paradoxical that a cybernetic system such as the computer is used to investigate cybernetic systems. But the computer is really a meta-machine: a machine capable of being configured as any particular machine.

Simulation programs (such as Simula, Simscript, Slen, Nedis, etc.) each use a different language, and a computer system also requires a language. Here we have a semantic gap between the two languages. In any case, there are translators that convert the description of a cybernetic system into computer code.


MENTAL, a Language for Cybernetics

Cybernetics has had a great impact precisely because of its close relationship with the subject of consciousness, because of the union of opposites. But it lacks a formal language capable of expressing its fundamental concepts.

MENTAL is a universal language based on consciousness, on the union of opposites, that can be applied to cybernetics because it makes it easy to express and implement feedback, communication with the environment and self-regulation, including the modification of the system's own structure. MENTAL is at a higher level of abstraction than cybernetics, so cybernetics is nothing more than an application of MENTAL. Moreover, due to its universality, MENTAL can also be applied to Computer Science and Artificial Intelligence.

Addenda

Ashby's law of requisite variety

This law is perhaps the best known cybernetic principle, because it is very simple and intuitive: "The greater the variety of actions available in a control system, the greater the variety of disturbances it is able to compensate" or "The greater the internal variety of a control system, the greater the adaptation to the external variety" or "Only an increase in internal variety can absorb external variety."

That is, a control system can control something only if it has sufficient internal variety to represent it. For example, if a control system has to choose between two alternatives, it must be able to represent at least those two possibilities in order to make a distinction or selection.
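Ashby's law can be illustrated by simple counting. The encoding below is invented for the example: disturbances and responses are integers, and a disturbance is absorbed only when the regulator has the exactly opposite response available.

```python
def absorbed(disturbances, responses):
    """Count how many disturbance types the regulator can neutralize.
    A disturbance d is absorbed if some response r satisfies r == -d."""
    return sum(1 for d in disturbances if -d in responses)

disturbances = {-2, -1, 0, 1, 2}      # external variety: 5 cases
low_variety  = {0, 1}                 # regulator with only 2 responses
full_variety = {-2, -1, 0, 1, 2}      # internal variety matching the external variety

print(absorbed(disturbances, low_variety))   # → 2: only 2 of the 5 disturbances absorbed
print(absorbed(disturbances, full_variety))  # → 5: all disturbances absorbed
```

The low-variety regulator can never compensate more disturbance types than it has responses: only an increase in internal variety can absorb the external variety.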


Organizational Cybernetics

It is the application of the principles of cybernetics to organizations. Its main promoter is Stafford Beer [1981, 1995]. It mainly studies the role of the design of information systems and communication channels in achieving the objectives of organizations. His Viable System Model (VSM) describes the necessary and sufficient conditions for an organization to be viable and to behave flexibly in the face of changing and complex environments, based on structure, activities, interrelationships and information flows. It also makes it possible to detect "pathologies" from the cybernetic point of view (structural, functional, and those related to information and communication systems). A viable system must have five key functions if it is to operate effectively in its environment: implementation (operations), coordination, control, intelligence and policy.


Social cybernetics and systemic therapy

Social cybernetics is an example of second-order cybernetics. Global social changes bring about changes in individual consciousnesses, which in turn bring about social changes.

With social cybernetics, systemic therapies emerge, based on feedback and human communication in a common environment. The patient is considered in his or her primary social context, the family, which is a system of communication and interrelationships.

In general, social systems may have disturbances but tend to equilibrium, as there is a "social order" based on rules, norms and customs associated with their culture.

The traditional scheme of analysis of internal processes, of a psychic type, is thus abandoned and replaced by the analysis of the processes of communicational interaction: the so-called "systemic analysis".

Gregory Bateson was one of the first to see analogies between a family group and a cybernetic system. He provoked an epistemological turn, a new way of looking at reality in a more global and systemic way, opening the field towards the trans-personal and trans-psychism.


Neocybernetics

Although the term "Neocybernetics" has been applied to the new cybernetics that emerged with Maturana and Varela's concept of autopoiesis, Heikki Hyötyniemi [2006] uses it to refer to a new theoretical framework for modeling and simulating complex systems, such as chaotic systems, natural ecosystems, economic systems, and cognitive systems (which are also cybernetic systems). These types of systems have 3 essential characteristics: 1) they are not centralized; 2) they are not linear; 3) they produce "emergent" phenomena, i.e. global functionalities appear from simple low-level processes.

It is often claimed that complex systems defy all attempts at modeling, since they act in unexpected ways, not predicted by models. However, Neocybernetics is a new way of viewing and modeling complex cybernetic systems:
Fractals, self-referential systems

Fractals possess a cybernetic property: they are self-referent systems, since they refer to themselves by containing themselves. Fractals are also related to consciousness by uniting the opposites of "part" and "whole". The Mandelbrot fractal has the property that its points lie on the boundary between order and chaos; they are the points that converge under the recursive transformation (in the complex plane). The golden ratio (Φ) is a fractal expression because it is self-contained, self-referential. In MENTAL it is expressed self-referentially; its computation is directed towards an equilibrium point that is never reached, because the number is irrational, although from the descriptive point of view it is perfectly defined.
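The self-reference of Φ is the fixed-point equation x = 1 + 1/x (which gives x² = x + 1). Iterating it, here sketched in ordinary code rather than in the MENTAL expression the text alludes to, approaches the equilibrium point without ever reaching it exactly:

```python
def phi_iteration(x=1.0, steps=40):
    """Iterate the self-referential definition x = 1 + 1/x of the golden ratio."""
    for _ in range(steps):
        x = 1.0 + 1.0 / x
    return x

print(phi_iteration())  # ≈ 1.6180339887..., never exactly Φ
```

Each iteration produces a ratio of consecutive Fibonacci numbers, converging on the attractor Φ = (1 + √5)/2: a self-referential description whose computation is an endless approach to equilibrium.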

Another self-referential number is the square root of 2:
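√2 satisfies the self-referential equation x = 1 + 1/(1 + x), since that equation implies x² = 2. Iterating it (again a sketch in ordinary code, not the original MENTAL expression) converges toward √2, another equilibrium never reached exactly because the number is irrational:

```python
def sqrt2_iteration(x=1.0, steps=40):
    """Iterate x = 1 + 1/(1 + x), whose fixed point satisfies x*x = 2."""
    for _ in range(steps):
        x = 1.0 + 1.0 / (1.0 + x)
    return x

print(sqrt2_iteration())  # ≈ 1.41421356...
```

This is the continued-fraction expansion of √2: the definition contains itself, and its computation, like that of Φ, is directed toward an equilibrium point it can only approximate.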
Bibliography