PRINCIPLE OF ECONOMY

"Nature always proceeds in the simplest or most economical way" (Aristotle).

"Ockham's razor is the supreme maxim of philosophy" (Bertrand Russell).

"Science can be defined as the art of systematic super-simplification" (Karl Popper).



The Principle of Ockham's Razor

Ockham's razor, also called the "principle of parsimony" and the "principle of economy of thought", is a thesis elaborated by William of Ockham, a 14th-century English Franciscan friar, philosopher, theologian, writer, politician and scholastic thinker. William of Ockham, together with John Duns Scotus (said to have been his teacher) and Thomas Aquinas, is among the most prominent philosophical figures of the late Middle Ages.

Ockham expressed this principle in several formulations, all of which reflect an underlying ideal of conceptual or intellectual simplicity. There are two interpretations of the origin of the term "razor":
  1. It comes from the fact that Ockham employed his principle sharply and precisely on numerous problems or subjects, "dissecting" them to simplify them.

  2. It comes from the fact that, metaphorically, Ockham "shaved Plato's beard" with his razor: by applying it he obtained a remarkable ontological simplicity, as opposed to the Platonic ontology (the theory of Ideas or Forms), which was very complex because it included all kinds of entities. Ockham thus eliminated many unnecessary entities, a form of rejection of Platonism.
From a young age, Ockham was passionate about logic, a science he considered fundamental to the advancement of knowledge and the understanding of reality. For Ockham, the simple is the logical, and logic must be used to simplify.


Ockham's thought

Although Ockham's thought is subject to many interpretations, it does appear that he applied logic and his famous principle to different subjects in an attempt to simplify and clarify them. To do this, he eliminated many entities or concepts, especially those of the scholastic philosophers, which he considered unnecessary. He also applied logic and his "razor" to separate different concepts in order to simplify them.

In separating concepts, he drew distinctions between notions that had previously been conflated. He also used his "razor" to eliminate (at his discretion) entities he judged unnecessary. Through this work, Ockham has had a great influence on science and philosophy.
Assessments of Ockham's razor by various authors

Ockham's razor has been accepted, implicitly or explicitly, by almost all philosophers and scientists. But it has also been qualified (so as not to push simplicity to extreme limits) and, in a very few cases, questioned. The principle itself goes back to Aristotle.
Ockham's razor and related concepts

Because of its general nature, the principle of Ockham's razor is related to several concepts that are also general in scope.
The Principle of Economy in Nature

Nature follows the principle of economy: it makes use of as few resources as possible and always uses the simplest laws.
The Principle of Economy in Science

The principle of conceptual economy is a basic principle of the scientific method. It is the foundation of so-called "methodological reductionism" (reductionism also has ontological and epistemological variants).

Science always advances in the direction of simplicity and unification of concepts. Science prefers the simplest explanation that is consistent with the experimental data available at any given time. But this simplest explanation may later be rejected when new data become available.

An illustrative example is that of planetary orbits. Copernicus postulated that the orbits were circular, the simplest hypothesis. Kepler, with more data, deduced that the orbits were elliptical, with the Sun at one of their foci, which led him to his famous three laws. Even so, Copernicus' theory was a good approximation to the truth.

The history of science offers many examples of the tendency toward, and the triumph of, simplicity and the principle of economy:


Physics
Mathematics

Mathematics has tried to base itself on a few general or universal axioms. The most prominent examples have been Hilbert's universal axiomatic program and the logicist (also axiomatic) program of Russell and Whitehead's Principia Mathematica. This aspiration was shown to be impossible when Gödel presented his famous incompleteness theorems for formal axiomatic systems.


Informatics

In the face of the increasing complexity of computer systems, there is the principle called "KISS" ("Keep It Simple, Stupid"), also interpreted as "Keep It Small and Simple".


Linguistics

In 1995, Chomsky presented his "Minimalist Program", a general conceptual framework for research in linguistic theory, which attempts to explain linguistic phenomena with as few conceptual resources as possible.


Biology

A simple evolutionary algorithm, natural selection, suffices to explain evolution, with no need to resort to supernatural explanations. It is the simplest principle capable of explaining complexity, as the sketch below illustrates.
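
A minimal sketch in Python (our illustration, not from the original text) of this idea: a mutation-plus-selection loop, the simplest mechanism that produces adaptation, here evolving a random string toward a target. The target string and alphabet are arbitrary choices for the demonstration.

    import random
    random.seed(1)  # fixed seed so the run is reproducible

    TARGET = "simplicity"
    ALPHABET = "abcdefghijklmnopqrstuvwxyz"

    def fitness(s):
        # Number of positions that already match the target.
        return sum(a == b for a, b in zip(s, TARGET))

    def mutate(s):
        # Randomly replace one character: blind variation.
        i = random.randrange(len(s))
        return s[:i] + random.choice(ALPHABET) + s[i + 1:]

    current = "".join(random.choice(ALPHABET) for _ in TARGET)
    steps = 0
    while fitness(current) < len(TARGET):
        child = mutate(current)
        if fitness(child) >= fitness(current):  # selection: keep the fitter variant
            current = child
        steps += 1
    print(current, steps)  # reaches "simplicity" after a modest number of steps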


Philosophy

The materialist philosophy (which holds that everything is matter) applies Ockham's razor, consciously or unconsciously. The opposite view also applies it by holding that everything is consciousness, and that matter is manifested consciousness.


The Philosophy of Simplicity

The two types of simplicity

A distinction must be made between conceptual simplicity (choosing a few intuitive primitive concepts) and definitional simplicity (reducing everything, by definition, to the fewest possible primitives). Ockham's razor is a horizontal principle: selection among several alternatives. Einstein's razor is a vertical principle: do not go beyond the conceptual level, or, as it is commonly paraphrased, "as simple as possible, but not simpler".

Five significant examples:
  1. Propositional logic.
    At the conceptual level we choose the concepts of "negation" and "conjunction" (or, alternatively, "negation" and "disjunction"). At the definitional level we can choose the single operation called "Peirce's arrow" or "Quine's dagger", NOR ("not or"): p↓q = ¬(p∨q) = ¬p∧¬q, from which all the other logical operations can be defined, but these definitions are complex and unintuitive. We can also choose Sheffer's stroke, NAND ("not and"): p|q = ¬(p∧q) = ¬p∨¬q (see the sketch after this list).

  2. Category theory.
    Category theory is based on a single generic concept, the morphism, which admits many particular interpretations (function, process, link, sequence, rule, etc.). As a consequence of resting on a single primitive, the theory becomes enormously complex.

  3. Principia Mathematica, by Russell and Whitehead.
    Russell and Whitehead opted for logicism in wanting to ground mathematics in logic. The result was a failure due to the resulting complexity. Moreover, Gödel showed in his famous 1931 paper ("On Formally Undecidable Propositions of Principia Mathematica and Related Systems") that it is not possible to ground mathematics from within mathematics itself, that is, with a formal axiomatic system. The grounding has to come from a higher level of abstraction.

  4. The Turing machine.
    The Turing machine (TM) is a theoretical machine based on non-generic, implementation-level concepts, which limits its expressiveness. The TM does not follow Einstein's razor principle because it goes beyond conceptual simplicity: programming algorithms (even the simplest ones) with TM primitives becomes very complex, laborious and impractical (see the sketch at the end of this section). Indeed, one way to detect that one has gone beyond conceptual simplicity is to see whether the subject becomes complicated.

    The TM has been the foundation and inspiration for the creation of computers, of practical computing, because of its simplicity of implementation. But the theoretical foundation of a science has to be simple and not limited a priori. Therefore, the TM cannot be the foundation of theoretical computer science.

  5. Church's lambda calculus.
    It reduces everything to functional expressions: functions that return functions as results. This leads to complexity when expressing concepts such as numbers, predicates, logical operations and arithmetic operations (see the sketch after this list). Moreover, functions cannot call themselves directly, so recursion requires an artificial formalism: the fixed-point Y combinator (Curry's combinator).
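
A hedged Python sketch (our illustration) of examples 1 and 5: all connectives are definable from NAND alone, and numbers can be encoded as pure functions (Church numerals), but in both cases the definitions are noticeably less intuitive than the concepts they reproduce.

    # Example 1: every logical operation from NAND alone.
    def nand(p, q):
        return not (p and q)

    def neg(p):                 # ¬p  =  p NAND p
        return nand(p, p)

    def conj(p, q):             # p ∧ q  =  ¬(p NAND q)
        return nand(nand(p, q), nand(p, q))

    def disj(p, q):             # p ∨ q  =  ¬p NAND ¬q
        return nand(nand(p, p), nand(q, q))

    assert neg(True) is False and conj(True, False) is False and disj(True, False) is True

    # Example 5: Church numerals; the number n is "apply f n times".
    zero = lambda f: lambda x: x
    succ = lambda n: lambda f: lambda x: f(n(f)(x))
    add = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
    mul = lambda m: lambda n: lambda f: m(n(f))

    def to_int(n):              # decode by counting applications of f
        return n(lambda k: k + 1)(0)

    two, three = succ(succ(zero)), succ(succ(succ(zero)))
    print(to_int(add(two)(three)), to_int(mul(two)(three)))  # 5 6
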
One must follow both Ockham's razor and Einstein's razor: apply supreme conceptual simplicity, without pushing further toward the definitional.

Definitional simplicity is only justified in the case of physical implementations, as with the TM. Turing himself applied it by using NAND gates to simulate neural networks, a topic described in his report "Intelligent Machinery", one of the pioneering papers in artificial intelligence, written in 1948 but published long after his death.
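
To make example 4 concrete, here is a minimal sketch (our own illustration, assuming the standard single-tape formalism) of a TM interpreter and a machine that merely adds 1 to a binary number; even this trivial task needs an explicit transition table and head bookkeeping.

    def run_tm(tape, transitions, state="inc", blank="_"):
        # Execute a single-tape Turing machine until it reaches "halt".
        cells = list(tape)
        head = len(cells) - 1  # start at the rightmost symbol
        while state != "halt":
            symbol = cells[head] if 0 <= head < len(cells) else blank
            write, move, state = transitions[(state, symbol)]
            if head < 0:
                cells.insert(0, write)
                head = 0
            elif head >= len(cells):
                cells.append(write)
            else:
                cells[head] = write
            head += {"L": -1, "R": 1, "N": 0}[move]
        return "".join(cells)

    # Increment: flip trailing 1s to 0; turn the first 0 (or blank) into 1.
    INC = {
        ("inc", "1"): ("0", "L", "inc"),
        ("inc", "0"): ("1", "N", "halt"),
        ("inc", "_"): ("1", "N", "halt"),
    }

    print(run_tm("1011", INC))  # "1100" (11 + 1 = 12)
    print(run_tm("111", INC))   # "1000" (7 + 1 = 8)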


Abstraction vs. simplicity

We perceive reality as fragmented, dispersed and complex. But complexity is only apparent, because behind it hides simplicity. What appears complex (at the superficial level) is really the expression or manifestation of the underlying simplicity (at the deep level). And what appears to be separate at a superficial level is actually unified at a deep level.

The essence of reality is that which is common to all, that which provides unity to diversity. This unity is permanent, it never changes, it is outside of space and time, it is the supreme stability. Everything has come out of unity and tends to return to it in search of equilibrium and stability. Unity is the state that grounds everything, the level from which all possibilities can manifest.

To simplify reality, the mechanism of abstraction is applied. Abstraction sets aside the incidental and focuses on fundamental, generic mechanisms.

Abstraction makes it possible to extract properties common to various domains. For example, the mathematical concept of a group is a powerful abstraction: the common property is the possibility of combining two elements of a set to obtain another element of that same set, as the sketch below illustrates.
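
A hedged Python sketch (our illustration; the class name and examples are our own): the same three-part interface, an operation, an identity and inverses, captures very different domains.

    class Group:
        """A set with a closed operation, an identity element, and inverses."""
        def __init__(self, op, identity, inverse):
            self.op = op              # combine two elements into a third
            self.identity = identity  # neutral element for op
            self.inverse = inverse    # maps each element to its inverse

    # Integers under addition.
    int_add = Group(lambda a, b: a + b, 0, lambda a: -a)

    # Nonzero residues {1, 2, 3, 4} under multiplication modulo 5
    # (the inverse is a^3 mod 5, since this group has order 4).
    mod5_mul = Group(lambda a, b: (a * b) % 5, 1, lambda a: pow(a, 3, 5))

    for g, x, y in [(int_add, 7, -7), (mod5_mul, 2, 3)]:
        assert g.op(x, g.inverse(x)) == g.identity
        assert g.op(g.identity, y) == y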

Abstraction allows us to contemplate things in a broader context, with greater awareness and greater simplicity. The essence of reality lies in supreme conceptual simplicity. Abstraction should go no further.


Simplicity in computing

Computers have changed everything. They have introduced a new paradigm, a new way of looking at reality. What computing has taught us is the power of simplicity:
  1. A small number of instructions makes it possible to produce, by combination, a potential infinity of programs. Computers were born from an abstraction: a theoretical machine, today called the "universal Turing machine", that allowed the concept of computation to be defined and formalized. The key ideas were two: 1) the simplicity of its computational model; 2) the concept of a program stored in memory, which made it possible to make flexible something that was not flexible in itself: the hardware.

  2. With only two values (0 and 1) we can represent all kinds of content: numerical data, texts, sounds, videos, graphics, etc. In this sense boundaries have been broken down, although a digital file needs an external interpretation: by itself it has no semantics, being just a sequence of zeros and ones. Semantics always sits at a higher level than syntax. At the lowest level, the semantics is simply that of an entity capable of adopting two states, which we symbolize as 0 and 1, though any other pair of symbols would serve (see the sketch after this list).
Thanks to this simplicity, computers have become a universal tool for science, since they allow us to model all kinds of natural phenomena and processes of calculation and reasoning.
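
A small Python sketch (our illustration) of the second point: the same two bytes carry no semantics of their own, and read as text or as a number depending on the external interpretation we apply.

    data = bytes([72, 105])                  # two bytes: 01001000 01101001
    print(data.decode("ascii"))              # interpreted as text: "Hi"
    print(int.from_bytes(data, "big"))       # interpreted as an integer: 18537
    print(bin(int.from_bytes(data, "big")))  # the raw bits themselves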


The paradoxes of simplicity

Paradoxically, the search for the simple is a very complex task, because the simple hides behind superficial manifestations that are apparently complex. Another paradoxical aspect of simplicity is that the criterion for deciding which theory is the simplest can be difficult to settle, because simplicity is subjective: there is no general definition of what is simple, nor of what is simpler. This is explained by the close relationship between simplicity and consciousness: consciousness cannot be explained, and neither can the simple; one must turn to intuition.

However, in some specific contexts or domains there does exist an objective criterion of simplicity. For example, in the area of computational information, the algorithmic complexity of a piece of information is defined as the length of the shortest program that produces that information: the shorter the program, the greater the simplicity. Solomonoff is considered the father of the concept of algorithmic complexity, a concept formalized by Andrei Kolmogorov and extended by Gregory Chaitin, all during the 1960s.
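
Algorithmic complexity itself is uncomputable, but a general-purpose compressor gives a practical upper bound on it. A hedged Python sketch (our illustration): a highly regular string compresses to a short description, while an irregular string of the same length does not.

    import random
    import zlib

    random.seed(0)
    regular = "ab" * 500                                   # has an obvious short description
    irregular = "".join(random.choice("ab") for _ in range(1000))

    # Compressed length approximates (bounds from above) algorithmic complexity.
    print(len(zlib.compress(regular.encode())))    # small: the string is "simple"
    print(len(zlib.compress(irregular.encode())))  # larger: little exploitable structure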

The final paradox is that the more we broaden the domain to be considered, the greater the simplicity. A "theory of everything" is simpler than a theory of a particular domain. Expressed in other words: with less you get more. In the limit, a "theory of everything" should be extraordinarily simple. And if a "theory of everything" also includes possible worlds, then supreme simplicity would be achieved.
