MENTAL, a Universal Open Standard

"Standards should focus on what we call 'the big ideas'" (Christopher T. Cross).

"Only useful standards will survive" (Stephen Fosket).



The Standards

The concept of standard

The term "standard" −in English, "standard"− is given to something that is considered the model, norm, pattern or reference to be followed in a certain domain. The properties associated with standards are:
The importance of standards

Standards allow the world to function consistently and more efficiently. If standards did not exist, everything would most likely be in chaos, a total mess. Standards encourage progress in society, science and industry.

Standards are everywhere and we benefit from them almost without realizing it. Prominent examples are:
  1. Road signs, which are the same all over the world.

  2. The international system of units. Heir to the old metric decimal system, it is the system of units used in almost all countries. It allows the exchange of scientific, cultural, commercial, etc. data. Previously, each country −sometimes even each region− had its own system of units.

    It includes 7 basic physical units: meter (length), kilogram (mass), second (time), ampere (electric current), kelvin (temperature), candela (luminous intensity) and mole (amount of substance). The basic units have decimal multiples and submultiples, expressed by prefixes (such as deca, hecto, kilo, milli, giga and tera). Previously, units were divided into 3, 12, 16, ... parts, which made arithmetic operations difficult. Lavoisier went so far as to state that "Nothing greater or more sublime has come from the hands of man than the decimal metric system".

  3. Positional numeric coding, a standard often considered among the greatest achievements of civilization.

Types of standards

Several types of standards can be distinguished, among them de facto and de jure standards, and open and closed standards; these are discussed below.
The Standards in Math

There are no official standards in mathematics, because the discipline is considered an open language in which any abstract concept can be formally expressed as long as its meaning is explained. Nevertheless, some de facto conventions exist, such as the standard ZFC axiomatization of set theory discussed below.
The decimal system of numerical representation

The oldest known positional system in history is the sexagesimal (base 60) or Babylonian system (circa 1800 BC). The current system of numerical representation based on powers of 10 is considered one of the greatest advances of mankind throughout its history.

Behind this system is the principle of conceptual reflection, that is, a concept that applies to itself (the concept of a concept), which is an essential mechanism of consciousness. Starting from the unit (concept of order 0), we have 10 units (concept of order 1), 10 units of 10 units (concept of order 2), and so on. Every time we refer to a number, we are tuning in (consciously or unconsciously) to the universal mechanism of conceptual reflexivity.
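As a worked illustration of this order-by-order construction, any base-10 numeral is simply a sum of digit-weighted powers of 10, one term per level of the reflexive hierarchy (units, tens, tens of tens, ...):

$$4507 = 4 \cdot 10^3 + 5 \cdot 10^2 + 0 \cdot 10^1 + 7 \cdot 10^0$$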

The positional principle gave rise to the concept of "zero", without which there would be neither negative numbers nor imaginary numbers. Zero had, however, already been discovered by the Mayan civilization some 600 years before its appearance in India; the Maya represented it with a shell.

The base-10 positional numbering system was invented in India around 450 AD. Because it was spread by the Arabs, it is called the "Hindu-Arabic system". It was introduced in Europe around the year 1000 AD, although resistance delayed its widespread use for several centuries. The resistance came from the accountants, who did not want its adoption to upset the clergy; the clergy considered the system "diabolical" because of its Arabic origin.

The Italian Fibonacci (Leonardo of Pisa) was the main popularizer of the positional system. He spent a long period in Bugia (present-day Béjaïa, in Algeria), where his father was a customs clerk. There he learned the numbering system and became aware of its great advantages, mainly for simplifying calculations. Back in Italy, he published the treatise "Liber Abaci" (Book of the Abacus) in 1202.

A curious fact is that Fibonacci kept the order of the digits as in Arabic script, that is, written from right to left. It is also curious that Fibonacci gave the title "Book of the Abacus" to a book that sought to put an end to that instrument of manual calculation. He did so to disguise its contents, because he knew that the powerful abacists of the time would oppose it. And indeed, when its contents became known, it caused uneasiness among them, because they saw that their trade was in danger.

European Christian and Catholic authorities were so attached to their archaic systems, and the professional calculators (mostly clerics) so resistant to novelties, that centuries passed before the system was implemented and disseminated. The Church did nothing to favor the democratization of calculation, because this implied the loss of its monopoly on teaching and the consequent diminution of its power and influence.

Dispute between an abacist and an algorist (Margarita Philosophica, 1508).

Fibonacci's book did not spread fully in Europe until the end of the 16th century, for three reasons: 1) the book was not "official", as it was conceived outside academic circles; 2) it encountered strong opposition because of the right-to-left direction in which the digits were written, until this was finally accepted; 3) the opposition of the professional calculators.

The new system replaced the Roman system of representation then in use in Europe. The Roman system did not allow for complex operations: it was easy to add and subtract, but multiplication and division required the services of professional abacists. The Roman numerals (I, V, X, L, C, D and M) had absolute values, each symbol being worth the same regardless of its position, and there was no zero.
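The contrast is easy to see in how the same number is read in each system: Roman symbol values are simply accumulated (with the subtractive rule), while positional digits are weighted by powers of 10:

$$\mathrm{CLXIV} = 100 + 50 + 10 + (5 - 1) = 164 \qquad\text{versus}\qquad 164 = 1 \cdot 10^2 + 6 \cdot 10^1 + 4 \cdot 10^0$$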

The invention and democratization of positional numeration has had incalculable consequences in human society, facilitating the development of science and technology, and has allowed the mechanization of arithmetic and mathematical calculation in general.

There has been no "Tower of Babel" of numbers. While there are more than 4,000 languages and several dozen alphabets and writing systems, there is currently only a single system of numerical representation. It is a kind of numerical Esperanto.


Non-standard set theories

There have been alternative theories to the standard ZFC axiomatization of set theory.
The Standards in Computer Science

In computing, a standard is a set of agreed rules that describe the requirements a product or service must meet, with the objective of establishing a common base mechanism so that the different elements (hardware or software) that use it are compatible with one another.

The problem of standards in communicating information is as old as civilization itself. Because the Internet has become a digital platform that millions of people access and interact with, Internet standards are of decisive importance. The web has become the global space and the global operating system, and standardization efforts have turned to it for that reason. Since 1996, more than 100 web standards have been created.

A developing technology with a great future is cloud computing, a new paradigm that makes it possible to offer computing services over the Internet (the cloud is a metaphor for the Internet). But two problems are limiting its adoption: lack of interoperability and security risks. "Now you can't exchange things between clouds and that has to change" (Vinton Cerf). Cerf is considered one of the "fathers" of the Internet and co-inventor of the TCP/IP protocol (Transmission Control Protocol/Internet Protocol).

Today, the goal is to raise the semantic level: to move from information to knowledge, from syntax (pure form) to semantics (meaning). For years the so-called "Semantic Web" has been pursued, with semantic relations, semantic searches and the use of ontologies (sets of concepts and the relations between them), with the aim of achieving semantic interoperability.

A standard codification for the representation of all types of data, information and knowledge, as well as their meta-levels, has so far not been achieved; the most basic gap is the lack of a standard operating system. Achieving a universal codification here, as was done with numbers, would represent an enormous conceptual and evolutionary leap for mankind.


Interface

In general, an interface is a formal system that facilitates and simplifies communication between a user and a system, or between two systems (or between two components of a system).

In computer science, an API (Application Programming Interface) is a set or library of functions (in functional programming), procedures (in imperative programming) or methods (in object-oriented programming) that constitutes an abstraction layer to be used by other software for application development. Almost all programs rely on the APIs of the underlying operating system to perform basic functions, such as accessing the file management system.

APIs have both advantages and disadvantages.
Open system

The term "open system" emerged in the late 1970s and early 1980s to refer primarily to Unix systems, in contrast to the mainframe computers of the time. Unix systems included standard interfaces for programming and for connection to peripherals, which facilitated hardware and software development by third parties.

The definition of "open system" became more formalized in the 1990s with the rise of the "Single Unix Specification" (SUS) software standards, which set out the requirements for a system to qualify as a Unix system. The SUS are developed and maintained by the Austin Group, based on the earlier work of the IEEE and The Open Group (formerly, X/Open).


De facto standards

Many widely used standards are de facto standards. Some of them have gone on to become official (de jure) standards, while others have not.
Objective of the standards

Standards pursue −or should pursue− several objectives. Generic objectives include simplicity, clarity, efficiency, quality, safety, economy, ease of implementation and the facilitation of creativity.
The problems of standards

Standards also suffer from problems, among them the proliferation of competing standards (the "sea of acronyms" mentioned below) and the lack of a common semantic foundation.
Open and closed standards

An open standard −also called an "open format"− is a publicly available specification that describes a type or category of elements and is sponsored and maintained by an open standards organization. Anyone can obtain and implement the standard, without legal or economic restrictions. Open systems link manufacturers, vendors, developers and users.

It is common for new standards to be built on top of existing open ones. This is the case with the most common Internet standards, where the XML standard is the foundation of others (RDF, XHTML, ODF, OOXML, etc.).

De jure standards are usually open, i.e. their specification has been made public. De facto standards, on the other hand, can be open or closed.

A closed or non-open standard is private or proprietary in nature. In closed standards the specifications are not publicly known and the owners may impose fees or restrictions on usage.

Today, the Web has become the universal open standard. "The decision to make the Web an open system was necessary to make it universal. You can't propose something to be a universal space and at the same time have it under control" (Tim Berners-Lee).


Free software

GNU is a complete operating system free of restrictions on the use of its source code. Its name is a recursive acronym: "GNU's Not Unix". The GNU project was started by Richard Stallman in 1983; soon after, he coined the term "free software" and created the "Free Software Foundation" (FSF) to promote it. "The only way to be free is to reject proprietary programs" (Richard Stallman).

In 1989 the first version of the GNU General Public License (GPL) was published, aimed at protecting the free distribution, modification and use of software.

Free software is software that, once obtained, can be used, copied, modified and distributed without restriction. Its definition is maintained by the FSF. Free software is generally available free of charge or at the cost of distribution.

Stallman also introduced the concept of "copyleft" to grant users freedom (of use, modification and distribution) and to restrict the possibilities of software appropriation. The copyright holder of a work under a copyleft license can also make a modified version under the original copyright and sell it under any license he wishes, while distributing the original version as free software.

Symbols for copyleft and copyright.

Under the terms of version 2 of the GNU General Public License, Linus Torvalds −then a 21-year-old computer science student− developed the Linux kernel in C. GNU/Linux is the combination of the GNU operating system and the Linux kernel, developed with the help of programmers from around the world; it was the first complete free software operating system.

Since "free" means free or free of charge, "free software" should not be confused with free of charge software (which usually includes the source code) or public domain software (which does not require any license).

Apart from Linux, other examples of free software are: OpenOffice.org (derived from StarOffice), X (the windowing system) and OpenJDK (the free implementation of the Java Development Kit).


Open Source

The "Open Source Initiative" (OSI) was born in 1998 to spread the principles of open source, by the hand of some users of the Free Software community, to refer to the fact that the source code of the programs can be modified with previous authorization. But the principles between Free Software and Open Source are equivalent and recognize the same types of licenses.

The main difference between Open Source and Free Software is that the former focuses solely on the technical aspects and continuous improvement of the software, while the latter also takes into account the ethical and philosophical aspects of freedom.

Many well-known programs are distributed as open source.
MENTAL, Universal Standard

The solution approach

Given the diversity of standards, with their corresponding sea of acronyms, it is legitimate to wonder about the possibility of a universal standard, in the same way that the issue of universal language arises with respect to the diversity of particular languages.

Defining a universal standard or "mother" of all standards is of paramount importance. The importance of having a universal standard can be compared to the importance of positional notation in numbers. In a report submitted to the World Bank on September 9, 2005, 13 countries urged nations to adopt open information technology standards as a vital step to accelerate economic growth, efficiency and innovation. Innovation in information technology occurs at a much faster rate than in any other field. Standards accelerate innovation.

Until now it has not been possible to define a universal standard, because it was not clear what the fundamentals of computing were. MENTAL clarifies these fundamentals, which are essentially simple: the archetypes of consciousness, which constitute a mental (or conceptual) operating system and the foundation of any physical operating system.

There must be two types of standard:
  1. One deep standard: semantically based, fixed, unquestionable and universal. It must be simple and structured as a language that accommodates opposites: generic and specific, operational and descriptive, etc.

  2. Several superficial standards: formal in nature, generic, and based on the deep standard. They can take many possible forms; they are the equivalents of, for example, XML Schema or RDF Schema. All these surface standards are connected through the deep standard, of which they are manifestations.

The only standard currently considered universal is XML, as many other standards, mainly Web standards, are based on it. But XML is not a language per se. It is a syntactic system for representing hierarchical information structures using tags with values, and it has many expressive limitations.

XML is currently being applied to the specification of all kinds of information structures. This is the wrong approach, since trying to use XML "for everything" leads to an inconsistency similar to claiming in OOP (Object-Oriented Programming) that "everything is an object". It must be borne in mind that semantics comes first and then syntax, which must be as simple, readable and appropriate as possible and evoke the associated semantics.


MENTAL, the answer to the standards problem

MENTAL has the features required of the deep standard described above. In short, the adoption of MENTAL as a standard means a return to simplicity, to radical simplicity, to common sense. Its impact would be enormous because it is present in all domains, which would accelerate the progress of science, technology and society in general.


XML and MENTAL

XML is a standard that has achieved enormous success because of its simplicity. It has become a universal language (or metalanguage) on which other standards are based. XML is a standard for describing hierarchical information. However, it is an enormously limited language that is only good for what it was designed for: tagging information between a start tag and an end tag. For example, a table:
<table>
  <tr><td>row 1</td></tr>
  <tr><td>row 2</td></tr>
</table>
In MENTAL, we could code it by replacing the general XML form
<tag>content</tag>
with
content/tag
where tag is the tag name. For example:

(This is a paragraph)/paragraph
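Applying the same convention recursively, the XML table above could be written as follows (a sketch, assuming that nested tags simply nest the content/tag form):

( ((row 1)/td)/tr
  ((row 2)/td)/tr )/table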

We can keep, if we want, the XML syntax by means of the generic expression

⟨( <tag>content</tag> =: content/tag )⟩
MENTAL is a metalanguage because you can choose the tags, as in XML.


