"Standards should focus on what we call 'the big ideas'" (Christopher T. Cross).
"Only useful standards will survive" (Stephen Foskett).
The Standards
The concept of standard
The term "standard" is applied to something that is considered the model, norm, pattern or reference to be followed in a certain domain. The properties associated with standards are:
Uniqueness.
The standard is supposed to be unique, or to tend towards uniqueness, although there may be periods in which several standards coexist. It is precisely uniqueness that confers its character as a standard: everything that is not standard is a deviation from the established norm.
Compatibility.
Different systems following the same standard are compatible with each other.
Clarity.
A standard is supposed to clarify the domain by being the reference. Usually a domain-specific language is used, which implies syntax and semantics.
Value.
A standard is all the more important and valuable the greater its domain of application.
The importance of standards
Standards allow the world to function consistently and more efficiently. If standards did not exist, everything would most likely be in chaos, a total mess. Standards encourage progress in society, science and industry.
Standards are everywhere, and we benefit from them almost without realizing it. Prominent examples are:
Road signs, which are the same all over the world.
The International System of Units (SI). Heir to the old decimal metric system, it is the system of units used in almost all countries, and it allows the exchange of scientific, cultural and commercial data. Previously, each country −sometimes even each region− had its own system of units.
It includes 7 base physical units: meter (length), kilogram (mass), second (time), ampere (electric current), kelvin (temperature), candela (luminous intensity) and mole (amount of substance). Multiples and submultiples of the units in powers of 10 are expressed by prefixes (such as deca, hecto, kilo, milli, mega, giga, tera, etc.). Previously, units were divided into 3, 12, 16, ... parts, which made arithmetic operations difficult. Lavoisier went so far as to state that "nothing greater or more sublime has come from the hands of man than the decimal metric system".
Positional numeric coding, a standard considered one of the greatest achievements of civilization.
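Because the SI prefixes are simply powers of 10, converting between prefixed forms of the same unit reduces to a single power-of-ten shift. A minimal sketch in Python (prefix table abbreviated, names invented for illustration):

```python
# Abbreviated table of SI prefixes as base-10 exponents.
PREFIX_EXP = {"tera": 12, "giga": 9, "mega": 6, "kilo": 3, "hecto": 2,
              "deca": 1, "": 0, "deci": -1, "centi": -2, "milli": -3,
              "micro": -6}

def convert(value, from_prefix, to_prefix):
    """Re-express a quantity under another SI prefix of the same unit."""
    return value * 10 ** (PREFIX_EXP[from_prefix] - PREFIX_EXP[to_prefix])

print(convert(2.5, "kilo", "milli"))   # 2.5 km = 2500000.0 mm
```

Contrast this with pre-metric systems, where each conversion (into 3, 12 or 16 parts) needed its own ad hoc arithmetic.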
Types of standards
We can distinguish the following types:
“De facto”.
It is a classic or traditional system that has imposed itself over time, or because it is the most widespread or most accepted system at the popular or market level. An example of a de facto standard is the QWERTY layout of computer keyboards, inherited from the old typewriters, in which the positions of the keys were chosen to avoid jamming.
“De jure”.
It is an official, legally binding standard, sponsored, approved, and maintained by a standardization organization.
Elected.
A standard chosen by consensus among different alternative systems, as being the system that best fits certain requirements, mental model or physical reality.
Triumphant.
The result of the so-called "standards wars", also called "format wars": contests between mutually incompatible proprietary formats contending for the same market. Standards wars are characteristic of the information age. There are wars between operating systems, languages, browsers, hardware platforms, architectures, etc., from different manufacturers, each trying to impose its solution as a standard.
In these wars, there is usually a positive feedback effect: the strong one gets stronger and the weak one gets weaker. When the stronger one reaches a certain critical mass, growth is unstoppable and in the end it dominates the market. There are many examples of standards wars:
Perhaps the paradigmatic example was the 1980s home video recorder (VCR) war between VHS (from JVC) and Betamax (from Sony), a war that VHS won. The triumph of one technology over the other is not necessarily determined by the quality of the technology, but by the capacity for dissemination. VHS triumphed, despite being technically worse than Betamax, because JVC provided (licensed) its technology to other manufacturers. Meanwhile, Sony tried to disseminate its Betamax technology on its own.
One historical war was the so-called "war of the currents" between direct current (championed by Edison) and alternating current (championed by Tesla), a war that Tesla won.
The war between Blu-Ray (Sony's) and HD-DVD (Toshiba's), which Sony won.
The war of the operating systems: Microsoft Windows versus Apple Macintosh, a war won by Microsoft, despite the fact that Apple's operating system is simpler and of higher quality. Windows has 30 million lines of code (some say it has twice as many) and Apple's has only 2.5 million. Apple always strives for maximum simplicity in its products.
The initial browser war, between Microsoft Internet Explorer and Netscape, won by Microsoft. Subsequently, the war between Microsoft Internet Explorer and Google Chrome, won by Google.
The war between Microsoft Word and WordPerfect, won by Microsoft.
The war between Microsoft Excel and Lotus 1-2-3, won by Microsoft.
Several wars are currently in play. We can highlight the war between mobile operating systems: iOS (from Apple) and Android (from Google). The latter is led by a group of companies called the "Open Handset Alliance" (OHA).
The Standards in Mathematics
There are no official standards in mathematics because this discipline is considered an open language in which any abstract concept can be formally expressed as long as its meaning is explained. Nevertheless:
The positional coding of numbers can be considered as a de facto standard, because of its universal acceptance.
Consolidated mathematical concepts −besides numbers− such as sets, sequences, vectors, matrices, etc., with their corresponding formalism and their operations, can be considered as standards.
ZFC (Zermelo-Fraenkel, with the axiom of choice) set theory is labeled as "standard" to indicate that it is the most widespread, accepted and fundamental theory. There are alternative or non-standard set theories.
The decimal system of numerical representation
The oldest known positional system in history is the sexagesimal (base 60) or Babylonian system (circa 1800 BC). The current system of numerical representation based on powers of 10 is considered one of the greatest advances of mankind throughout its history.
Behind this system is the principle of conceptual reflection, that is, a concept that applies to itself (the concept of a concept), which is an essential mechanism of consciousness. Starting from the unit (concept of order 0), we have 10 units (concept of order 1), 10 units of 10 units (concept of order 2), and so on. Every time we refer to a number, we are invoking (consciously or unconsciously) this universal mechanism of conceptual reflexivity.
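The recursive "groups of groups of units" reading corresponds directly to the standard base-conversion algorithms. A minimal sketch (function names invented):

```python
def digits_to_number(digits, base=10):
    """Interpret a digit list (most significant first) positionally."""
    value = 0
    for d in digits:
        # Each step forms base groups of the previous order, plus a digit.
        value = value * base + d
    return value

def number_to_digits(n, base=10):
    """Decompose a non-negative integer into positional digits."""
    digits = []
    while True:
        n, d = divmod(n, base)
        digits.append(d)
        if n == 0:
            return digits[::-1]   # most significant first

print(digits_to_number([1, 9, 8, 4]))    # 1984
print(number_to_digits(1984))            # [1, 9, 8, 4]
print(number_to_digits(1984, base=60))   # [33, 4], Babylonian-style base 60
```

The same two loops work for any base, which is why base 60, base 10 and base 2 are all instances of one positional principle.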
The positional principle gave rise to the concept of "zero", without which there would be neither the concept of negative number nor imaginary numbers. However, zero was already discovered by the Mayan civilization some 600 years before its appearance in India, and they represented it with a shell.
The base-10 positional numbering system was invented in India around 450 AD. Because it was spread by the Arabs, it is called the "Indo-Arabic system". It was introduced in Europe around the year 1000 AD, although due to resistance it took several centuries before its widespread use. The resistance came from the accountants, who did not want their trade upset, and from the clergy, who considered the system "diabolical" because of its Arabic origin.
The Italian Fibonacci (Leonardo of Pisa) was the main popularizer of the positional system. He spent a long period in Bugia (present-day Béjaïa, in Algeria), where his father was a customs clerk. There he learned the numbering system and became aware of its great advantages, mainly for simplifying calculations. Back in Italy, he published in 1202 the treatise "Liber Abaci" (Book of the Abacus).
A curious fact is that Fibonacci kept the order of the digits according to the Arabic script, that is, from right to left. It is also curious that Fibonacci gave the title "Book of the Abacus" to a book that tried to put an end to that instrument of manual calculation. He did so to disguise its contents, because he knew that the powerful abacus calculators of the time would oppose it. And, in fact, when its contents became known, it caused uneasiness among them, because they saw that their trade was in danger.
European Christian and Catholic authorities were so attached to their archaic systems, and the resistance of professional calculators (mostly clerics) to novelties was so strong, that it was necessary to wait centuries for the system's implementation and dissemination. The Church did nothing to favor the democratization of calculation, because this implied the loss of its monopoly on teaching and the consequent diminution of its power and influence.
Dispute between an abacist and an algorist (Margarita Philosophica, 1508).
Fibonacci's book did not manage to spread fully in Europe until the end of the 16th century, for three reasons: 1) the book was not "official", as it was conceived outside academic circles; 2) it encountered strong opposition because of the direction in which the digits are written, until this was finally accepted; 3) the opposition of the professional calculators.
The new system replaced the Roman system of representation used in Europe at that time. The Roman system did not allow for complex operations: it was easy to add and subtract, but for multiplication and division it was necessary to resort to professional abacists. The Roman numerals (I, V, X, L, C, D and M) had absolute value, a value by themselves, and there was no zero.
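The contrast can be seen in code: reading a Roman numeral means summing absolute symbol values (with the subtractive convention), and nothing in the notation supports digit-by-digit carrying. A sketch in Python:

```python
# Each Roman symbol has an absolute value, independent of position.
ROMAN = {"I": 1, "V": 5, "X": 10, "L": 50, "C": 100, "D": 500, "M": 1000}

def roman_to_int(s):
    """Sum symbol values; subtract when a smaller symbol precedes a larger."""
    total = 0
    for i, ch in enumerate(s):
        v = ROMAN[ch]
        if i + 1 < len(s) and v < ROMAN[s[i + 1]]:
            total -= v   # subtractive notation: IV = 4, IX = 9, ...
        else:
            total += v
    return total

print(roman_to_int("MCMXCIV"))   # 1994
```

Because there is no zero and no positional weight, algorithms such as long multiplication simply cannot be expressed in this notation, which is why the abacus remained indispensable.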
The invention and democratization of positional numeration has had incalculable consequences in human society, facilitating the development of science and technology, and has allowed the mechanization of arithmetic and mathematical calculation in general.
There has been no "Tower of Babel" of numbers. While there are more than 4,000 languages and several dozen alphabets and writing systems, there is currently only a single system of numerical representation. It is a kind of numerical Esperanto.
Non-standard set theories
There have been alternative theories to standard ZFC set theory. The most prominent of these are:
Internal Set Theory (IST).
It is a theory developed by Edward Nelson [1977]. Instead of adding new mathematical entities, it includes a predicate called "standard", which is applicable to any mathematical object. This predicate allows us to make distinctions that are not possible under the conventional axioms of set theory. So for every mathematical object we have two versions: the ordinary and the standard.
A formula of ordinary mathematics −a formula that does not employ the predicate "standard", either directly or indirectly− is qualified as "internal". Otherwise, the formula qualifies as "external".
All ZFC axioms are satisfied for the classical mathematical objects of set theory, but 3 new axioms are added for the new predicate: Idealization, Standardization and Transfer, whose initials are precisely I, S and T ("Internal Set Theory").
This theory does not change ordinary mathematics, but extends and enriches it. In particular, it axiomatizes a part of Robinson's nonstandard analysis, a theory that provided a certain foundation for infinitesimal calculus. The theory is less complex than the non-standard analysis, since it does not need the complexities of mathematical logic that had to be introduced to support the consistency of the different infinitesimal entities.
Positive Set Theory (PST).
The axiom of comprehension of ZFC theory states that for every property P there exists a set of the elements having that property, denoted {x | P}. Positive set theory restricts this principle to "positive" formulas; it also states that the set of elements having no property exists: {x | ∅}, where ∅ is the empty property.
Positive set theory was introduced by Isaac Malitz in his PhD thesis at UCLA.
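For reference, the comprehension principle mentioned above can be written, in its unrestricted form, as follows; these are standard textbook formulations, not taken from the original:

```latex
% Unrestricted comprehension: for every property P there is a set y
% of exactly the x satisfying P (inconsistent in full generality,
% by Russell's paradox):
\exists y \,\forall x \,\bigl( x \in y \leftrightarrow P(x) \bigr),
\qquad y = \{\, x \mid P(x) \,\}

% ZFC instead adopts separation, relativizing P to an existing set z:
\exists y \,\forall x \,\bigl( x \in y \leftrightarrow x \in z \wedge P(x) \bigr)
```

Restricting which formulas P may appear in comprehension is the common strategy of ZFC, positive set theory and the other alternatives discussed here.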
The theory of semisets.
A semiset is a proper class that is contained in a set. The theory of semisets generalizes set theory. It was proposed by Petr Vopěnka and Petr Hájek [1972].
The "New Foundations" theory.
It is an axiomatic theory conceived by Quine as a simplification of the type theory of Russell and Whitehead's Principia Mathematica. Quine proposed this theory in a 1937 paper entitled "New Foundations for Mathematical Logic", hence its name.
The Standards in Computer Science
In computing, a standard is defined as a set of rules that describe the requirements that a product or service must meet, with the objective of establishing a base mechanism so that different elements (hardware or software) that use it are compatible with each other.
The problem of standards in communicating information is as old as civilization itself. The fact that the Internet has become a digital platform accessed and interacted with by millions of people makes Internet standards of decisive importance. The web has become the global space and the global operating system, and standardization has turned to the web because of its importance and significance. Since 1996, more than 100 standards have been created.
A developing technology with a great future is cloud computing, a new paradigm that makes it possible to offer computing services over the Internet; the cloud is the metaphor for the Internet. But two problems are limiting its adoption: lack of interoperability and security risks. "Now you can't exchange things between clouds and that has to change" (Vinton Cerf). Vinton Cerf is considered one of the "fathers" of the Internet and co-inventor, with Robert Kahn, of the TCP/IP protocols (Transmission Control Protocol/Internet Protocol).
Today, the goal is to raise the semantic level: to move from information to knowledge, from syntax (pure form) to semantics (meaning). For years, the so-called "Semantic Web" has been pursued: with semantic relations, semantic searches, use of ontologies (set of concepts and relations between them), with the aim of achieving semantic interoperability.
A standard codification for the representation of all types of data, information and knowledge, as well as their meta levels, has not been achieved so far, starting with the most basic, which is the lack of a standard operating system. Achieving a universal codification for this, as was done with numbers, would represent an enormous conceptual and evolutionary leap for mankind.
Interface
In general, an interface is a formal system that facilitates and simplifies communication between a user and a system, or between two systems (or between two components of a system).
In computer science, an API (Application Programming Interface) is the application programming interface: a set or library of functions (in functional programming), procedures (in imperative programming) or methods (in object-oriented programming) that constitute an abstraction layer to be used by other software for application development. Almost all programs rely on the APIs of the underlying operating system to perform basic functions, such as accessing the file management system.
The advantages of APIs are:
The internal or underlying complexity, i.e. the system code, is hidden, avoiding having to program the detail, facilitating a higher level of abstraction and simplifying programming.
The interface is independent of the system implementation.
And the disadvantages:
The APIs only offer some functionalities, the most important or essential ones, but not all the possible ones. To get all the possibilities you would have to access the code. The goal is to strike a balance between simplicity of use and functionality.
To be usable, APIs must be well documented.
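The trade-off described above can be sketched with a toy interface in Python (class and method names invented for illustration): the caller sees only the published operations, never the implementation behind them.

```python
class KeyValueStoreAPI:
    """Public interface: callers see only put/get, not the storage detail."""

    def __init__(self):
        self._data = {}   # hidden implementation detail

    def put(self, key, value):
        self._data[key] = value

    def get(self, key, default=None):
        return self._data.get(key, default)

# Client code depends only on the interface, so the storage could later
# be swapped (file, database, network service) without changing callers.
store = KeyValueStoreAPI()
store.put("price", 100)
print(store.get("price"))   # 100
```

Note how the API exposes only two operations out of everything the underlying dictionary could do: this is exactly the "some functionalities, not all" limitation mentioned above, accepted in exchange for simplicity.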
Open system
The term "open system" emerged in the late 1970s and early 1980s to refer primarily to Unix systems, in contrast to the mainframe computers of the time. Unix systems included standard interfaces for programming and for connection to peripherals, which facilitated hardware and software development by third parties.
The definition of "open system" became more formalized in the 1990s with the rise of the "Single Unix Specification" (SUS) software standards, which set out the requirements for a system to qualify as a Unix system. The SUS are developed and maintained by the Austin Group, based on the earlier work of the IEEE and The Open Group (formerly, X/Open).
De facto standards
Examples of de facto standards are:
Microsoft's Windows operating system.
Microsoft's .NET platform. It is a platform for developing and running applications, including Web applications.
Oracle's Java programming language.
Oracle's JavaEE platform. It is a programming platform for developing and running application software in the Java language.
OpenMP (Open Multi-Processing). It is the de facto standard for parallel programming on shared-memory systems.
There are "de facto" standards that have become official (de jure) standards, such as:
MP3 (digital sound format).
HTML (web page format).
PDF. Created in 1993 by Adobe, it gave rise to the ISO standards PDF/X (graphic interchange format, 2001) and PDF/A (archival format, 2005), and the full format became ISO standard 32000 in 2008.
There are "de facto" standards that have not become official standards, such as:
DXF (Data Exchange Format), the AutoCAD format.
DOC, the Microsoft Word format.
XLS, the Microsoft Excel format.
PPT, the Microsoft PowerPoint format.
PHP, a general-purpose server-side programming language for developing dynamic content web applications. It competes with Microsoft's Active Server Pages (ASP) technology.
Objective of the standards
Standards pursue −or should pursue− several specific objectives:
Interoperability.
It is the ability for two systems made with different technologies to communicate and interact, regardless of hardware and operating system.
Portability.
It is the ability of software to run on different hardware platforms and operating systems.
Scalability.
It is the ability of software to run regardless of the scale of the platform, from a personal computer to a mainframe. The Unix operating system meets this goal.
Componentization.
As with other fields, computer systems are built by assembling components that have already been made. This reduces complexity and development effort. A single component can be part of several systems.
Buildability.
The ability of a standard to serve as a basis for building other standards.
Generic objectives are: to achieve simplicity, clarity, efficiency, quality, safety, economy, ease of implementation and to facilitate creativity. We highlight two in particular:
Simplicity.
Standards seek maximum simplification, considering only what is necessary. This principle is often not fulfilled.
Clarity.
The specification of the standard should be done using clear and precise language. A well-defined standard can accelerate the adoption of a new technology. For example, DVD was a technology that was adopted quickly because it was clearly defined.
The problems of standards
Standards suffer from problems. Among them are the following:
Diversity of standards.
In computing there is a huge number of standards, a veritable tower of Babel. There are standards for everything: office documents, graphics, web pages, databases, geographic information systems, etc. Moreover, these standards are always evolving, changing as concepts, ideas and paradigms evolve. It can be said that there are no established and fixed standards. Given this diversity, it is very difficult or impossible to have a global vision from which to orient oneself.
Diversity of standards organizations.
There are many international organizations in charge of producing standards, specialized organizations and, in principle, not coordinated with each other. There is no "mother" or umbrella organization for all standards.
Complexity.
Some standards are so complex to use that they become unproductive. A paradigmatic example is CORBA, the standard for distributed objects, which was never fully implemented because of its enormous complexity. And what is complex is, by that very fact, poorly conceived.
There is the paradox that some standards are so complex that other higher-order standards are needed to create, modify or access those complex formats. This is the case of RDF, RDFa, GRDDL and POWDER.
Paradigms.
For each domain, there may be different paradigms, which implies creating corresponding standards. Examples of paradigms are: the object paradigm, the functional paradigm, the client/server paradigm, the web paradigm, etc.
Linguistics.
Different standards are usually disconnected from each other, are superficial, have no common foundation or root, no common linguistics, use different languages or coding systems.
Incompleteness.
Standards are never complete. New standards always have to be created to address new situations or needs. One example is the Rule Markup Initiative (RuleML).
Mental model.
The standards are not humanistic, i.e., they are not close to a model of the human mind or way of thinking.
Semantics.
Standards are formal structures, without a precise and clear semantic foundation.
Objectives.
Some standards do not have clearly defined objectives.
Level of detail.
There are strong controversies about the level of detail that standards should have.
Open and closed standards
An open standard −also called an "open format"− is a publicly available specification that describes a type or category of elements and is sponsored and maintained by an open standards organization. Anyone can obtain and implement the standard, without legal or economic restrictions. Open systems link manufacturers, vendors, developers and users.
It is common for other standards to be built on top of open ones. This is the case of the most common Internet standards, where the XML standard is the foundation of others (RDF, XHTML, ODF, OOXML, etc.).
De jure standards are usually open, i.e. their specification has been made public. De facto standards, on the other hand, can be open or closed.
A closed or non-open standard is private or proprietary in nature. In closed standards the specifications are not publicly known and the owners may impose fees or restrictions on usage.
Today, the Web has become the universal open standard. "The decision to make the Web an open system was necessary to make it universal. You can't propose something to be a universal space and at the same time have it under control" (Tim Berners-Lee).
Free software
GNU is a complete operating system free of restrictions on the use of its source code. Its name is a recursive acronym: "GNU's Not Unix". The GNU project was started by Richard Stallman in 1983; soon after, he coined the term "free software" and, to promote it, created the "Free Software Foundation" (FSF) in 1985. "The only way to be free is to reject proprietary programs" (Richard Stallman).
In 1989 the first version of the GNU General Public License (GPL) was published, aimed at protecting the free distribution, modification and use of software.
Free software is software that, once obtained, can be used, copied, modified and distributed without restriction. The definition of free software is maintained by the FSF. Free software is generally available free of charge, or at the cost of distribution.
Stallman also introduced the concept of "copyleft" to grant users freedom (of use, modification and distribution) and to restrict the possibilities of software appropriation. The copyright holder of a program under a copyleft license can also make a modified version under his original copyright and sell it under any license he wishes, in addition to distributing the original version as free software.
Symbols for Copyleft and Copyright
Under the terms of the GNU General Public License (version 2), Linus Torvalds −when he was 21 years old and a computer science student− developed the Linux kernel in C. GNU/Linux is the combination of the GNU operating system and the Linux kernel, developed with the help of programmers from around the world. GNU/Linux was the first free-software operating system.
Since the English word "free" can mean both "libre" and "free of charge", free software should not be confused with freeware (gratis software, which usually does not include the source code) or with public domain software (which requires no license).
Apart from Linux, other examples of free software are: StarOffice, X (windowing system) and OpenJDK (Java Development Kit).
Open Source
The "Open Source Initiative" (OSI) was born in 1998, at the hands of some members of the Free Software community, to spread the principles of open source: that the source code of programs can be examined and modified without prior authorization. In practice, the principles of Free Software and Open Source are equivalent, and both recognize the same types of licenses.
The main difference between Open Source and Free Software is that the former is based solely on the technical aspects and continuous improvement of the software, and the latter takes into account the ethical and philosophical aspects of freedom.
Some open source programs are:
The Ubuntu and Debian operating systems, based on Linux.
Google's Android operating system for smartphones and tablets.
OpenOffice, office suite.
Firefox, web browser.
MENTAL, Universal Standard
The solution approach
Given the diversity of standards, with their corresponding sea of acronyms, it is legitimate to wonder about the possibility of a universal standard, in the same way that the issue of universal language arises with respect to the diversity of particular languages.
Defining a universal standard or "mother" of all standards is of paramount importance. The importance of having a universal standard can be compared to the importance of positional notation in numbers. In a report submitted to the World Bank on September 9, 2005, 13 countries urged nations to adopt open information technology standards as a vital step to accelerate economic growth, efficiency and innovation. Innovation in information technology occurs at a much faster rate than in any other field. Standards accelerate innovation.
Until now it had not been possible to define a universal standard because it was not clear what the fundamentals of computing were. MENTAL clarifies these fundamentals, which are essentially simple: the archetypes of consciousness, which constitute a mental (or conceptual) operating system and the foundation of any physical operating system.
There must be two types of standard:
One deep, semantically based, fixed, unquestionable and universal. It must be simple and structured as a language that contemplates opposites: generic and specific, operational and descriptive, etc.
Several superficial ones, of a formal type, based on the deep standard. They are generic and can take many possible forms; they are the equivalents of, for example, XML Schema or RDF Schema. All these surface standards are connected through the deep standard: they are manifestations of it.
The only standard considered universal is XML, as many other standards, mainly Web standards, are based on it. But XML is not a language per se. It is a syntactic system for representing hierarchical information structures using tags with values, but with many expressive limitations.
XML is currently being applied to the specification of all kinds of information structures. This is the wrong approach, since trying to use XML "for everything" leads to an inconsistency similar to claiming in OOP (Object-Oriented Programming) that "everything is an object". It must be borne in mind that semantics comes first and then syntax, which must be as simple, readable and appropriate as possible, and should evoke the associated semantics.
MENTAL, the answer to the standards problem
We have the following features:
Universal standard.
The primary or fundamental goal of an open system must be a universal operating system, a universal language and a universal software architecture. MENTAL meets these requirements: it is the foundation of every operating system, it is a universal language, and it provides a universal software architecture. As an operating system, it is necessary to implement only the semantic primitives. The rest can be developed in MENTAL itself.
MENTAL is a standard for data and processes. Data can include code and code can contain data. It is also a standard for defining all kinds of information structures: functions, processes, events, objects, rules, agents, graphs, databases, etc. With MENTAL, boundaries are blurred. MENTAL is a universal model or meta-model for modeling all types of information structures.
Standards should be like the Magna Carta: a set of general principles, big ideas or general concepts that underpin everything else. Standards should provide a platform of freedom. In this sense, standards should be the kernel of everything else.
Universal open system.
An open system must start with the most fundamental, which is the operating system. The standard must be the fundamental. MENTAL is a universal open system that underlies every operating system and on which all particular operating systems can be built. Derived standards can be created from this universal standard, but all of them connected through a common root. As in MENTAL the source code is equal to the object code, the code is not hidden, it is always visible, it is open.
MENTAL is a totally open system. Not only the syntax and semantics of the standard are specified, but also the code, its abstract implementation, is available.
MENTAL is not just a language, but the foundation of every operating system, since it is based on the archetypes of consciousness. "Consciousness itself is the operating system and reality is the output" (Gregg Braden). It connects inner world and outer world.
"Mother" standard.
MENTAL is the "mother" standard or foundation of all particular standards. The situation is analogous to that of programming languages and knowledge representation systems.
In the case of the Rule Markup Initiative, the solution is very simple. It is enough to define the rules, assign marks to the rules, and activate the rules according to some conditional criterion:
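The original MENTAL example is not reproduced here; as a hypothetical sketch of the idea (in Python, with all names invented), rules can carry marks and fire only when a criterion selects them:

```python
# Rules carry marks (tags); a conditional criterion selects which fire.
rules = [
    {"tags": {"pricing"},
     "rule": lambda order: order["qty"] * order["unit_price"]},
    {"tags": {"discount"},
     "rule": lambda order: -0.5 * order["qty"] * order["unit_price"]},
]

def apply_rules(order, active_tags):
    """Fire every rule whose mark set intersects the active tags."""
    return sum(r["rule"](order) for r in rules if r["tags"] & active_tags)

order = {"qty": 3, "unit_price": 10.0}
print(apply_rules(order, {"pricing"}))               # 30.0
print(apply_rules(order, {"pricing", "discount"}))   # 15.0
```

The point of the sketch is that marking and conditional activation need no dedicated rule standard: they are ordinary expressions plus a selection criterion.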
Language.
With MENTAL, a standard is a category of expressions. It can be specified by a parameterized generic expression. And, therefore, it belongs to the language itself. It is a manifestation of the language.
At the operational level, you can write the detail of each function as a parameterized generic expression. The function uses abstract resources, which become concrete in a given execution system. If the parameter names are mnemonic, the specification would be self-sufficient. For example:
At the descriptive level, one can write for example:
〈( Order(code quantity) )〉
MENTAL is not only a standard at the technical, specialist level, but also a democratic and popular language, as it is a model of the mind.
Metalanguage.
MENTAL is also a metalanguage, since it allows particular languages to be defined.
Interface.
MENTAL is a universal interface, the means by which a user communicates with a system or two systems communicate with each other. It is a neutral language of the mental type.
Abstraction.
MENTAL is a language that provides a layer of abstraction regardless of the implementation.
Reusability.
All components (of any type) are reusable. The components only have to be developed once and are available to the users, who can, if they wish, introduce modifications.
APIs.
The APIs of the operating system are the primitives of MENTAL.
Simplification.
With MENTAL everything is simplified, everything is simpler. Developments are simplified, and standards are simplified as well.
Secondary semantics.
MENTAL provides primary semantics, which are forms without content, i.e. all standards are based on the same encoding, the same common language, which is essentially simple. A secondary semantics based on names is additionally needed, which allows us to identify the elements we are encoding. These names should be literal or mnemonic (such as Price, Quantity, Code, Description, etc.) for ease of use.
Standards are easily created because a simple, easy to apply and powerful language is available. All that needs to be decided is the structure and nomenclature of the component elements.
Communication.
Communication between elements is established indirectly through the environment, the common space in which the expressions reside. It is the simplest communication system.
Distributed systems.
A server can be considered an expression container. Therefore, to access the expression x of the server S we can use the expression Sx. Virtual servers can also be created.
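A hedged sketch in Python (class and method names invented, not MENTAL syntax) of the idea of servers as expression containers, including a virtual server that merely delegates lookups:

```python
class Server:
    """A server modeled as a container of named expressions."""

    def __init__(self, **expressions):
        self.expressions = dict(expressions)

    def __getitem__(self, name):
        return self.expressions[name]

class VirtualServer:
    """A server that owns nothing: it delegates each lookup to backends."""

    def __init__(self, backends):
        self.backends = backends

    def __getitem__(self, name):
        for b in self.backends:
            if name in b.expressions:
                return b[name]
        raise KeyError(name)

S = Server(x=42, greeting="hello")
V = VirtualServer([S])
print(S["x"])          # 42; the analogue of the qualified expression "S x"
print(V["greeting"])   # hello, resolved through the virtual server
```

Qualified access and virtualization fall out of the same container metaphor, with no separate remote-access standard.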
Unification.
With MENTAL, everything is unified on a theoretical and practical level: computer science, mathematics, programming language, operating system, traditional web, semantic web, cloud computing, etc. To this we must add the standards. With MENTAL, everything is achieved: open software, interoperability, portability and scalability, open architecture, services, components, etc.
In short, the adoption of MENTAL as a standard means a return to simplicity, to radical simplicity, to common sense: its impact would be enormous because it is present in all domains, which would accelerate the progress of science, technology and society in general.
XML and MENTAL
XML is a standard that has achieved enormous success because of its simplicity. It has become a universal language (or metalanguage) on which other standards are based. XML is a standard for describing hierarchical information. However, XML is an enormously limited language that is only good for what it was designed for: to tag information, where a start tag and an end tag are specified. For example, a table:
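A minimal illustration (tag names invented) of how a table is marked up in XML, with every datum wrapped in a start tag and an end tag, parsed here with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Each cell needs an opening and a closing tag; the nesting is the
# only structure XML provides.
doc = """
<table>
  <row><code>A1</code><quantity>3</quantity></row>
  <row><code>B2</code><quantity>5</quantity></row>
</table>
"""

root = ET.fromstring(doc)
rows = [(r.find("code").text, int(r.find("quantity").text))
        for r in root.findall("row")]
print(rows)   # [('A1', 3), ('B2', 5)]
```

Note that XML itself carries no semantics: that "quantity" holds an integer is the reader's convention, enforced only by the `int(...)` call in the client code.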
Bibliography
Grindley, Peter. Standards, Strategy and Policy: Cases and Stories. Oxford University Press (USA), 1995.
Gutiérrez Martínez, José María; Palacios Escribano, Fernando; Gutiérrez de Mesa, José Antonio. Estándar XML y tecnologías asociadas. Danysoft Internacional, 2003.
Haffner, Kimberly A. Semantic Web: Standards, Tools and Ontologies. Nova Science Pub., 2010.
Nelson, Edward. Internal set theory: A new approach to nonstandard analysis. Bulletin American Mathematical Society 83, pp. 1165-1198, 1977.
Stango, Victor. The economics of standards wars. Review of Network Economics, 3(1), pp. 1-19, March 2004.
Stott, David; Moran, Diane. Information and Communication: ECDL - the European PC standard (European Computer Driving Licence). Springer, 2001.
Varian, Hal R.; Shapiro, Carl. El dominio de la información: Una guía estratégica para la economía de la red (Spanish translation of Information Rules: A Strategic Guide to the Network Economy). Antoni Bosch editor, 2000.
Varian, Hal R.; Shapiro, Carl. The Art of Standards Wars. California Management Review, 41(2), pp. 8-32, Winter 1999. Available online.
Vopěnka, Petr; Hájek, Petr. The Theory of Semisets. North Holland, 1972.
Williams, Sam. Free as in Freedom: Richard Stallman’s crusade for Free Software. O’Reilly, 2002.
Zeldman, Jeffrey. Diseño con estándares Web (Spanish translation of Designing with Web Standards). Anaya Multimedia-Anaya Interactiva, 2003.