"Generalization is an important way to make progress in mathematics. What matters is what is conceivable, not what corresponds directly
to physical experience" (James Joseph Sylvester).
"Cannot music be described as the mathematics of the senses and mathematics as the music of reason?" (James Joseph Sylvester)
Main Concepts
A matrix is a rectangular structure (a table) consisting of rows and columns, whose cells contain real numbers. The table is enclosed between a pair of large parentheses. The notation for a table of n rows and m columns is

A = \begin{pmatrix} a_{11} & \cdots & a_{1m} \\ \vdots & \ddots & \vdots \\ a_{n1} & \cdots & a_{nm} \end{pmatrix}

The table is denoted by a capital letter, and the element in row i and column j by aij.
For example, a matrix of 2 rows and 4 columns:

A = \begin{pmatrix} 3 & 2 & 7 & 4 \\ 5 & 9 & 6 & 1 \end{pmatrix}

Its elements include a13 = 7, a24 = 1, etc.
A square matrix is one that has the same number of rows as columns. This number is called the "order" of the square matrix.
Types of matrices and operations with matrices
Sum of matrices: The sum of two matrices A and B of the same dimensions is another matrix C in which each element is the sum of the elements occupying the same position: cij = aij + bij. Matrix subtraction is analogous.
Properties:
Commutative: A+B ≡ B+A
Associative: (A+B)+C = A+(B+C)
Null matrix O (the neutral matrix of the sum, with all elements at zero): A+O = A
Product by a scalar: The product of a scalar r and a matrix A is another matrix C in which each element is the product of the scalar and the corresponding element of A: cij = r*aij. Division by a scalar r is multiplication by the inverse of that scalar (r−1).
Properties:
Associative: (r*s)*A = r*(s*A)
Distributive: (r + s)*A = r*A + s*A
Opposite matrix: −A = (−1)*A
A−A = O
Product of matrices: The product of a matrix A of n rows and m columns and a matrix B of m rows and p columns is another matrix C of n rows and p columns in which each element is the sum of the products of the elements of the corresponding row of A and column of B:
cij = ∑aik·bkj (summation from k=1 to m)
Properties:
Noncommutative: in general, A*B ≠ B*A
Associative: A*(B*C) = (A*B)*C
Distributive: (A+B)*C = A*C + B*C
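These three operations can also be sketched outside MENTAL. The following Python fragment is an illustrative sketch only (the helper names m_add, m_scale and m_mul are invented here); it implements the definitions cij = aij + bij, cij = r*aij and cij = ∑aik·bkj on plain nested lists.

def m_add(A, B):
    # c[i][j] = a[i][j] + b[i][j]; A and B must have the same dimensions
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

def m_scale(r, A):
    # c[i][j] = r * a[i][j]
    return [[r * x for x in row] for row in A]

def m_mul(A, B):
    # c[i][j] = sum over k of a[i][k] * b[k][j];
    # requires: number of columns of A == number of rows of B
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[3, 2, 7, 4], [5, 9, 6, 1]]      # the 2x4 example above
B = [[1, 0], [0, 1], [1, 1], [0, 2]]  # an arbitrary 4x2 matrix
print(m_add(A, A))    # [[6, 4, 14, 8], [10, 18, 12, 2]]
print(m_scale(2, A))  # same result: 2*A = A + A
print(m_mul(A, B))    # a 2x2 result: [[10, 17], [11, 17]]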
Transpose matrix: The transpose of a matrix A is the matrix obtained by exchanging rows for columns. If A has n rows and m columns, its transpose has m rows and n columns. The notation is At (transpose of A).
Properties:
(At)t = A
(A + B)t = At + Bt
(A * B)t = Bt * At
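A short sketch of the transpose in the same illustrative Python style, checking the property (At)t = A:

def m_t(A):
    # rows become columns: t[j][i] = a[i][j]
    return [list(col) for col in zip(*A)]

A = [[3, 2, 7, 4], [5, 9, 6, 1]]
print(m_t(A))            # [[3, 5], [2, 9], [7, 6], [4, 1]]
print(m_t(m_t(A)) == A)  # (At)t = A -> True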
Diagonal matrix: A square matrix whose only nonzero elements lie on the main diagonal, where the subscripts are equal: (a11, a22, ..., ann).
Unit matrix (I): A diagonal matrix with all its diagonal elements equal to 1: aij = 1 if i=j and 0 if i≠j.
Inverse matrix (A−1) of a matrix A: the one that satisfies A*A−1 = A−1*A = I. It exists only when |A| ≠ 0.
Symmetric matrix: It is the one that is equal to its transpose. At = A
Antisymmetric matrix: It is the one whose transpose is equal to its opposite. At = −A
Orthogonal matrix: It is the one whose transpose is equal to its inverse. At = A−1
Involutive matrix: It is the one that is equal to its inverse. A = A−1
Idempotent matrix: It is the one whose square is equal to itself. A² = A (where A² = A*A)
Nilpotent matrix: It is the one whose square is equal to the null matrix. A² = O
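Each of these definitions translates directly into a predicate. A Python sketch (helper names invented for illustration; the equality checks are exact, so it is meant for integer matrices):

def m_t(A):
    return [list(col) for col in zip(*A)]

def m_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def is_symmetric(A):      # At = A
    return m_t(A) == A

def is_antisymmetric(A):  # At = -A
    return m_t(A) == [[-x for x in row] for row in A]

def is_orthogonal(A):     # At = A^-1, i.e. A*At = I
    return m_mul(A, m_t(A)) == identity(len(A))

def is_idempotent(A):     # A^2 = A
    return m_mul(A, A) == A

print(is_symmetric([[1, 2], [2, 3]]))       # True
print(is_antisymmetric([[0, 2], [-2, 0]]))  # True
print(is_orthogonal([[0, 1], [1, 0]]))      # True
print(is_idempotent([[1, 0], [0, 0]]))      # True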
Complementary minor: The complementary minor of the element aij of a square matrix A is the determinant Mij of the matrix that results from deleting row i and column j of A.
The adjoint of an element aij of a square matrix is: Aij = (−1)^(i+j) * Mij
Adjoint matrix of a square matrix A: It is the matrix Aa formed by the adjoints of all the elements of A.
The determinant |A| of a square matrix A is the sum of the products of the elements of any line (row or column) by their corresponding adjoints. For example, using row 1:
|A| = ∑a1j·A1j (sum from j=1 to n)
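The expansion along the first row can be written as a short recursive sketch in Python (illustrative only; its cost grows factorially with the order, and practical computations use elimination methods instead):

def minor(A, i, j):
    # the matrix that remains after deleting row i and column j (0-based here)
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    # |A| = sum of a(1,j) * (-1)^(1+j) * M(1,j), expanding along the first row;
    # (-1) ** j plays the role of (-1)^(i+j) for the first row with 0-based j.
    # The empty (0x0) matrix has determinant 1 by convention.
    if not A:
        return 1
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

print(det([[1, 5], [7, -2]]))  # 1*(-2) - 5*7 = -37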
Properties:
A−1 * A = A * A−1 = I
(A * B)−1 = B−1 * A−1
(A−1)−1 = A
(r*A)−1 = r−1 * A−1
(At)−1 = (A−1)t
|A−1| = |A|−1
By definition, the adjoint matrix of a square matrix of order 1, A = (a11), is (1), the unit matrix of order 1; or, what is the same, the adjoint of a11 is 1. Its transpose is itself, and the determinant of A is a11.
Calculation of the inverse matrix A−1 of a matrix A (possible only when |A| ≠ 0):
A−1 = (Aa)t / |A|
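Putting the pieces together, a Python sketch of A−1 = (Aa)t/|A|; it assumes |A| ≠ 0 and reuses the minor/adjoint definitions above:

def minor(A, i, j):
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    if not A:
        return 1  # 0x0 determinant, by convention
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

def inverse(A):
    d = det(A)  # must be nonzero
    n = len(A)
    # adjoint matrix: A(i,j) = (-1)^(i+j) * M(i,j)
    adj = [[(-1) ** (i + j) * det(minor(A, i, j)) for j in range(n)]
           for i in range(n)]
    # transpose it and divide every element by the determinant
    return [[adj[j][i] / d for j in range(n)] for i in range(n)]

A = [[1, 5], [7, -2]]
print(inverse(A))  # [[0.054..., 0.135...], [0.189..., -0.027...]]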
Examples:
One-dimensional matrix (order 1):
A = (a11)
Aa = (Aa)t = (1)
|A| = a11
A−1 = (1 ÷ a11)
A*A−1 = (1)
Two-dimensional matrix (order 2):

A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix}

A^a = \begin{pmatrix} a_{22} & -a_{21} \\ -a_{12} & a_{11} \end{pmatrix} \qquad (A^a)^t = \begin{pmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{pmatrix}

|A| = a_{11}A_{11} + a_{12}A_{12} = a_{11}a_{22} - a_{12}a_{21}

A^{-1} = \frac{1}{|A|}\begin{pmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{pmatrix} \qquad A A^{-1} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}
Three-dimensional matrix (order 3):

A = \begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}

The adjoints of the first row are

A_{11} = \begin{vmatrix} a_{22} & a_{23} \\ a_{32} & a_{33} \end{vmatrix} = a_{22}a_{33} - a_{23}a_{32}

A_{12} = -\begin{vmatrix} a_{21} & a_{23} \\ a_{31} & a_{33} \end{vmatrix} = a_{23}a_{31} - a_{21}a_{33}

A_{13} = \begin{vmatrix} a_{21} & a_{22} \\ a_{31} & a_{32} \end{vmatrix} = a_{21}a_{32} - a_{22}a_{31}

|A| = a_{11}A_{11} + a_{12}A_{12} + a_{13}A_{13}

A^a = \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \\ A_{31} & A_{32} & A_{33} \end{pmatrix} \qquad (A^a)^t = \begin{pmatrix} A_{11} & A_{21} & A_{31} \\ A_{12} & A_{22} & A_{32} \\ A_{13} & A_{23} & A_{33} \end{pmatrix}

A^{-1} = (A^a)^t / |A|
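A numeric sketch of the same computation in Python for a concrete order-3 matrix (values chosen arbitrarily so that |A| ≠ 0), verifying that A*A−1 yields the identity:

def minor(A, i, j):
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    if not A:
        return 1
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

def inverse(A):
    d, n = det(A), len(A)
    adj = [[(-1) ** (i + j) * det(minor(A, i, j)) for j in range(n)]
           for i in range(n)]
    return [[adj[j][i] / d for j in range(n)] for i in range(n)]

def m_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2.0, 0.0, 1.0],
     [1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0]]  # det(A) = 3
P = m_mul(A, inverse(A))
print([[round(x, 10) for x in row] for row in P])  # the 3x3 identity (up to rounding)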
Specification and Generalization in MENTAL
A one-dimensional matrix is a sequence. A two-dimensional matrix is represented by a sequence of sequences of the same length. A three-dimensional matrix would be a sequence of sequences of sequences of the same length. And so on. There can be as many levels as you want.
An example of a two-dimensional matrix is:
( (3 2 7 4) (5 9 6 1) )
(matrix 2×4: 2 sequences of length 4).
The following matrix is three-dimensional, 4×3×2 (4 sequences of 3 sequences of length 2), with arbitrary values:
( ((1 2) (3 4) (5 6)) ((7 8) (9 10) (11 12)) ((13 14) (15 16) (17 18)) ((19 20) (21 22) (23 24)) )
Row i of a two-dimensional matrix x is (x\i). Column j is ([x\[1…n]\j]), where n is the number of rows of x.
An element of a matrix x of n dimensions is x\i1\i2\…\in, where i1 ... in are the indexes corresponding to each of the dimensions.
The first dimension has length x#
The second dimension is of length (x\1)#
The third dimension has length (x\1\1)#
The fourth dimension has length (x\1\1\1)#
etc.
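In a language with nested lists the same representation and the same length queries can be sketched as follows (Python, with the arbitrary 4×3×2 example above; the comments show the MENTAL counterparts):

x = [[[1, 2], [3, 4], [5, 6]],
     [[7, 8], [9, 10], [11, 12]],
     [[13, 14], [15, 16], [17, 18]],
     [[19, 20], [21, 22], [23, 24]]]  # 4x3x2: 4 sequences of 3 sequences of length 2

print(len(x))        # length of the first dimension:  4  (x# in MENTAL)
print(len(x[0]))     # length of the second dimension: 3  ((x\1)#)
print(len(x[0][0]))  # length of the third dimension:  2  ((x\1\1)#)
print(x[1][2][0])    # the element x\2\3\1 -> 11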
Operations with matrices
For operations with matrices we can use whatever notation we want, except the bare arithmetic operators, because these already have a defined semantics. Here we use the arithmetic symbol together with the qualifier "m" to indicate that it is a matrix operation. Temporary variables are indicated by the ending "x".
Null matrix of n rows and m columns:
〈( O(nm) = (0★m)★n )〉
Sum of the matrices A and B (two-dimensional):
〈( (A +/m B) = ((Cx = A) [⌊Cx↓↓⌋ = ⌊A↓↓⌋ + ⌊B↓↓⌋] Cx)! )〉
Properties:
Commutative:
〈( (A +/m B) ≡ (B +/m A) )〉
Associative:
〈( ((A +/m B) +/m C) ≡ (A +/m (B +/m C)) )〉
Null matrix:
〈( (A +/m O) = A )〉
Multiplication by a scalar:
〈( (r */m A) = ((Cx = A) [[Cx↓↓] = r*[A↓↓]] Cx)! )〉
Properties:
Associative:
〈( ((r*s) */m A) = (r */m (s */m A)) )〉
Distributive:
〈( ((r+s) */m A) = ((r */m A) +/m (s */m A)) )〉
Determinant of the matrix A (expanding along the first row):
〈( det(A) = +⊣([((A\1\j)*adj(A 1 j))/(j=[1...n])]) )〉
Inverse matrix (Inv) of matrix A:
〈( Inv(A) = (T(Adj(A)) ÷ det(A)) )〉
Properties:
〈( (Inv(A) */m A) ≡ (A */m Inv(A)) = I )〉
〈( Inv(A */m B) = (Inv(B) */m Inv(A)) )〉
〈( Inv(Inv(A)) = A )〉
〈( Inv(r */m A) = ((1÷r) */m Inv(A)) )〉
〈( Inv(T(A)) = T(Inv(A)) )〉
〈( det(Inv(A)) = 1÷det(A) )〉
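A numeric spot-check of two of these properties, Inv(A*B) = Inv(B)*Inv(A) and det(Inv(A)) = 1÷det(A), sketched in Python with the same illustrative helpers as before:

def minor(A, i, j):
    return [row[:j] + row[j+1:] for k, row in enumerate(A) if k != i]

def det(A):
    if not A:
        return 1
    return sum((-1) ** j * A[0][j] * det(minor(A, 0, j)) for j in range(len(A)))

def inverse(A):
    d, n = det(A), len(A)
    adj = [[(-1) ** (i + j) * det(minor(A, i, j)) for j in range(n)]
           for i in range(n)]
    return [[adj[j][i] / d for j in range(n)] for i in range(n)]

def m_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2.0, 1.0], [1.0, 1.0]]
B = [[1.0, 2.0], [0.0, 1.0]]
lhs = inverse(m_mul(A, B))           # Inv(A*B)
rhs = m_mul(inverse(B), inverse(A))  # Inv(B)*Inv(A)
print(all(abs(lhs[i][j] - rhs[i][j]) < 1e-9
          for i in range(2) for j in range(2)))      # True
print(abs(det(inverse(A)) - 1 / det(A)) < 1e-9)      # True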
Beyond traditional matrices
A matrix can be considered a generalization of a vector (a vector of vectors, or a higher-order vector), just as a vector can be considered a generalization of a number. A two-dimensional matrix would be a vector of order 2.
But the definition of matrix is limited in several ways:
The elements are numbers. With MENTAL the elements can be of any type: variables, functions, rules, sequences, etc., and even matrices; that is, there can be higher-order matrices (matrices of matrices).
The structure is always two-dimensional. With MENTAL, matrices can have any number of dimensions.
Relationships cannot be established between the elements of a matrix, or between the elements of different matrices: for example, that an element is common to several matrices, or that an element is a function of one or more others. With MENTAL such relationships can be expressed.
A matrix can also be considered as a function of positive integer variables. For example, the matrix

A = \begin{pmatrix} 1 & 5 \\ 7 & -2 \end{pmatrix}

can be interpreted as the (extensionally defined) function

A(1 1) = 1   A(1 2) = 5
A(2 1) = 7   A(2 2) = −2
With MENTAL this can be generalized to any number of arguments and of any type.
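This functional view can be sketched with a map from index tuples to values; in Python (illustrative only), nothing restricts the number of indices or the type of the elements:

A = {(1, 1): 1, (1, 2): 5,
     (2, 1): 7, (2, 2): -2}

def at(matrix, *indices):
    # apply the "matrix" to its integer arguments
    return matrix[indices]

print(at(A, 1, 2))  # 5
print(at(A, 2, 2))  # -2

# Three indices, and values of arbitrary type:
B = {(1, 1, 1): "a rule", (1, 1, 2): [1, 2, 3], (2, 1, 1): (lambda t: t + 1)}
print(at(B, 1, 1, 2))      # [1, 2, 3]
print(at(B, 2, 1, 1)(41))  # 42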
Addenda
History
A matrix, in its general sense, is an entity that contains the germ of something or is the origin of something.
Matrix algebra has two main protagonists, Arthur Cayley and James Joseph Sylvester, who were close friends and inspired each other in mathematical matters. They were of opposite characters: Cayley was quiet; Sylvester was restless and temperamental. In mathematics, Cayley was formal and rational, Sylvester emotional.
Cayley was the originator of matrix algebra, introducing the operations between matrices (addition, subtraction, product), the inverse matrix, the null matrix and the identity matrix. Sylvester made important contributions to matrix algebra, number theory and combinatorics.
The term "matrix" (plural matrices) was coined by Sylvester in 1850. This author also coined the terms invariant, discriminant and totient.
An invariant is a mathematical entity that does not change when a set of transformations is applied to it. This property is called invariance. A simple example is the distance between two points on the real line, which does not change when the same number is added to both points; the same is not true when they are multiplied by a real number. The concept of invariance is of great importance in modern physics, especially in relativity theory.
The discriminant of a polynomial is a certain expression in the coefficients of that polynomial that is equal to zero if and only if the polynomial has multiple roots in the complex plane. For example, the discriminant of the polynomial ax² + bx + c is b² − 4ac.
The totient of a natural number n is the number of positive integers not exceeding n that are relatively prime to n. For example, the totient of 12 is 4 (the numbers 1, 5, 7 and 11).
Cayley introduced matrix notation in 1855 as an abbreviated and compact way of representing a system of n linear equations with m unknowns. This formalism arose from geometric linear transformations in two-dimensional Cartesian space of the type
u = ax + by
v = cx + dy
with a, b, c, d constants. Cayley represented the vectors (x, y) and (u, v) vertically (column vectors) and the coefficients by a 2×2 table, and wrote it thus:
\begin{pmatrix} u \\ v \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} \begin{pmatrix} x \\ y \end{pmatrix}
Cayley thought that this system could be extended to represent linear transformations in hyperspace, a space of n dimensions: matrices as a means of generalizing three-dimensional space, and matrix algebra as a means of making computation possible in n-dimensional space.
Cayley thought that his system was no more than a compact form of representation, unlikely ever to have practical application. He was wrong: he failed to glimpse its full potential. Today matrix algebra is indispensable in mathematics and in many other domains: computer science, engineering, physics, statistics, economics, etc.
Cayley's conception of higher-dimensional space encountered opposition from many mathematicians of his time, such as Clement Ingleby, who argued (following Kant) that space is essentially three-dimensional and that a space of more than three dimensions made no sense.
Sylvester defended Cayley's idea of a generalized space of n dimensions, because it was perfectly conceivable and because actual physical space is irrelevant to mathematical questions.
The formal axiomatic conception of mathematics is that something exists if it is not logically contradictory. In MENTAL, a mathematical entity exists if it can be constructed in a finite number of steps or described by a finite expression. Language is thus freed from the true/false duality. The true dimensions are the degrees of freedom of the primitives with which interrelated expressions are created in abstract space. The concept of dimension of a system as its number of degrees of freedom was established by Hamilton in 1835, a theme that was initiated by Lagrange in his Analytic Mechanics of 1788.
The 20th century saw a veritable explosion of applications of matrix algebra, especially in physics (e.g., Einstein's theory of relativity and Heisenberg's matrix quantum mechanics). Today, the natural tendency of mathematicians is to generalize, formulating everything in n dimensions from the beginning, where matrices arise.
Matrices have many applications, but they are especially useful for representing the coefficients of a system of linear equations and for linear transformations in general. For example, geometric transformations (scaling, rotation, reflection and translation). Matrix theory is considered a branch of linear algebra.
Cayley's most famous work is the theory of algebraic invariants, set out in "On the Theory of Linear Transformations" (1845) and later extended by Sylvester.
Cayley paved the way for Felix Klein's discovery that all geometries can be explained by invariants of groups of transformations. Every geometric object is characterized by the group of transformations that make it invariant. Klein showed that Euclidean and non-Euclidean geometries are particular cases of a more general geometry: projective geometry. He presented this in 1872 in his famous "Erlangen program".
Bibliography
Aitken, A.C. (1956). Determinantes y Matrices. Editorial Dossat, 1965. The main reference text on the subject. A classic.
Bell, E.T. Gemelos Invariantes. Cayley y Silvester. Chapter 21 of Los Grandes Matemáticos. Desde Zenón a Poincaré. Su vida y sus obras. Editorial Losada, Buenos Aires, 1948.
Birkhoff, G. and MacLane, S. (1957). A Survey of Modern Algebra. The Macmillan Company. One of the classic textbooks, written by two first-rate mathematicians.
Cayley, Arthur. A memoir on the theory of matrices. Philosophical Transactions of the Royal Society of London 148: 17-37, 1858.
Crilly, Tony. Arthur Cayley: Mathematician Laureate of the Victorian Age. Johns Hopkins University Press, 2006.
Finkbeiner, D.T. (1960). Matrices and Linear Transformations. W.H. Freeman and Company. A standard textbook on linear algebra, comparable to Halmos (1958) in the topics it covers. Includes good examples and exercises.
Golub, G.H. and van Loan, C.F. (1996). Matrix Computations. The Johns Hopkins University Press. A standard textbook on the subject.
Halmos, P.R. (1958). Finite-Dimensional Vector Spaces. D. van Nostrand Company, Inc. Despite its age, one of the best books on finite-dimensional vector spaces. Written in a simple, colloquial style.
Horn, R.A. and Johnson, C.R. (1999). Matrix Analysis. Cambridge University Press. One of the standard references on matrix theory. A good companion to Golub & van Loan (1996).
Mirsky, L. (1990). An Introduction to Linear Algebra. Dover Publications, Inc. Contains a good chapter on unitary and orthogonal matrices.
Parshall, Karen Hunger. James Joseph Sylvester, Jewish Mathematician in a Victorian World. Johns Hopkins University Press, 2006.
Roman, S. (1992). Advanced Linear Algebra. Springer-Verlag. Includes an excellent chapter on eigenvalues and eigenvectors.
Schwarz, H.R.; Rutishauser, H. and Stiefel, E. (1973). Numerical Analysis of Symmetric Matrices. Prentice-Hall, Inc. Despite its title, an excellent introduction to vector spaces and linear algebra.
Wilkinson, J.H. (1965). The Algebraic Eigenvalue Problem. Clarendon Press. Considered the Bible of eigenvalue computation.
Zhang, F. (1999). Matrix Theory: Basic Results and Techniques. Springer. A modern standard reference on matrices. Contains a good chapter on Hermitian matrices.