
Thursday, March 14, 2019

Matrices


A matrix is an array of numbers arranged in $m$ rows and $n$ columns. When $m=n$, the matrix is termed square. The position of an element in a matrix is notated using the indices $i$ and $j$ for the row and column, respectively. For example:

$$\mathbf{B}=
\begin{bmatrix}
b_{11} & b_{12} \\
b_{21} & b_{22} \\
\end{bmatrix}.
$$

The algebra of matrices is straightforward; for example, addition occurs by adding elements with the same indices:

$$\mathbf{B}+\mathbf{C} =
\begin{bmatrix}
b_{11}+c_{11} & b_{12}+c_{12} \\
b_{21}+c_{21} & b_{22}+c_{22} \\
\end{bmatrix}
.$$

We can write the addition operation in more compact form using index notation, e.g., $(\mathbf{B}+\mathbf{C})_{ij}=b_{ij}+c_{ij}$. Keep in mind that the dimensions of the matrices must be the same for addition or subtraction operations.
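As a quick numerical sketch of the addition rule (I'm using NumPy here as my own choice of tool; the matrix entries are made-up examples):

```python
import numpy as np

# Two matrices with the same dimensions (2x2), as required for addition
B = np.array([[1, 2],
              [3, 4]])
C = np.array([[5, 6],
              [7, 8]])

# Element-wise addition: each entry of the result is b_ij + c_ij
print((B + C).tolist())  # [[6, 8], [10, 12]]

# Subtraction works the same way and has the same dimension requirement
print((B - C).tolist())  # [[-4, -4], [-4, -4]]
```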

Multiplication of a matrix by a scalar quantity is simply element-wise: $S\cdot\mathbf{B}$ scales every entry of $\mathbf{B}$ by $S$. When multiplying two matrices, the inner dimensions must agree; for example, if $\mathbf{A}$ is $m \times n$, then $\mathbf{B}$ must be $n \times p$. In other words, the number of columns in $\mathbf{A}$ must equal the number of rows in $\mathbf{B}$. The product $\mathbf{C}=\mathbf{A}\mathbf{B}$ is written in compact form as follows:

$$ c_{jk} = \sum_{i=1}^{n} a_{ji}b_{ik}.$$

Here are some examples: $c_{11} = a_{11}b_{11}+a_{12}b_{21}$ and $c_{12} = a_{11}b_{12} + a_{12}b_{22}$. An important transformation of matrices is the transpose, which is the process of switching the rows and columns. The transpose is indicated with a superscript capital "T", e.g., $\mathbf{B}^T$. For a square matrix, the diagonal components are commonly referred to as the principal terms, and their sum is the trace of the matrix. Square matrices that have ones as diagonal terms and zeros as off-diagonal terms are called unit or identity matrices, for example:

$$\mathbf{I}=\begin{bmatrix}
1 & 0 & 0 \\
0 & 1 & 0 \\
0 & 0 & 1 \\
\end{bmatrix}.$$
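To tie the last few ideas together, here is a short numerical sketch of the product rule, the transpose, the trace, and the identity matrix (again using NumPy with illustrative matrices of my own):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Matrix product: c_jk = sum_i a_ji * b_ik (inner dimensions must agree)
C = A @ B
print(C.tolist())  # [[19, 22], [43, 50]]

# Transpose: rows and columns are switched
print(A.T.tolist())  # [[1, 3], [2, 4]]

# Trace: the sum of the principal (diagonal) terms
print(np.trace(A))  # 5

# Multiplying by the identity matrix leaves a matrix unchanged
I = np.eye(2, dtype=int)
print((A @ I).tolist())  # [[1, 2], [3, 4]]
```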

Identity matrices are commonly represented with $\mathbf{I}$. Another important property of square matrices is invertibility. A square matrix $\mathbf{A}$ is said to have an inverse if there exists a matrix $\mathbf{B}$ satisfying the following condition:

$$\mathbf{A}\mathbf{B} = \mathbf{I},$$

in which case we call $\mathbf{B}$ the inverse of $\mathbf{A}$, often written $\mathbf{A}^{-1}$. If a square matrix doesn't have an inverse, it is referred to as singular. Matrices whose inverse is simply their transpose, i.e., that satisfy the condition:

$$\mathbf{A}^{T}\mathbf{A}=\mathbf{I},$$

are called orthogonal matrices. A determinant is a scalar value that can be computed for any square matrix and can be thought of as the factor by which the corresponding linear transformation scales volumes. The determinant is typically written as:

$$\det \mathbf{B} = \Delta \mathbf{B} = |\mathbf{B}|.$$
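Before moving on to how determinants are computed, the invertibility and orthogonality conditions above can be checked numerically. This is a sketch using NumPy; the matrices below are my own examples:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])

# B is the inverse of A when A @ B recovers the identity matrix
B = np.linalg.inv(A)
print(np.allclose(A @ B, np.eye(2)))  # True

# A 2D rotation matrix is orthogonal: its transpose is its inverse
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
print(np.allclose(R.T @ R, np.eye(2)))  # True
```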

For $2 \times 2$ matrices the determinant is found by taking the products along the two diagonals and subtracting one from the other, as shown below:

$$|\mathbf{B}| =
\begin{vmatrix}
b_{11} & b_{12} \\
b_{21} & b_{22} \\
\end{vmatrix} = b_{11}b_{22}-b_{21}b_{12}.$$
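The $2 \times 2$ rule is easy to verify by hand against a library routine (a NumPy sketch with made-up entries):

```python
import numpy as np

B = np.array([[3.0, 8.0],
              [4.0, 6.0]])

# By the 2x2 formula: b11*b22 - b21*b12
det_by_hand = B[0, 0] * B[1, 1] - B[1, 0] * B[0, 1]
print(det_by_hand)  # -14.0

# The library value agrees, up to floating-point rounding
print(np.isclose(np.linalg.det(B), det_by_hand))  # True
```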

Another method for finding the determinant, useful for larger matrices, is the Laplace (cofactor) expansion. Determinants have the following useful properties:

$$|\mathbf{A}\mathbf{B}| = |\mathbf{A}||\mathbf{B}|,$$
$$|\mathbf{A}| = |\mathbf{A}^{T}|.$$
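Both properties are easy to spot-check numerically (a NumPy sketch with arbitrary example matrices):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[2.0, 0.0],
              [1.0, 3.0]])

# The determinant is multiplicative: det(AB) = det(A) det(B)
print(np.isclose(np.linalg.det(A @ B),
                 np.linalg.det(A) * np.linalg.det(B)))  # True

# The determinant is unchanged by transposition: det(A) = det(A^T)
print(np.isclose(np.linalg.det(A), np.linalg.det(A.T)))  # True
```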

A singular matrix has a determinant of zero, so computing the determinant is a convenient way to check whether a matrix has an inverse.
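As a sketch of the singular case (NumPy again, with a deliberately rank-deficient matrix of my own construction):

```python
import numpy as np

# The second row is twice the first, so the rows are linearly dependent
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# A singular matrix has determinant zero...
print(np.isclose(np.linalg.det(S), 0.0))  # True

# ...and attempting to invert it raises an error
try:
    np.linalg.inv(S)
except np.linalg.LinAlgError:
    print("S is singular and has no inverse")
```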

Matrices and determinants are an important component of the field of linear algebra, so having a good command of them is a serious advantage. I know there are a lot of keywords in this post that were not given a formal mathematical description, but I strongly suggest reading further to get a better sense of their use in linear algebra.

For this post we will provide a quote from a mathematician who did a lot of research in algebra and group theory.

"We [Kaplansky and Halmos] share a philosophy about linear algebra: we think basis-free, we write basis-free, but when the chips are down we close the office door and compute with matrices like fury."
-Irving Kaplansky, Paul Halmos: Celebrating 50 Years of Mathematics

