Let #A# be a square matrix. To bring #A# into a form where expansion along a row or column is an efficient way to calculate its determinant, we use row reduction. This produces many zeros in the matrix while the determinant changes in a controlled way. As we have seen before, an elementary row operation on a matrix #A# can be viewed as multiplication of #A# from the left by a square matrix, so the product formula for the determinant applies. Some bookkeeping is then needed to record the determinants of the matrices used during the row reduction. As we will see, since #\det(A) = \det(A^\top)#, column operations work just as well.
The effect of elementary row operations on a matrix #A# can be described as follows in terms of matrix multiplication.
- \(R_i\rightarrow \lambda \cdot R_i\;(\lambda \neq 0)\): Multiplication of row #i# of #A# by a number #\lambda # is equivalent to multiplication from the left by the diagonal matrix #D_{i,\lambda }# with #(i,i)#-entry equal to #\lambda # and all other entries on the diagonal equal to #1#.
- \(R_i\rightarrow R_i+\lambda \cdot R_j\;(i\ne j)\): Addition of the multiple #\lambda # of one row #j# to another row #i# is equivalent to multiplication from the left by the matrix #E_{ij,\lambda }#, whose #(i,j)#-entry is equal to #\lambda#, whose diagonal elements are equal to #1#, and all of whose other elements are equal to #0#.
- \(R_i\leftrightarrow R_j\;(i\ne j)\): Interchange of the rows #i# and #j# of #A# is equivalent to multiplying from the left by the permutation matrix #P_{(i,j)}# associated with the transposition #(i,j)#.
\[\begin{array}{r|c|c} \text{matrix}&\text{written out}&\text{left multiplication}\\ \hline D_{3,\lambda}&\matrix{1&0&0\\ 0&1&0\\ 0&0&\lambda}&\phantom{x}\color{blue}{\text{multiplication of row }3\text{ by }\lambda}\\ E_{31,\lambda} &\matrix{1&0&0\\ 0&1&0\\ \lambda&0&1}&\phantom{x}\color{blue}{\text{addition of scalar multiple }\lambda\text{ of row }1\text{ to row }3}\\ P_{(1,3)} &\matrix{0&0&1\\ 0&1&0\\ 1&0&0}&\phantom{x}\color{blue}{\text{interchange of rows }1\text{ and }3}\end{array}\]
We can perform column operations on #A# in the same way as row operations, by multiplying #A# from the right by the matrices #D_{i,\lambda}#, #E_{ij,\lambda}#, #P_{(i,j)}#. This can be seen by using the law \[\left(B\, A\right)^\top = A^\top B^\top \] and noting that \[\begin{array}{rcl} D_{i,\lambda}^\top&=& D_{i,\lambda} \\ E_{ij,\lambda}^\top &=&E_{ji,\lambda}\\ P_{(i,j)}^\top &=&P_{(i,j)} \end{array}\] In particular, right multiplication by #D_{i,\lambda}# multiplies column #i# by #\lambda#, right multiplication by #E_{ij,\lambda}# adds #\lambda# times column #i# to column #j# (note the swap of indices compared with the row operation), and right multiplication by #P_{(i,j)}# interchanges columns #i# and #j#.
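As a quick numerical illustration, the following NumPy sketch (the variable names and the sample matrix are ours, chosen only for illustration) applies the #3\times3# matrices #D_{3,\lambda}#, #E_{31,\lambda}#, #P_{(1,3)}# from the table above from the left and from the right:

```python
import numpy as np

lam = 5.0
D3 = np.diag([1.0, 1.0, lam])        # D_{3,lambda}
E31 = np.eye(3); E31[2, 0] = lam     # E_{31,lambda}
P13 = np.eye(3)[[2, 1, 0]]           # P_{(1,3)}: rows 1 and 3 of the identity interchanged

A = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])

print(D3 @ A)    # row 3 of A multiplied by lambda
print(E31 @ A)   # lambda times row 1 added to row 3
print(P13 @ A)   # rows 1 and 3 interchanged
print(A @ D3)    # column 3 multiplied by lambda
print(A @ E31)   # lambda times column 3 added to column 1 (E_{31,lambda}^T = E_{13,lambda})
print(A @ P13)   # columns 1 and 3 interchanged
```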
We apply this interpretation of row and column operations to compute determinants.
The effect of elementary row or column operations on the determinant of a square matrix #A# is indicated in the table below. If #B# is the matrix from the second column, then the determinant of the result #B\,A# or #A\,B# of the operation is equal to #\det(B)\cdot \det(A)#.
| elementary operation | matrix | determinant |
|---|---|---|
| \(R_i\rightarrow \lambda \cdot R_i\;(\lambda \neq 0)\) | \(D_{i,\lambda }\) | \(\lambda \) |
| \(R_i\rightarrow R_i+\lambda \cdot R_j\;(i\ne j)\) | \(E_{ij,\lambda }\) | \(1\) |
| \(R_i\leftrightarrow R_j\;(i\ne j)\) | \(P_{(i,j)}\) | \(-1\) |
Let #B# be one of the matrices #D_{i,\lambda }#, #E_{ij,\lambda }#, #P_{(i,j)}#. The product formula for the determinant gives #\det(B\,A) = \det(B)\cdot \det(A)#.
According to the definition of the determinant, the determinant of the diagonal matrix \(D_{i,\lambda }\) is the product of the diagonal elements, that is, #\lambda #.
According to Determinants of a few special matrices, the determinant of the triangular matrix \(E_{ij,\lambda }\) is the product of the diagonal elements, that is, #1#.
According to the definition of the determinant, the determinant of the permutation matrix \(P_{(i,j)}\) is the sign of the permutation #(i,j)#, that is, #-1#.
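These three determinant values, and the product formula #\det(B\,A)=\det(B)\cdot\det(A)#, can be spot-checked numerically; the sketch below uses the #3\times3# instances from the earlier table and a sample matrix of our own choosing.

```python
import numpy as np

lam = 5.0
D3 = np.diag([1.0, 1.0, lam])        # D_{3,lambda}, determinant lambda
E31 = np.eye(3); E31[2, 0] = lam     # E_{31,lambda}, determinant 1
P13 = np.eye(3)[[2, 1, 0]]           # P_{(1,3)}, determinant -1

A = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])

for B, expected in [(D3, lam), (E31, 1.0), (P13, -1.0)]:
    assert np.isclose(np.linalg.det(B), expected)
    # product formula: det(B A) = det(B) * det(A)
    assert np.isclose(np.linalg.det(B @ A), expected * np.linalg.det(A))
```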
The determinant #\det(A)# can be calculated from #\det(B\,A)# as the quotient #\frac{\det(B\,A)}{\det(B)}#. Now #B\, A# is a matrix having more zeros than #A#, and #\det(B)# is determined by the row and column operations performed. Thus, some bookkeeping during row and column operations is required in order to find #\det(B)#.
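The bookkeeping can be organised systematically. The routine below is a minimal sketch of our own (not part of the text) that reduces the matrix to upper triangular form using only row interchanges and additions of multiples of rows, tracking the contribution of the interchanges; the determinant is then the tracked sign times the product of the diagonal entries.

```python
import numpy as np

def det_by_row_reduction(A):
    """Determinant via row reduction with bookkeeping.

    Only operations R_i -> R_i + lambda*R_j (determinant factor 1) and
    R_i <-> R_j (determinant factor -1) are used, so det(A) equals the
    tracked sign times the product of the pivots of the reduced matrix.
    """
    M = np.array(A, dtype=float)
    n = M.shape[0]
    sign = 1.0                              # (-1)^(number of row interchanges)
    for k in range(n):
        # choose a pivot in column k; an all-zero column below means det(A) = 0
        p = k + np.argmax(np.abs(M[k:, k]))
        if np.isclose(M[p, k], 0.0):
            return 0.0
        if p != k:                          # R_k <-> R_p contributes a factor -1
            M[[k, p]] = M[[p, k]]
            sign = -sign
        for i in range(k + 1, n):           # R_i -> R_i - (M_ik / M_kk) * R_k
            M[i] -= (M[i, k] / M[k, k]) * M[k]
    return sign * np.prod(np.diag(M))

# applied to the matrix of the first worked example below, this returns -30.0
A = np.array([[ 0,  1,  2, -1],
              [ 2,  5, -7,  3],
              [ 0,  3,  6,  2],
              [-2, -5,  4, -2]])
print(det_by_row_reduction(A))
```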
Because #\det (A)=\det (A^\top)#, we may, for the purpose of calculating the determinant of #A#, also perform elementary operations on columns. If we are concerned only with the value of the determinant, it can be useful to alternate row and column operations. Keep in mind, however, that row operations preserve the row space and column operations preserve the column space; if we use both types of operations, neither space of #A# can be read off from the reduced matrix.
We show how the determinant of the matrix \[A=\matrix{0 & 1 & 2 & -1\\
2 & 5 & -7 & 3\\
0 & 3 & 6 & 2\\
-2 & -5 & 4 & -2}\] can be calculated by row or column reduction and expansion along row or column. \[\begin{array}{rcl}
\det (A)&=&\left|\begin{array}{rrrr}
0 & 1 & 2 & -1\\
2 & 5 & -7 & 3\\
0 & 3 & 6 & 2\\
-2 & -5 & 4 & -2
\end{array}\,\right|\\ &&\phantom{xx}\color{blue}{\text{determinant of }A}\\ &=&\ \left|\begin{array}{rrrr}
0 & 1 & 2 & -1\\
2 & 5 & -7 & 3\\
0 & 3 & 6 & 2\\
0 & 0 & -3 & 1
\end{array}\,\right| \\&&\phantom{xx}\color{blue}{\text{the second row added to the fourth: }}\\&&\phantom{xx}\color{blue}{\text{multiplication from the left by }E_{42,1}}\\ &=&
\ -2\cdot\left|\begin{array}{rrr}
1 & 2 & -1\\
3 & 6 & 2\\
0 & -3 &1
\end{array}\,\right|\\&&\phantom{xx}\color{blue}{\text{expanded along the first column}}\\ & =&\ -2\cdot\left|\,\begin{array}{rrr}
1 & 2 & -1\\
0 & 0 & 5\\
0 & -3 & 1
\end{array}\,\right|\\&&\phantom{xx}\color{blue}{\text{three times the first row subtracted from the second: }}\\&&\phantom{xx}\color{blue}{\text{multiplication from the left by }E_{21,-3}}\\
&=&\ 10\cdot\left|\,\begin{array}{rr}
1 & 2\\
0 & -3
\end{array}\,\right|\\ &&\phantom{xx}\color{blue}{\text{expanded along the second row }}\\ &=&\ -30\\ &&\phantom{xx}\color{blue}{\text{the }(2\times 2)\text{-determinant worked out}}\end{array}
\]
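For readers who wish to double-check the steps above numerically, here is a short NumPy sketch (variable names are ours) that reproduces the first reduction step and the final value:

```python
import numpy as np

A = np.array([[ 0,  1,  2, -1],
              [ 2,  5, -7,  3],
              [ 0,  3,  6,  2],
              [-2, -5,  4, -2]], dtype=float)

E42 = np.eye(4); E42[3, 1] = 1.0     # E_{42,1}: adds row 2 to row 4
B = E42 @ A                          # last row becomes (0, 0, -3, 1)

assert np.isclose(np.linalg.det(B), np.linalg.det(A))   # det(E_{42,1}) = 1
assert np.isclose(np.linalg.det(A), -30.0)
```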
We show how the determinant of the matrix \[A=\matrix{2 & 0 & 0 & 8\\
1 & -7 & -5 & 0\\
3 & 8 & 6 & 0\\
0 & 7 & 5 & 4}\]can be calculated by reduction and expansion along row or column.
\[\begin{array}{rcl}
\det (A)&=& \left|\,\begin{array}{rrrr}
2 & 0 & 0 & 8\\
1 & -7 & -5 & 0\\
3 & 8 & 6 & 0\\
0 & 7 & 5 & 4
\end{array}\,\right|\\&&\phantom{xx}\color{blue}{\text{the determinant of }A}\\ &=&\ 2\left|\,\begin{array}{rrrr}
1 & 0 & 0 & 4\\
1 & -7 &-5 & 0\\
3 & 8 & 6 & 0\\
0 & 7 & 5 & 4
\end{array}\,\right|\\ &&\phantom{xx}\color{blue}{\text{factor 2 pulled out of the first row: multiplication from the left by }D_{1,\frac{1}{2}}}\\
& =&\ 8\left|\,\begin{array}{rrrr}
1 & 0 & 0 & 1\\
1 & -7 & -5 & 0\\
3 & 8 & 6 & 0\\
0 & 7 & 5 & 1
\end{array}\,\right|\\&&\phantom{xx}\color{blue}{\text{factor 4 pulled out of fourth column: multiplication from the right by }D_{4,\frac{1}{4}}}\\ & =&\ 8\left|\,\begin{array}{rrrr}
1 & 0 & 0 & 1\\
1 & -7 & -5 & 0\\
3 & 8 & 6 & 0\\
-1 & 7 & 5 & 0
\end{array}\,\right|\\&&\phantom{xx}\color{blue}{\text{row 1 subtracted from row 4: multiplication from the left by }E_{41,-1}}\\
& =&-8 \left|\,\begin{array}{rrr}
1 & -7 & -5\\
3 & 8 & 6\\
-1 & 7 & 5
\end{array}\,\right|\\ &&\phantom{xx}\color{blue}{\text{expanded along the last column}}\\ &=& 0\\ &&\phantom{xx}\color{blue}{\text{first and third row are dependent}}\\
\end{array}
\]
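A quick numerical check of this second example (again a sketch with names of our own) confirms both the reduction to the #3\times3# minor and the value #0#:

```python
import numpy as np

A = np.array([[2,  0,  0, 8],
              [1, -7, -5, 0],
              [3,  8,  6, 0],
              [0,  7,  5, 4]], dtype=float)

M = np.array([[ 1, -7, -5],
              [ 3,  8,  6],
              [-1,  7,  5]], dtype=float)   # the 3x3 determinant reached above

assert np.isclose(np.linalg.det(A), -8 * np.linalg.det(M))
assert np.isclose(np.linalg.det(A), 0.0)
```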
Let \(A\) and #C# be the following \((3\times4)\)-matrices: \[A=\matrix{0 & -1 & -5 & 9 \\ 0 & -1 & -9 & 9 \\ -6 & 8 & -3 & 3 \\ }\qquad\text{and}\qquad C = \matrix{0 & 1 & 13 & -9 \\ 0 & -1 & -9 & 9 \\ -6 & 8 & -3 & 3 \\ }\] By what type of elementary operation is #C# obtained from #A#?
Addition of a multiple of one row to another
In formula form, the operation is \(R_1\rightarrow R_1-2\cdot R_2\) . This is the same as multiplying the matrix #A# from the left by \[E_{1 2,-2} =\matrix{1 & -2 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ }\]
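The answer can also be confirmed by carrying out the multiplication, for instance with the following quick NumPy check (names are ours):

```python
import numpy as np

A = np.array([[ 0, -1, -5, 9],
              [ 0, -1, -9, 9],
              [-6,  8, -3, 3]], dtype=float)
C = np.array([[ 0,  1, 13, -9],
              [ 0, -1, -9,  9],
              [-6,  8, -3,  3]], dtype=float)

E = np.array([[1, -2, 0],
              [0,  1, 0],
              [0,  0, 1]], dtype=float)      # E_{12,-2}

assert np.allclose(E @ A, C)                 # R_1 -> R_1 - 2*R_2
```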