
Archetype F

⬜  Summary   System with four equations, four variables. Nonsingular coefficient matrix. Integer eigenvalues, one has “high” multiplicity.
⬜  Definition  A system of linear equations (Definition SLE).
\begin{align*} 33x_1 - 16x_2 + 10x_3 - 2x_4 &= -27 \\ 99x_1 - 47x_2 + 27x_3 - 7x_4 &= -77 \\ 78x_1 - 36x_2 + 17x_3 - 6x_4 &= -52 \\ -9x_1 + 2x_2 + 3x_3 +4x_4 &= 5 \end{align*}
⬜  Solutions  Some solutions to the system of linear equations, not necessarily exhaustive (Definition SSLE):
\begin{gather*} x_1 = 1,\quad x_2 = 2,\quad x_3 = -2,\quad x_4 = 4 \end{gather*}
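As a quick check, substitute these values into the first equation of the system:
\begin{gather*} 33(1) - 16(2) + 10(-2) - 2(4) = 33 - 32 - 20 - 8 = -27 \end{gather*}
The remaining three equations can be verified by the same direct substitution.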
⬜  Augmented Matrix  Augmented matrix of the linear system of equations (Definition AM):
\begin{equation*} \begin{bmatrix} 33 & -16 & 10 & -2 & -27\\ 99 & -47 & 27 & -7 & -77\\ 78 & -36 & 17 & -6 & -52\\ -9 & 2 & 3 & 4 & 5 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Augmented Matrix  Matrix in reduced row-echelon form, row-equivalent to the augmented matrix. (Definition RREF)
\begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 & 1\\ 0 & \leading{1} & 0 & 0 & 2\\ 0 & 0 & \leading{1} & 0 & -2 \\ 0 & 0 & 0 & \leading{1} & 4 \end{bmatrix} \end{equation*}
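Every column of the coefficient portion of this matrix is a pivot column, so the reduced matrix decodes directly into the equations
\begin{align*} x_1&=1&x_2&=2&x_3&=-2&x_4&=4 \end{align*}
and the system has exactly this one solution.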
⬜  Augmented Matrix Analysis  Analysis of the augmented matrix (Definition RREF).
\begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{5} \end{align*}
⬜  Vector Form of Solutions  Vector form of the solution set to the system of equations (Theorem VFSLS). Notice the relationship between the free variables and the set \(F\) above. Also, notice the pattern of 0's and 1's in the entries of the vectors corresponding to elements of the set \(F\) in the larger examples. Here \(F\) contains only the index of the final column of the augmented matrix, so there are no free variables and the solution set consists of this single vector.
\begin{equation*} \colvector{x_1\\x_2\\x_3\\x_4}=\colvector{1\\2\\-2\\4} \end{equation*}
⬜  Associated Homogeneous System  Given a system of equations we can always build a new, related, homogeneous system (Definition HS) by converting the constant terms to zeros and retaining the coefficients of the variables. Properties of this new system will have precise relationships with various properties of the original system.
\begin{align*} 33x_1 - 16x_2 + 10x_3 - 2x_4 &= 0 \\ 99x_1 - 47x_2 + 27x_3 - 7x_4 &= 0 \\ 78x_1 - 36x_2 + 17x_3 - 6x_4 &= 0 \\ -9x_1 + 2x_2 + 3x_3 +4x_4 &= 0 \end{align*}
⬜  Solutions, Homogeneous System  Some solutions to the associated homogeneous system of linear equations, not necessarily exhaustive (Definition SSLE). Review Theorem HSC as you consider these solutions. Because the coefficient matrix is nonsingular (see below), the trivial solution displayed here is in fact the only solution.
\begin{gather*} x_1 = 0,\quad x_2 = 0,\quad x_3 = 0,\quad x_4=0 \end{gather*}
⬜  Row-Reduced Augmented Matrix, Homogeneous System  Form the augmented matrix of the homogeneous linear system, and use row operations to convert to reduced row-echelon form. Notice how the entries of the final column remain zeros.
\begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 & 0\\ 0 & \leading{1} & 0 & 0 & 0\\ 0 & 0 & \leading{1} & 0 & 0 \\ 0 & 0 & 0 & \leading{1} & 0 \end{bmatrix} \end{equation*}
⬜  Augmented Matrix Analysis, Homogeneous System  Analysis of the augmented matrix for the homogeneous system (Definition RREF). Compare this with the same analysis of the original system, especially in the case where the original system is inconsistent (Theorem RCLS).
\begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{5} \end{align*}
⬜  Coefficient Matrix  For any system of equations we can isolate the coefficient matrix, which will be identical to the coefficient matrix of the associated homogeneous system. For the remainder of the discussion of this system of equations, we will analyze just the coefficient matrix.
\begin{equation*} \begin{bmatrix} 33 & -16 & 10 & -2 \\ 99 & -47 & 27 & -7 \\ 78 & -36 & 17 & -6 \\ -9 & 2 & 3 & 4 \end{bmatrix} \end{equation*}
⬜  Row-Reduced Coefficient Matrix  Row-equivalent matrix in reduced row-echelon form (Definition RREF).
\begin{equation*} \begin{bmatrix} \leading{1} & 0 & 0 & 0 \\ 0 & \leading{1} & 0 & 0 \\ 0 & 0 & \leading{1} & 0 \\ 0 & 0 & 0 & \leading{1} \end{bmatrix} \end{equation*}
⬜  Coefficient Matrix Analysis  Analysis of the reduced row-echelon form of the matrix (Definition RREF). For archetypes that begin as systems of equations, compare this analysis with the analysis of the augmented matrices of the original system and of the associated homogeneous system.
\begin{align*} r&=4&D&=\set{1,\,2,\,3,\,4}&F&=\set{\ } \end{align*}
⬜  Nonsingular Matrix?  Is the matrix nonsingular or singular? Nonsingular. Notice that the row-reduced version of the matrix is the identity matrix and apply Theorem NMRRI. At the same time, examine the sizes of the sets \(D\) and \(F\) from the analysis of the reduced row-echelon version of the matrix.
⬜  Null Space  The null space of the matrix. The set of vectors used in the span construction is a linearly independent set of column vectors that spans the null space of the matrix (Theorem SSNS, Theorem BNS). Solve a homogeneous system with this matrix as the coefficient matrix and write the solutions in vector form (Theorem VFSLS) to see these vectors arise. Compare the entries of these vectors for indices in \(D\) versus entries for indices in \(F\text{.}\)
\begin{equation*} \set{\ } \end{equation*}
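Here the spanning set is empty: since the matrix is nonsingular, the homogeneous system it determines has only the trivial solution, and the null space is the trivial subspace
\begin{equation*} \set{\colvector{0\\0\\0\\0}} \end{equation*}
which is the span of the empty set displayed above.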
⬜  Column Space, Original Columns  The column space of the matrix, expressed as the span of a set of linearly independent vectors that are also columns of the matrix. These columns have indices that form the set \(D\) above (Theorem BCS).
\begin{equation*} \set{\colvector{33\\99\\78\\-9},\,\colvector{-16\\-47\\-36\\2},\,\colvector{10\\27\\17\\3},\,\colvector{-2\\-7\\-6\\4}} \end{equation*}
⬜  Column Space, Extended Echelon Form  The column space of the matrix, as it arises from the extended echelon form of the matrix. The matrix \(L\) is computed as described in Definition EEF. This is followed by the column space described as the span of a set of linearly independent vectors that equals the null space of \(L\text{,}\) computed according to Theorem FS and Theorem BNS. When \(r=m\text{,}\) the matrix \(L\) has no rows and the column space is all of \(\complex{m}\text{.}\)
\begin{equation*} L=\begin{bmatrix}\end{bmatrix} \end{equation*}
\begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Column Space, Row Space of Transpose  The column space of the matrix, expressed as the span of a set of linearly independent vectors. These vectors are computed by bringing the transpose of the matrix into reduced row-echelon form, tossing out the zero rows, and writing the remaining nonzero rows as column vectors. By Theorem CSRST and Theorem BRS, and in the style of Example CSROI, this yields a linearly independent set of vectors that span the column space.
\begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Row Space  Row space of the matrix, expressed as a span of a set of linearly independent vectors, obtained from the nonzero rows of the row-equivalent matrix in reduced row-echelon form. (Theorem BRS)
\begin{equation*} \set{\colvector{1\\0\\0\\0},\,\colvector{0\\1\\0\\0},\,\colvector{0\\0\\1\\0},\,\colvector{0\\0\\0\\1}} \end{equation*}
⬜  Inverse Matrix?  The matrix is nonsingular, and by Theorem NI has an inverse (Definition MI). The inverse can be computed with the procedure in Theorem CINM.
\begin{equation*} \begin{bmatrix} -\frac{86}{3} & \frac{38}{3} & -\frac{11}{3} & \frac{7}{3} \\ -\frac{129}{2} & \frac{86}{3} & -\frac{17}{2} & \frac{31}{6} \\ -13 & 6 & -2 & 1 \\ -\frac{45}{2} & \frac{29}{3} & -\frac{5}{2} & \frac{13}{6} \end{bmatrix} \end{equation*}
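As a spot check of this inverse, the entry in row 1, column 1 of its product with the original coefficient matrix is
\begin{equation*} -\frac{86}{3}(33)+\frac{38}{3}(99)-\frac{11}{3}(78)+\frac{7}{3}(-9)=\frac{-2838+3762-858-63}{3}=\frac{3}{3}=1 \end{equation*}
as required for the identity matrix; the remaining entries can be checked in the same way.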
⬜  Subspace Dimensions  Subspace dimensions associated with the matrix (Definition ROM, Definition NOM). Verify Theorem RPNC.
\begin{align*} \text{rank}&=4&\text{nullity}&=0&\text{columns}&=4 \end{align*}
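For this matrix the verification of Theorem RPNC amounts to
\begin{equation*} \text{rank}+\text{nullity}=4+0=4=\text{columns} \end{equation*}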
⬜  Determinant  Value of the determinant of the matrix. The matrix is nonsingular so the determinant is nonzero (Theorem SMZD). Notice that zero is not an eigenvalue of the matrix (Theorem SMZE).
\begin{equation*} \text{determinant}=-18 \end{equation*}
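As an informal consistency check against the eigenvalues listed below, the determinant equals the product of the eigenvalues (counted with algebraic multiplicity) and the trace equals their sum:
\begin{align*} (-1)(2)(3)(3)&=-18&(-1)+2+3+3&=7=33-47+17+4 \end{align*}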
⬜  Eigenvalues, Eigenspaces  Eigenvalues, and bases for eigenspaces (Definition EEM, Definition EM). Compute a matrix-vector product (Definition MVP) for each eigenvector as an interesting check.
\begin{align*} \eigensystem{F}{-1}{\colvector{1\\2\\0\\1}}\\ \eigensystem{F}{2}{\colvector{2\\5\\2\\1}}\\ \eigensystem{F}{3}{\colvector{1\\1\\0\\7},\,\colvector{17\\45\\21\\0}} \end{align*}
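For instance, for the basis eigenvector given for the eigenvalue \(-1\text{,}\) the matrix-vector product is
\begin{equation*} \begin{bmatrix} 33 & -16 & 10 & -2 \\ 99 & -47 & 27 & -7 \\ 78 & -36 & 17 & -6 \\ -9 & 2 & 3 & 4 \end{bmatrix}\colvector{1\\2\\0\\1}=\colvector{-1\\-2\\0\\-1}=(-1)\colvector{1\\2\\0\\1} \end{equation*}
and the eigenvectors for the eigenvalues \(2\) and \(3\) can be checked in the same way.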
⬜  Eigenvalue Multiplicities  Geometric and algebraic multiplicities (Definition GME, Definition AME).
\begin{align*} \geomult{F}{-1}&=1&\algmult{F}{-1}&=1\\ \geomult{F}{2}&=1&\algmult{F}{2}&=1\\ \geomult{F}{3}&=2&\algmult{F}{3}&=2 \end{align*}
⬜  Diagonalizable  Diagonalizable (Definition DZM)?

Yes. Each eigenspace has dimension equal to the algebraic multiplicity of its eigenvalue, so the eigenspaces are “full” and Theorem DMFE applies.

⬜  Diagonalization  The diagonalization (Theorem DC).
\begin{equation*} \begin{bmatrix}12&-5&1&-1\\-39&18&-7&3\\ \frac{27}{7}&-\frac{13}{7}&\frac{6}{7}&-\frac{1}{7}\\ \frac{26}{7}&-\frac{12}{7}&\frac{5}{7}&-\frac{2}{7} \end{bmatrix}\begin{bmatrix} 33 & -16 & 10 & -2 \\ 99 & -47 & 27 & -7 \\ 78 & -36 & 17 & -6 \\ -9 & 2 & 3 & 4 \end{bmatrix} \begin{bmatrix}1&2&1&17\\2&5&1&45\\0&2&0&21\\1&1&7&0 \end{bmatrix}=\begin{bmatrix}-1&0&0&0\\0&2&0&0\\0&0&3&0\\0&0&0&3 \end{bmatrix} \end{equation*}