Quiz Chapter 07: LU Decomposition Method

MULTIPLE CHOICE TEST
LU DECOMPOSITION (More on LU Decomposition)

Pick the most appropriate answer.

1. The LU decomposition method is computationally more efficient than Naïve Gauss elimination for solving

(A) a single set of simultaneous linear equations
(B) multiple sets of simultaneous linear equations with different coefficient matrices and the same right-hand side vector
(C) multiple sets of simultaneous linear equations with the same coefficient matrix and different right-hand side vectors
(D) less than ten simultaneous linear equations

2. The lower triangular matrix \left[ L \right] in the \left[ L \right]\left[ U \right] decomposition of the matrix given below

\begin{bmatrix} 25 & 5 & 4 \\ 10 & 8 & 16 \\ 8 & 12 & 22 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{bmatrix} \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix}

is

(A) \begin{bmatrix} 1 & 0 & 0 \\ 0.40000 & 1 & 0 \\ 0.32000 & 1.7333 & 1 \end{bmatrix}

(B) \begin{bmatrix} 25 & 5 & 4 \\ 0 & 6 & 14.400 \\ 0 & 0 & -4.2400 \end{bmatrix}

(C) \begin{bmatrix} 1 & 0 & 0 \\ 10 & 1 & 0 \\ 8 & 12 & 0 \end{bmatrix}

(D) \begin{bmatrix} 1 & 0 & 0 \\ 0.40000 & 1 & 0 \\ 0.32000 & 1.5000 & 1 \end{bmatrix}

3. The upper triangular matrix \left[ U \right] in the \left[ L \right]\left[ U \right] decomposition of the matrix given below

\begin{bmatrix} 25 & 5 & 4 \\ 10 & 8 & 16 \\ 8 & 12 & 22 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{bmatrix} \begin{bmatrix} u_{11} & u_{12} & u_{13} \\ 0 & u_{22} & u_{23} \\ 0 & 0 & u_{33} \end{bmatrix}

is

(A) \begin{bmatrix} 1 & 0 & 0 \\ 0.40000 & 1 & 0 \\ 0.32000 & 1.7333 & 1 \end{bmatrix}

(B) \begin{bmatrix} 25 & 5 & 4 \\ 0 & 6 & 14.400 \\ 0 & 0 & -4.2400 \end{bmatrix}

(C) \begin{bmatrix} 25 & 5 & 4 \\ 0 & 8 & 16 \\ 0 & 0 & -2 \end{bmatrix}

(D) \begin{bmatrix} 1 & 0.2000 & 0.16000 \\ 0 & 1 & 2.4000 \\ 0 & 0 & -4.240 \end{bmatrix}

4. For a given 2000 \times 2000 matrix \left[ A \right], assume that it takes about 15 seconds to find the inverse of \left[ A \right] by the \left[ L \right]\left[ U \right] decomposition method, that is, finding the \left[ L \right]\left[ U \right] decomposition once, and then doing forward substitution and back substitution 2000 times, using the 2000 columns of the identity matrix as the right-hand side vectors. The approximate time, in seconds, that it will take to find the inverse by repeated use of the Naïve Gauss elimination method, that is, doing forward elimination and back substitution 2000 times using the 2000 columns of the identity matrix as the right-hand side vectors, is

(A) 300
(B) 1500
(C) 7500
(D) 30000

5. The algorithm for solving the set of n equations \left[ A \right]\left[ X \right] = \left[ C \right], where \left[ A \right] = \left[ L \right]\left[ U \right], involves solving \left[ L \right]\left[ Z \right] = \left[ C \right] by forward substitution. The algorithm to solve \left[ L \right]\left[ Z \right] = \left[ C \right] is given by

(A)
z_{1} = \tfrac{c_{1}}{l_{11}}
for i from 2 to n do
    sum = 0
    for j from 1 to i do
        sum = sum + l_{ij} \, * \, z_{j}
    end do
    z_{i} = ( c_{i} - sum ) / l_{ii}
end do

(B)
z_{1} = \tfrac{c_{1}}{l_{11}}
for i from 2 to n do
    sum = 0
    for j from 1 to \left( i-1 \right) do
        sum = sum + l_{ij} \, * \, z_{j}
    end do
    z_{i} = ( c_{i} - sum ) / l_{ii}
end do

(C)
z_{1} = \tfrac{c_{1}}{l_{11}}
for i from 2 to n do
    for j from 1 to \left( i-1 \right) do
        sum = sum + l_{ij} \, * \, z_{j}
    end do
    z_{i} = ( c_{i} - sum ) / l_{ii}
end do

(D)
for i from 2 to n do
    sum = 0
    for j from 1 to \left( i-1 \right) do
        sum = sum + l_{ij} \, * \, z_{j}
    end do
    z_{i} = ( c_{i} - sum ) / l_{ii}
end do
6. To solve boundary value problems, finite difference methods are used, resulting in simultaneous linear equations with tridiagonal coefficient matrices. These are solved using a specialized \left[ L \right]\left[ U \right] decomposition method. Using the finite difference method with a second-order accurate central divided difference approximation and a step size of h = 4, the set of equations in matrix form with a tridiagonal coefficient matrix for

\dfrac{d^{2}y}{dx^{2}} = 6x - 0.5x^{2}, \quad y\left( 0 \right) = 0, \quad y\left( 12 \right) = 0

is

(A) \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0.0625 & 0.125 & 0.0625 & 0 \\ 0 & 0.0625 & 0.125 & 0.0625 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \\ y_{4} \end{bmatrix} = \begin{bmatrix} 0 \\ 16.0 \\ 16.0 \\ 0 \end{bmatrix}

(B) \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0.0625 & -0.125 & 0.0625 & 0 \\ 0 & 0.0625 & -0.125 & 0.0625 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \\ y_{4} \end{bmatrix} = \begin{bmatrix} 0 \\ 16.0 \\ 16.0 \\ 0 \end{bmatrix}

(C) \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0.0625 & -0.125 & 0.0625 & 0 \\ 0 & 0.0625 & -0.125 & 0.0625 \end{bmatrix} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \\ y_{4} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 16.0 \\ 16.0 \end{bmatrix}

(D) \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \\ 0.0625 & 0.125 & 0.0625 & 0 \\ 0 & 0.0625 & 0.125 & 0.0625 \end{bmatrix} \begin{bmatrix} y_{1} \\ y_{2} \\ y_{3} \\ y_{4} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 16.0 \\ 16.0 \end{bmatrix}
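A quick way to check the \left[ L \right] and \left[ U \right] candidates in questions 2 and 3 is to carry out the Doolittle factorization (unit diagonal on \left[ L \right]) numerically. The following is a minimal Python sketch, assuming no pivoting is needed for this matrix; the function name lu_doolittle and the use of NumPy are illustrative choices, not part of the original quiz.

```python
import numpy as np

def lu_doolittle(A):
    """Doolittle LU factorization without pivoting: A = L @ U with a unit diagonal on L."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float)
    for k in range(n - 1):                        # pivot column
        for i in range(k + 1, n):                 # rows below the pivot
            L[i, k] = U[i, k] / U[k, k]           # multiplier l_ik = u_ik / u_kk
            U[i, k + 1:] -= L[i, k] * U[k, k + 1:]  # update the rest of row i
            U[i, k] = 0.0                         # entry below the pivot is eliminated
    return L, U

A = np.array([[25.0,  5.0,  4.0],
              [10.0,  8.0, 16.0],
              [ 8.0, 12.0, 22.0]])
L, U = lu_doolittle(A)
print(L)                      # compare with the [L] choices in question 2
print(U)                      # compare with the [U] choices in question 3
print(np.allclose(L @ U, A))  # sanity check: True if the factorization reproduces A
```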
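The options in question 5 differ only in whether z_{1} and the running sum are initialized and in how far the inner loop runs. A minimal Python sketch of forward substitution, z_{i} = \left( c_{i} - \sum_{j=1}^{i-1} l_{ij} z_{j} \right) / l_{ii}, is given below; the function name and the test values are hypothetical, chosen only to illustrate the loop structure.

```python
def forward_substitution(L, c):
    """Solve [L][Z] = [C] by forward substitution (L is lower triangular).
    Note: Python indices are 0-based, while the quiz pseudocode is 1-based."""
    n = len(c)
    z = [0.0] * n
    z[0] = c[0] / L[0][0]            # z_1 = c_1 / l_11
    for i in range(1, n):            # i = 2 .. n in the 1-based pseudocode
        s = 0.0                      # the running sum must be reset for every row
        for j in range(i):           # j = 1 .. (i - 1): entries strictly below the diagonal
            s += L[i][j] * z[j]
        z[i] = (c[i] - s) / L[i][i]
    return z

# Illustrative use with a unit lower triangular L; the right-hand side is arbitrary.
L = [[1.0,  0.0,    0.0],
     [0.4,  1.0,    0.0],
     [0.32, 1.7333, 1.0]]
c = [1.0, 2.0, 3.0]
print(forward_substitution(L, c))
```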
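For question 6, the interior rows come from the second-order central divided difference approximation \dfrac{d^{2}y}{dx^{2}} \approx \dfrac{y_{i-1} - 2y_{i} + y_{i+1}}{h^{2}} written at the interior nodes, with each boundary condition occupying its own row. The sketch below assembles such a system in Python; the helper name assemble_bvp and the node ordering y_{1} at x = 0 through y_{4} at x = 12 are assumptions made for illustration.

```python
import numpy as np

def assemble_bvp(f, x0, xn, y0, yn, h):
    """Build the finite difference system for y'' = f(x), y(x0) = y0, y(xn) = yn,
    using the second-order central difference (y[i-1] - 2*y[i] + y[i+1]) / h**2."""
    x = np.arange(x0, xn + h / 2, h)     # nodes x_1, ..., x_n
    n = len(x)
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = 1.0                        # first row: boundary condition y(x0) = y0
    b[0] = y0
    A[-1, -1] = 1.0                      # last row: boundary condition y(xn) = yn
    b[-1] = yn
    for i in range(1, n - 1):            # interior nodes: central difference row
        A[i, i - 1] = 1.0 / h**2
        A[i, i] = -2.0 / h**2
        A[i, i + 1] = 1.0 / h**2
        b[i] = f(x[i])
    return A, b

A, b = assemble_bvp(lambda x: 6.0 * x - 0.5 * x**2,
                    x0=0.0, xn=12.0, y0=0.0, yn=0.0, h=4.0)
print(A)   # compare the tridiagonal rows and their signs with the choices above
print(b)   # right-hand side vector
```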