We exploit the fact that row transformations do not change the solution of the system and transform the system successively as follows.
In matrix notation, multiplying the first row by the multiplier $l_{21} = a_{21}/a_{11}$ and subtracting it from the second row leads to
$$
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a'_{22} & a'_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix},
\qquad a'_{2j} = a_{2j} - l_{21} a_{1j} .
$$
We store the multiplier as follows. Take a matrix $L$, initially equal to the identity matrix;
since the multiplier $l_{21}$ is used to
cancel $a_{21}$, we store it into position $(2,1)$ of $L$.
Multiplying the first row by $l_{31} = a_{31}/a_{11}$ and subtracting it from the third gives
$$
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a'_{22} & a'_{23} \\ 0 & a'_{32} & a'_{33} \end{pmatrix},
\qquad a'_{3j} = a_{3j} - l_{31} a_{1j} ,
$$
and $l_{31}$ is stored into position $(3,1)$ of $L$.
Finally, by multiplying the second row by $l_{32} = a'_{32}/a'_{22}$ and subtracting it from the third (storing $l_{32}$ into position $(3,2)$ of $L$)
we obtain
$$
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a'_{22} & a'_{23} \\ 0 & 0 & a''_{33} \end{pmatrix},
\qquad a''_{33} = a'_{33} - l_{32} a'_{23} .
\qquad (14.6)
$$
Now, all the elements in the sub-diagonal positions of $A$ are zero, and only the elements on and above the main diagonal are left. This transformed $A$ is in ``upper triangular form''; we usually denote it by $U$. At the same time, the matrix of multipliers $L$ has only zero elements above the main diagonal; it is in ``lower triangular form'' ($L$ stands for lower). Note that all the diagonal elements of $L$ are $1$, while the diagonal elements of $U$ can take any values, without any restriction.
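As an illustration (not part of the original text), the elimination just described can be sketched in a few lines of Python. The function below is a minimal sketch that assumes every pivot encountered on the diagonal is nonzero, so no row interchanges are needed; it returns the factors $L$ and $U$ separately.

```python
import numpy as np

def lu_no_pivoting(A):
    """Gaussian elimination without pivoting: returns (L, U) with A = L @ U.

    Assumes every pivot met on the diagonal is nonzero (no row interchanges).
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    U = A.copy()
    L = np.eye(n)
    for k in range(n - 1):            # column being eliminated
        for i in range(k + 1, n):     # rows below the pivot row k
            m = U[i, k] / U[k, k]     # multiplier l_ik
            L[i, k] = m               # store it in the lower triangle of L
            U[i, k:] -= m * U[k, k:]  # row_i := row_i - m * row_k
    return L, U
```

For a $3 \times 3$ matrix this performs exactly the three steps described above, producing the multipliers $l_{21}$, $l_{31}$, $l_{32}$ and the upper triangular matrix of (14.6).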
If we multiply $L$ and $U$ we obtain the original matrix $A$:
$$
LU = \begin{pmatrix} 1 & 0 & 0 \\ l_{21} & 1 & 0 \\ l_{31} & l_{32} & 1 \end{pmatrix}
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ 0 & a'_{22} & a'_{23} \\ 0 & 0 & a''_{33} \end{pmatrix} = A .
\qquad (14.7)
$$
The relation $A = LU$ is called the ``LU decomposition'' of $A$.
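As a quick numerical check of (14.7), with an illustrative pair of factors (not taken from the text), the product of a unit lower triangular $L$ and an upper triangular $U$ does reproduce a full matrix $A$:

```python
import numpy as np

# Illustrative factors (not from the text): multipliers l21 = 2, l31 = 1, l32 = 2
L = np.array([[1., 0., 0.],
              [2., 1., 0.],
              [1., 2., 1.]])
U = np.array([[4., 3., 2.],
              [0., 2., 1.],
              [0., 0., 5.]])

A = L @ U
print(A)
# [[4. 3. 2.]
#  [8. 8. 5.]
#  [4. 7. 9.]]
```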
Since $U$ has zeros below the main diagonal, we can use this space to store
the sub-diagonal elements of $L$; therefore, we can represent the LU decomposition
compactly as
$$
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ l_{21} & a'_{22} & a'_{23} \\ l_{31} & l_{32} & a''_{33} \end{pmatrix} .
$$
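A minimal sketch of this compact storage scheme (again assuming no pivoting is required and a floating-point input matrix) overwrites $A$ in place: the upper triangle ends up holding $U$, and the strictly lower triangle holds the multipliers of $L$.

```python
import numpy as np

def lu_inplace(A):
    """Overwrite A with its compact LU factorization (no pivoting).

    On return, A's upper triangle (diagonal included) holds U and the
    strictly lower triangle holds the multipliers, i.e. L without its unit
    diagonal.  A must be a float array; it is modified in place.
    """
    n = A.shape[0]
    for k in range(n - 1):
        for i in range(k + 1, n):
            A[i, k] /= A[k, k]                       # multiplier l_ik replaces the zeroed a_ik
            A[i, k + 1:] -= A[i, k] * A[k, k + 1:]   # eliminate the rest of row i
    return A
```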