** Next:** Computing the inverse of
** Up:** Linear Systems of Algebraic
** Previous:** Singularity
** Contents**

It is often the case that we need to solve consecutively several linear systems that share the same coefficient matrix $A$:

$$A x^{(1)} = b^{(1)}, \quad A x^{(2)} = b^{(2)}, \quad \ldots$$

We can save substantial amounts of CPU time by computing the LU decomposition of the matrix once, and then re-using it to obtain the solutions for the different right-hand sides.
In the standard solution process, we applied to the right-hand vector $b$ all the permutations and row operations applied to $A$, then solved the upper triangular system $Ux = c$.

To re-use the decomposition, we need to apply the permutations and row operations (stored in $P$ and $L$, respectively) to the new right-hand vector $d$; that is, compute $c = L^{-1} P d$, then solve the upper triangular system $Ux = c$.

Now, all permutations are stored in $P$ and all row operations in $L$, hence we can use this information directly. In matrix language, $PA = LU$, so solving $Ax = d$ amounts to solving $LUx = Pd$.
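The factor-once, solve-many pattern above can be sketched in plain Python. This is a minimal illustration, not the text's own code; the helper names `lu_factor` and `lu_solve` are my own, and the sketch assumes a nonsingular matrix.

```python
def lu_factor(A):
    """LU factorization with partial pivoting, P A = L U.

    Returns (LU, piv): LU holds U on and above the diagonal and the
    multipliers of L below it; piv[i] is the original row index that
    ends up in position i. Assumes A is square and nonsingular.
    """
    n = len(A)
    LU = [row[:] for row in A]          # work on a copy
    piv = list(range(n))
    for k in range(n):
        # pick the largest-magnitude pivot in column k
        p = max(range(k, n), key=lambda i: abs(LU[i][k]))
        if p != k:
            LU[k], LU[p] = LU[p], LU[k]
            piv[k], piv[p] = piv[p], piv[k]
        for i in range(k + 1, n):
            LU[i][k] /= LU[k][k]        # multiplier l_ik, stored in place
            for j in range(k + 1, n):
                LU[i][j] -= LU[i][k] * LU[k][j]
    return LU, piv


def lu_solve(LU, piv, b):
    """Re-use a stored factorization to solve A x = b for a new b."""
    n = len(b)
    x = [b[p] for p in piv]             # apply the permutations: P b
    for i in range(1, n):               # forward substitution: L c = P b
        for j in range(i):
            x[i] -= LU[i][j] * x[j]
    for i in range(n - 1, -1, -1):      # back substitution: U x = c
        for j in range(i + 1, n):
            x[i] -= LU[i][j] * x[j]
        x[i] /= LU[i][i]
    return x
```

The point of this arrangement is exactly the one made in the text: `lu_factor` does the $O(n^3)$ work once, and each subsequent `lu_solve` with a new right-hand side costs only $O(n^2)$.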

For example, consider a system with the same coefficient matrix as the one factored above, but with the right-hand side vector $d$ in place of $b$.
Therefore, we re-use the stored decomposition as follows.

First, interchange rows (elements) 1 and $p_1$ in the right-hand side vector, where $p_1$ is the pivot row recorded at the first elimination step.

Then, multiply the first row by the stored multiplier $l_{21}$ and subtract it from the second row, and by $l_{31}$ and subtract it from the third row, to obtain the partially eliminated vector.

Note that the multipliers $l_{21}$ and $l_{31}$ are exactly the entries now stored in the first column of $L$.
Next, we need to interchange rows 2 and $p_2$; in this example $p_2 = 2$, so there is nothing to do. Multiply the second row by the stored multiplier $l_{32}$ and subtract it from the third row.
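The two elimination steps just described, a row interchange followed by subtracting stored multiples, can be applied to a new right-hand side directly. The sketch below uses made-up pivot and multiplier values, since the worked example's original numbers are not reproduced here:

```python
# Hypothetical stored data standing in for the worked example's values:
n = 3
swap = [2, 1]                  # step k interchanges rows k and swap[k] (0-based)
L = [[1.0,  0.0, 0.0],         # unit lower triangular; the below-diagonal
     [0.5,  1.0, 0.0],         # entries are the stored multipliers l_ij
     [0.25, 0.4, 1.0]]
d = [3.0, 1.0, 2.0]            # new right-hand side

c = d[:]                       # c will become L^{-1} P d
for k in range(n - 1):
    c[k], c[swap[k]] = c[swap[k]], c[k]       # apply the stored interchange
    for i in range(k + 1, n):
        c[i] -= L[i][k] * c[k]                # subtract multiplier times row k
```

After this loop, `c` is the transformed right-hand side ready for back substitution.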

So far we have applied the stored permutations and row transformations to the new right-hand side vector $d$, obtaining $c = L^{-1} P d$. It remains to solve the upper triangular system $Ux = c$.

Back substitution then yields the solution $x$ of the new system, which is easily verified by substituting it back into the original equations.

Adrian Sandu
2001-08-26