There is a surprising result involving matrices associated
with this elimination process. Introduce the
upper triangular matrix U
which resulted from the elimination process. Then
introduce the lower triangular matrix L,
which uses the multipliers introduced in the elimination process.
In general, when the process of Gaussian elimination
without pivoting is applied to solving a linear system
Ax = b, we obtain A = LU with L and U constructed as above.
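As a concrete illustration of this result, here is a sketch in Python (using NumPy, which these notes do not otherwise assume): the elimination is carried out on A while the multipliers are stored, and the stored multipliers form L.

```python
import numpy as np

def lu_no_pivot(A):
    """Gaussian elimination without pivoting, storing the multipliers.

    Returns L (unit lower triangular) and U (upper triangular)
    with A = L @ U. Assumes no zero pivot is encountered.
    """
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.eye(n)
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = U[i, k] / U[k, k]      # multiplier from the elimination step
            L[i, k] = m                # multipliers fill L below the diagonal
            U[i, k:] -= m * U[k, k:]   # eliminate entry (i, k)
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = lu_no_pivot(A)
print(np.allclose(L @ U, A))  # True
```

The matrix A here is made-up illustrative data chosen so that no pivoting is needed.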
For the case in which partial pivoting is used, we obtain
the slightly modified result
LU = P A
where L and U are constructed as before and P is a
permutation matrix.
The matrix P A is obtained from A by interchanging
rows of A. The result LU = P A means that the LU-
factorization is valid for the matrix A with its rows suitably interchanged.
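A sketch of the pivoted version, again assuming NumPy for illustration: the row interchanges are recorded in a permutation, and the resulting L and U satisfy LU = PA.

```python
import numpy as np

def lu_partial_pivot(A):
    """Gaussian elimination with partial pivoting.

    Returns L, U, P with L @ U = P @ A, where P is the
    permutation matrix recording the row interchanges.
    """
    n = A.shape[0]
    U = A.astype(float).copy()
    L = np.zeros((n, n))
    perm = np.arange(n)                        # tracks the row order
    for k in range(n - 1):
        p = k + np.argmax(np.abs(U[k:, k]))    # row with the largest pivot
        if p != k:                             # interchange rows of U, L, perm
            U[[k, p], :] = U[[p, k], :]
            L[[k, p], :k] = L[[p, k], :k]
            perm[[k, p]] = perm[[p, k]]
        for i in range(k + 1, n):
            m = U[i, k] / U[k, k]
            L[i, k] = m
            U[i, k:] -= m * U[k, k:]
    L += np.eye(n)
    P = np.eye(n)[perm]                        # P A reorders the rows of A
    return L, U, P

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 10.0]])
L, U, P = lu_partial_pivot(A)
print(np.allclose(L @ U, P @ A))  # True
```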
Consequences: If we have a factorization
A = LU
with L lower triangular and U upper triangular, then
we can solve the linear system Ax = b in a relatively cheap way.
The linear system can be written as
LU x = b
Write this as a two-stage process:
Lg = b, U x = g
The system Lg = b is a lower triangular system
We solve it by “forward substitution”. Then we solve
the upper triangular system U x = g by back substitution.
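The two-stage solve can be sketched as follows (a Python sketch assuming NumPy; the small L, U, and b are made-up illustrative data):

```python
import numpy as np

def forward_sub(L, b):
    """Solve L g = b for lower triangular L by forward substitution."""
    n = len(b)
    g = np.zeros(n)
    for i in range(n):
        g[i] = (b[i] - L[i, :i] @ g[:i]) / L[i, i]
    return g

def back_sub(U, g):
    """Solve U x = g for upper triangular U by back substitution."""
    n = len(g)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (g[i] - U[i, i+1:] @ x[i+1:]) / U[i, i]
    return x

# Solve A x = b given a factorization A = L U.
L = np.array([[1.0, 0.0], [2.0, 1.0]])
U = np.array([[2.0, 1.0], [0.0, 3.0]])
b = np.array([3.0, 9.0])
g = forward_sub(L, b)   # stage 1: L g = b
x = back_sub(U, g)      # stage 2: U x = g
print(np.allclose(L @ U @ x, b))  # True
```

Each stage costs only about n^2/2 operations, which is why factoring once and re-solving for many right sides b is attractive.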
VARIANTS OF GAUSSIAN ELIMINATION
If no partial pivoting is needed, then we can look for
A = LU
without going through the Gaussian elimination process.
For example, suppose A is 4 × 4. We write A = LU,
with L unit lower triangular and U upper triangular.
To find the elements, we multiply out
the right side matrices L and U and match the results
with the corresponding elements in A.
Multiplying the first row of L times all of the columns
of U leads to
u_{1j} = a_{1j}, j = 1, 2, 3, 4.
Then multiplying rows 2, 3, 4 of L times the first column
of U yields
l_{i1} u_{11} = a_{i1},
and we can solve for l_{i1} = a_{i1}/u_{11}, i = 2, 3, 4.
We can continue
this process, finding the second row of U and
then the second column of L, and so on. For example,
to solve for u_{44},
we need to solve for it in
a_{44} = l_{41} u_{14} + l_{42} u_{24} + l_{43} u_{34} + u_{44}.
Why do this? A hint of an answer is given by this
last equation. If we had an n × n matrix A, then we
would find u_{nn} by solving for it in the equation
a_{nn} = l_{n1} u_{1n} + l_{n2} u_{2n} + · · · + l_{n,n-1} u_{n-1,n} + u_{nn}.
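The compact scheme just described can be sketched as follows (a Doolittle-style factorization in Python with NumPy, assumed here for illustration). Each entry of L and U is obtained from a single dot product, matching the formulas above.

```python
import numpy as np

def doolittle(A):
    """Compact (Doolittle) LU factorization without pivoting.

    Row i of U and then column i of L are found directly by
    matching entries of A against the product L U; each entry
    comes from one dot product.
    """
    n = A.shape[0]
    L = np.eye(n)
    U = np.zeros((n, n))
    for i in range(n):
        for j in range(i, n):          # row i of U
            U[i, j] = A[i, j] - L[i, :i] @ U[:i, j]
        for j in range(i + 1, n):      # column i of L
            L[j, i] = (A[j, i] - L[j, :i] @ U[:i, i]) / U[i, i]
    return L, U

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
L, U = doolittle(A)
print(np.allclose(L @ U, A))  # True
```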
Embedded in this formula we have a dot product. This
is in fact typical of this process, with the length of the
inner products varying from one position to another.
Recalling §2.4 and the discussion of dot products, we
can evaluate this last formula using higher precision arithmetic and thus avoid many rounding errors.
This leads to a variant of Gaussian elimination
in which there are far fewer rounding errors.
With ordinary Gaussian elimination, the number of
rounding errors is proportional to n^3. This variant reduces
the number of rounding errors, with the number now
being proportional to only n^2. This can lead to major
increases in accuracy, especially for matrices A which
are very sensitive to small changes.
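As an illustration of why accumulating the dot products carefully matters, here is a Python sketch; math.fsum, which computes a correctly rounded sum, plays the role of the higher precision accumulation, though the text has extended precision hardware arithmetic in mind.

```python
import math

# A naive left-to-right sum commits a rounding error at every
# addition; math.fsum returns the correctly rounded result.
vals = [1e16, 1.0, -1e16, 1.0]

naive = 0.0
for v in vals:
    naive += v            # the 1.0 is lost next to 1e16

accurate = math.fsum(vals)

print(naive)     # 1.0
print(accurate)  # 2.0
```

In the compact factorization, each entry of L and U is one such sum, so accumulating it accurately removes most of the rounding error of the factorization.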