The interplay of columns and rows is the heart of linear algebra. It's not totally easy, but it's not too hard. Here are four of the central ideas:
1. The column space (all combinations of the columns).
2. The row space (all combinations of the rows).
3. The rank (the number of independent columns, or rows).
4. Elimination (the good way to find the rank of a matrix).
I will stop here, so you can start the course. - P6
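Idea 4 above says elimination is the good way to find the rank. As a concrete illustration, here is a minimal Python sketch (my own, not from the book) that row-reduces a matrix and counts the nonzero pivots; the names `rank_by_elimination` and `tol` are assumptions for this example.

```python
# Sketch only: rank by elimination -- row-reduce and count nonzero pivots.
def rank_by_elimination(A, tol=1e-12):
    A = [row[:] for row in A]          # work on a copy
    m, n = len(A), len(A[0])
    rank, pivot_row = 0, 0
    for col in range(n):
        # find a row at or below pivot_row with a nonzero entry in this column
        pivot = next((r for r in range(pivot_row, m) if abs(A[r][col]) > tol), None)
        if pivot is None:
            continue                    # no pivot in this column
        A[pivot_row], A[pivot] = A[pivot], A[pivot_row]   # row exchange
        for r in range(pivot_row + 1, m):
            mult = A[r][col] / A[pivot_row][col]
            for c in range(col, n):
                A[r][c] -= mult * A[pivot_row][c]
        pivot_row += 1
        rank += 1
    return rank

# The rank counts independent columns, which equals the number of independent rows.
print(rank_by_elimination([[1, 2, 3],
                           [2, 4, 6],
                           [1, 0, 1]]))   # prints 2 (row 2 is twice row 1)
```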
The Breakdown of Elimination
Under what circumstances could the process break down? Something must go wrong in the singular case, and something might go wrong in the nonsingular case. This may seem a little premature; after all, we have barely got the algorithm working. But the possibility of breakdown sheds light on the method itself. - P13
Section 1.5 will discuss row exchanges when the system is not singular. Then the exchanges produce a full set of pivots. Chapter 2 admits the singular case, and limps forward with elimination. - P13
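To make the two cases concrete, here is a small Python sketch (my illustration, not the book's code). A zero in the pivot position that a row below can repair by exchange is the nonsingular, temporary breakdown; a column with no available pivot at all is the singular case. The function name `forward_eliminate` is assumed for this example.

```python
# Sketch only: forward elimination that exchanges rows on a zero pivot
# and reports the singular case when no pivot can be found.
def forward_eliminate(A):
    """Return the pivots found by elimination, exchanging rows when needed."""
    A = [row[:] for row in A]
    n = len(A)
    pivots = []
    for k in range(n):
        if A[k][k] == 0:
            # zero pivot: look below for a row to exchange with
            swap = next((r for r in range(k + 1, n) if A[r][k] != 0), None)
            if swap is None:
                raise ValueError("singular: no pivot available in column %d" % k)
            A[k], A[swap] = A[swap], A[k]      # row exchange repairs the breakdown
        pivots.append(A[k][k])
        for r in range(k + 1, n):
            mult = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= mult * A[k][c]
    return pivots

print(forward_eliminate([[0, 2], [3, 4]]))     # exchange cures it: pivots [3, 2]
try:
    forward_eliminate([[1, 2], [2, 4]])        # singular: second pivot never appears
except ValueError as e:
    print(e)
```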
The Cost of Elimination
Our other question is very practical. How many separate arithmetical operations does elimination require, for n equations in n unknowns? If n is large, a computer is going to take our place in carrying out the elimination. Since all the steps are known, we should be able to predict the number of operations. - P14
Suppose we call each division, and each multiplication-subtraction, one operation. In column 1, it takes n operations for every zero we achieve: one to find the multiplier ℓ, and the others to find the new entries along the row. There are n - 1 rows underneath the first one, so the first stage of elimination needs n(n - 1) = n² - n operations. (Another approach to n² - n is this: all n² entries need to be changed, except the n in the first row.) Later stages are faster because the equations are shorter. - P14
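As a quick sanity check on that count, the following Python sketch (my own, not the book's program) instruments elimination to count one operation per division and per multiplication-subtraction. The first stage should total n² - n, and since the stage working on an m × m block costs m² - m, summing m² - m for m = 2, ..., n gives (n³ - n)/3 for the whole forward pass.

```python
# Sketch only: count operations during forward elimination and compare
# with n^2 - n for the first stage and (n^3 - n)/3 for all stages.
import random

def count_operations(n):
    # random matrix with ones added on the diagonal to keep pivots nonzero
    A = [[random.random() + (i == j) for j in range(n)] for i in range(n)]
    first_stage = total = 0
    for k in range(n - 1):
        stage = 0
        for r in range(k + 1, n):
            stage += 1                      # one division to find the multiplier
            mult = A[r][k] / A[k][k]
            for c in range(k + 1, n):
                stage += 1                  # one multiply-subtract per new entry
                A[r][c] -= mult * A[k][c]
        total += stage
        if k == 0:
            first_stage = stage
    return first_stage, total

n = 6
first, total = count_operations(n)
print(first, n * n - n)                     # first stage: n^2 - n = 30
print(total, (n**3 - n) // 3)               # all stages: (n^3 - n)/3 = 70
```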