Rating:  Summary: THE CLASSIC reference for matrix computations! Review: This book is an invaluable reference for anyone working in matrix computations or linear algebra. I have been using it for years and have found it to be clear and comprehensive.
Rating:  Summary: Great Mathematical Text Review: This book should be placed alongside "Principles of Mathematical Analysis" by Walter Rudin and "Finite Dimensional Vector Spaces" by Paul Halmos as a classic text, one which students and professionals of mathematics will use for years to come. A solid book covering computational matrix theory. I myself used it as a tool to bridge the gap between my formal training in mathematics and my serious interest in computers. Readers should have some knowledge of basic linear algebra (i.e., an understanding of vector spaces, L2 norms, etc.) before attempting this book. The exercises could be better. A good purchase for those with a more than passing interest.
Rating:  Summary: One of the best books on the subject Review: This is one of the definitive texts on computational linear algebra, or more specifically, on matrix computations. The term "matrix computations" is actually the more apt name because the book focuses on computational issues involving matrices, the currency of linear algebra, rather than on linear algebra in the abstract. As an example of this distinction, the authors develop both "saxpy"-based algorithms (scalar "a" times vector "x" plus vector "y") and "gaxpy"-based algorithms (generalized saxpy, where "a" is a matrix), which are organized to exploit very efficient low-level matrix computations. This is an important organizing concept that can lead to more efficient matrix algorithms. For each important algorithm discussed, the authors provide a concise and rigorous mathematical development followed by crystal-clear pseudo-code. The pseudo-code has a Pascal-like syntax, but with embedded Matlab abbreviations that make common low-level matrix operations extremely easy to express. The authors also use indentation rather than tedious BEGIN-END notation, another convention that makes the pseudo-code crisp and easy to understand. I have found it quite easy to code up various algorithms from the pseudo-code descriptions given in this book. The authors cover most of the traditional topics such as Gaussian elimination, matrix factorizations (LU, QR, and SVD), eigenvalue problems (symmetric and unsymmetric), iterative methods, the Lanczos method, orthogonalization and least squares (both constrained and unconstrained), as well as basic linear algebra and error analysis. I've used this book extensively during the past ten years. It's an invaluable resource for teaching numerical analysis (which invariably includes matrix computations), and for virtually any research that involves computational linear algebra. If you've got matrices, chances are you will appreciate having this book around.
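[Editor's note: a minimal sketch of the saxpy/gaxpy distinction the reviewer describes. The book itself uses MATLAB-style pseudo-code; this uses Python/NumPy for concreteness, and the function names saxpy and gaxpy here are illustrative, not taken from the book's code.]

import numpy as np

def saxpy(a, x, y):
    # scalar "a" times vector "x" plus vector "y": a single vector update
    return a * x + y

def gaxpy(A, x, y):
    # "generalized" saxpy: y plus A times x, built up column by column
    # as a sequence of saxpy updates (equivalent to y + A @ x)
    y = y.copy()
    for j in range(A.shape[1]):
        y = saxpy(x[j], A[:, j], y)
    return y

The point of the gaxpy organization is that the inner work is a stream of simple, cache-friendly vector updates, which is why structuring algorithms this way can lead to more efficient matrix code.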
Rating:  Summary: One of the best books on the subject Review: This is the book I turn to first when I have to deal with a problem in numerical linear algebra; it's clearly written and has extensive references.
Rating:  Summary: The Best Reference Text I've Seen on the Subject Review: When I need to solve a large system of linear equations or better understand an algorithm I am using, this book has proven to be the best place to go. It is broad in scope and the writing is clear.