author    Jitse Niesen <jitse@maths.leeds.ac.uk>    2014-01-18 01:16:17 +0000
committer Jitse Niesen <jitse@maths.leeds.ac.uk>    2014-01-18 01:16:17 +0000
commit    aa0db35185f7eda94eb103b6bb92630c432512e5
tree      65896c945099ddba0d913646c57000e153e20e28 /doc/TutorialLinearAlgebra.dox
parent    a58325ac2f7be7326be358ac51c4f0eebcf7fbf9
Add doc page on computing Least Squares.
Diffstat (limited to 'doc/TutorialLinearAlgebra.dox')
-rw-r--r--  doc/TutorialLinearAlgebra.dox  11
1 files changed, 6 insertions, 5 deletions
diff --git a/doc/TutorialLinearAlgebra.dox b/doc/TutorialLinearAlgebra.dox
index b09f3543e..e6c41fd70 100644
--- a/doc/TutorialLinearAlgebra.dox
+++ b/doc/TutorialLinearAlgebra.dox
@@ -167,8 +167,8 @@ Here is an example:
 
 \section TutorialLinAlgLeastsquares Least squares solving
 
-The best way to do least squares solving is with a SVD decomposition. Eigen provides one as the JacobiSVD class, and its solve()
-is doing least-squares solving.
+The most accurate method to do least squares solving is with a SVD decomposition. Eigen provides one
+as the JacobiSVD class, and its solve() is doing least-squares solving.
 
 Here is an example:
 <table class="example">
@@ -179,9 +179,10 @@ Here is an example:
 </tr>
 </table>
 
-Another way, potentially faster but less reliable, is to use a LDLT decomposition
-of the normal matrix. In any case, just read any reference text on least squares, and it will be very easy for you
-to implement any linear least squares computation on top of Eigen.
+Another methods, potentially faster but less reliable, are to use a Cholesky decomposition of the
+normal matrix or a QR decomposition. Our page on \link LeastSquares least squares solving \endlink
+has more details.
+
 
 \section TutorialLinAlgSeparateComputation Separating the computation from the construction