From e798466871ceef80a5bd78eba460735fca829a8c Mon Sep 17 00:00:00 2001
From: Gael Guennebaud
Date: Wed, 11 Apr 2018 10:46:11 +0200
Subject: bug #1538: update manual pages regarding BDCSVD.

---
 doc/TutorialLinearAlgebra.dox | 28 ++++++++++++++++++++++++----
 1 file changed, 24 insertions(+), 4 deletions(-)

diff --git a/doc/TutorialLinearAlgebra.dox b/doc/TutorialLinearAlgebra.dox
index cb92ceeae..a72724143 100644
--- a/doc/TutorialLinearAlgebra.dox
+++ b/doc/TutorialLinearAlgebra.dox
@@ -73,7 +73,7 @@ depending on your matrix and the trade-off you want to make:
         ColPivHouseholderQR
         colPivHouseholderQr()
         None
-        ++
+        +
         -
         +++
@@ -85,6 +85,14 @@ depending on your matrix and the trade-off you want to make:
         -
         -
         +++
+
+        CompleteOrthogonalDecomposition
+        completeOrthogonalDecomposition()
+        None
+        +
+        -
+        +++
+
         LLT
         llt()
@@ -101,15 +109,24 @@ depending on your matrix and the trade-off you want to make:
         +
         ++
+
+        BDCSVD
+        bdcSvd()
+        None
+        -
+        -
+        +++
+
         JacobiSVD
         jacobiSvd()
         None
-        - -
+        -
         - - -
         +++
 
+To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
 
 All of these decompositions offer a solve() method that works as in
 the above example.
 
@@ -183,8 +200,11 @@ Here is an example:
 
 \section TutorialLinAlgLeastsquares Least squares solving
 
-The most accurate method to do least squares solving is with a SVD decomposition. Eigen provides one
-as the JacobiSVD class, and its solve() is doing least-squares solving.
+The most accurate method to do least squares solving is with an SVD decomposition.
+Eigen provides two implementations.
+The recommended one is the BDCSVD class, which scales well for large problems
+and automatically falls back to the JacobiSVD class for smaller problems.
+For both classes, their solve() method does least-squares solving.
 
 Here is an example:
-- 
cgit v1.2.3
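
For reference, below is a minimal sketch of the least-squares usage the updated manual page describes. It is not part of the patch; it assumes Eigen 3.3 or later, where the bdcSvd() and completeOrthogonalDecomposition() convenience methods and the ComputeThinU/ComputeThinV flags are available, and the matrix sizes are arbitrary illustrative values.

// Sketch only, not part of the patch: least-squares solving with BDCSVD,
// the decomposition the updated page recommends (assumes Eigen >= 3.3).
#include <Eigen/Dense>
#include <iostream>

int main()
{
    // An overdetermined system A*x = b (more equations than unknowns).
    Eigen::MatrixXd A = Eigen::MatrixXd::Random(100, 3);
    Eigen::VectorXd b = Eigen::VectorXd::Random(100);

    // BDCSVD scales to large problems and falls back to JacobiSVD for small
    // matrices; its solve() returns the least-squares solution.
    Eigen::VectorXd x = A.bdcSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);

    // CompleteOrthogonalDecomposition, also added to the table above, is a
    // faster rank-revealing alternative for least-squares problems.
    Eigen::VectorXd x_cod = A.completeOrthogonalDecomposition().solve(b);

    std::cout << "BDCSVD least-squares solution:\n" << x << "\n";
    std::cout << "COD least-squares solution:\n" << x_cod << "\n";
    return 0;
}

The two calls illustrate the trade-off stated in the decomposition table: the SVD route (BDCSVD/JacobiSVD) is the most accurate, while CompleteOrthogonalDecomposition trades a little accuracy for speed on large problems.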