 doc/LeastSquares.dox                     |  2
 doc/TopicLinearAlgebraDecompositions.dox | 14
 doc/TutorialLinearAlgebra.dox            | 28
 doc/examples/TutorialLinAlgSVDSolve.cpp  |  2
 4 files changed, 39 insertions(+), 7 deletions(-)
diff --git a/doc/LeastSquares.dox b/doc/LeastSquares.dox
index e2191a22f..24dfe4b4f 100644
--- a/doc/LeastSquares.dox
+++ b/doc/LeastSquares.dox
@@ -16,7 +16,7 @@ equations is the fastest but least accurate, and the QR decomposition is in betw
 
 \section LeastSquaresSVD Using the SVD decomposition
 
-The \link JacobiSVD::solve() solve() \endlink method in the JacobiSVD class can be directly used to
+The \link BDCSVD::solve() solve() \endlink method in the BDCSVD class can be directly used to
 solve linear squares systems. It is not enough to compute only the singular values (the default for
 this class); you also need the singular vectors but the thin SVD decomposition suffices for
 computing least squares solutions:
diff --git a/doc/TopicLinearAlgebraDecompositions.dox b/doc/TopicLinearAlgebraDecompositions.dox
index 991f964cc..0965da872 100644
--- a/doc/TopicLinearAlgebraDecompositions.dox
+++ b/doc/TopicLinearAlgebraDecompositions.dox
@@ -4,7 +4,7 @@ namespace Eigen {
 
 This page presents a catalogue of the dense matrix decompositions offered by Eigen.
 For an introduction on linear solvers and decompositions, check this \link TutorialLinearAlgebra page \endlink.
 
-To get an overview of the true relative speed of the different decomposition, check this \link DenseDecompositionBenchmark benchmark \endlink.
+To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
 
 \section TopicLinAlgBigTable Catalogue of decompositions offered by Eigen
@@ -114,6 +114,18 @@ To get an overview of the true relative speed of the different decomposition, ch
 <tr><th class="inter" colspan="9">\n Singular values and eigenvalues decompositions</th></tr>
 
 <tr>
+    <td>BDCSVD (divide \& conquer)</td>
+    <td>-</td>
+    <td>One of the fastest SVD algorithms</td>
+    <td>Excellent</td>
+    <td>Yes</td>
+    <td>Singular values/vectors, least squares</td>
+    <td>Yes (and does least squares)</td>
+    <td>Excellent</td>
+    <td>Blocked bidiagonalization</td>
+</tr>
+
+<tr>
     <td>JacobiSVD (two-sided)</td>
     <td>-</td>
     <td>Slow (but fast for small matrices)</td>
diff --git a/doc/TutorialLinearAlgebra.dox b/doc/TutorialLinearAlgebra.dox
index cb92ceeae..a72724143 100644
--- a/doc/TutorialLinearAlgebra.dox
+++ b/doc/TutorialLinearAlgebra.dox
@@ -73,7 +73,7 @@ depending on your matrix and the trade-off you want to make:
     <td>ColPivHouseholderQR</td>
     <td>colPivHouseholderQr()</td>
     <td>None</td>
-    <td>++</td>
+    <td>+</td>
     <td>-</td>
     <td>+++</td>
 </tr>
@@ -86,6 +86,14 @@ depending on your matrix and the trade-off you want to make:
     <td>+++</td>
 </tr>
 <tr class="alt">
+    <td>CompleteOrthogonalDecomposition</td>
+    <td>completeOrthogonalDecomposition()</td>
+    <td>None</td>
+    <td>+</td>
+    <td>-</td>
+    <td>+++</td>
+</tr>
+<tr class="alt">
     <td>LLT</td>
     <td>llt()</td>
     <td>Positive definite</td>
@@ -102,14 +110,23 @@ depending on your matrix and the trade-off you want to make:
     <td>++</td>
 </tr>
 <tr class="alt">
+    <td>BDCSVD</td>
+    <td>bdcSvd()</td>
+    <td>None</td>
+    <td>-</td>
+    <td>-</td>
+    <td>+++</td>
+</tr>
+<tr class="alt">
     <td>JacobiSVD</td>
     <td>jacobiSvd()</td>
     <td>None</td>
-    <td>- -</td>
+    <td>-</td>
     <td>- - -</td>
     <td>+++</td>
 </tr>
 </table>
+To get an overview of the true relative speed of the different decompositions, check this \link DenseDecompositionBenchmark benchmark \endlink.
 
 All of these decompositions offer a solve() method that works as in the above example.
@@ -183,8 +200,11 @@ Here is an example:
 
 \section TutorialLinAlgLeastsquares Least squares solving
 
-The most accurate method to do least squares solving is with a SVD decomposition. Eigen provides one
-as the JacobiSVD class, and its solve() is doing least-squares solving.
+The most accurate method to do least squares solving is with an SVD decomposition.
+Eigen provides two implementations.
+The recommended one is the BDCSVD class, which scales well for large problems
+and automatically falls back to the JacobiSVD class for smaller problems.
+For both classes, the solve() method does least-squares solving.
 
 Here is an example:
 
 <table class="example">
diff --git a/doc/examples/TutorialLinAlgSVDSolve.cpp b/doc/examples/TutorialLinAlgSVDSolve.cpp
index 9fbc031de..f109f04e5 100644
--- a/doc/examples/TutorialLinAlgSVDSolve.cpp
+++ b/doc/examples/TutorialLinAlgSVDSolve.cpp
@@ -11,5 +11,5 @@ int main()
    VectorXf b = VectorXf::Random(3);
    cout << "Here is the right hand side b:\n" << b << endl;
    cout << "The least-squares solution is:\n"
-        << A.jacobiSvd(ComputeThinU | ComputeThinV).solve(b) << endl;
+        << A.bdcSvd(ComputeThinU | ComputeThinV).solve(b) << endl;
 }