From 8b17e33b84b0f60da422e2c32a5fa6f53c10df6d Mon Sep 17 00:00:00 2001
From: Joey Hess
Date: Thu, 2 Apr 2015 01:23:43 -0400
Subject: Revert "Significantly sped up processing of large numbers of directories passed to a single git-annex command."

This reverts commit 5492d8f4ddbd398e0188da9daed840908d1198c0.

Whoops, git ls-files does not always output in the input ordering.
That's why all this work is needed. Urk.
---
 doc/bugs/feeding_git_annex_with_xargs_can_fail.mdwn | 6 +-----
 1 file changed, 1 insertion(+), 5 deletions(-)

diff --git a/doc/bugs/feeding_git_annex_with_xargs_can_fail.mdwn b/doc/bugs/feeding_git_annex_with_xargs_can_fail.mdwn
index ebbc1e7e9..c973308c6 100644
--- a/doc/bugs/feeding_git_annex_with_xargs_can_fail.mdwn
+++ b/doc/bugs/feeding_git_annex_with_xargs_can_fail.mdwn
@@ -5,13 +5,9 @@ Feeding git-annex a long list off directories, eg with xargs can have
 ls-files command is longer than the git-annex command often, so it gets
 truncated and some files are not processed.
 
-	> [[fixed|done]] --[[Joey]]
+	> fixed --[[Joey]]
 
 * It can take a really long time for git-annex to chew through the
   git-ls-files results. There is probably an exponential blowup in the
   time relative to the number of parameters. Some of the stuff being
   done to preserve original ordering etc is likely at fault.
-
-	> I think I've managed to speed this up something like
-	> 1000x or some such. segmentPaths on an utterly insane list of 6 million
-	> files now runs in about 10 seconds. --[[Joey]]
--
cgit v1.2.3
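
For context on the ordering work this revert restores, below is a minimal, hypothetical sketch of the kind of order-preserving segmentation that segmentPaths performs: regrouping the files git ls-files printed (in arbitrary order) under the input paths that expanded to them. The names and the containment helper are illustrative assumptions, not git-annex's actual code. Because ls-files output is not guaranteed to come back in input order, each input path may have to scan all of the remaining output, which is the expense the reverted commit had tried to avoid.

    import Data.List (isPrefixOf, partition)

    -- Hypothetical sketch, not git-annex's implementation: regroup
    -- the files git ls-files output under the input paths they came
    -- from, preserving the input ordering of those paths.
    segmentPathsSketch :: [FilePath] -> [FilePath] -> [[FilePath]]
    segmentPathsSketch [] _ = []
    segmentPathsSketch [_] fs = [fs]  -- last input path gets the rest
    segmentPathsSketch (p:ps) fs = found : segmentPathsSketch ps rest
      where
        -- Each input path scans all remaining output, since the output
        -- order gives no shortcut; this is the costly part.
        (found, rest) = partition (p `dirContains`) fs

    -- Hypothetical helper: is the second path equal to, or inside, the first?
    dirContains :: FilePath -> FilePath -> Bool
    dirContains d f = d == f || (d ++ "/") `isPrefixOf` f

For example, with input paths ["a", "b"] and ls-files output ["b/y", "a/x"], this yields [["a/x"], ["b/y"]], restoring the input ordering even though ls-files returned the files in a different order.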