From 6e946b9a39f8e5ba55651accbc1307e8cce5c4e2 Mon Sep 17 00:00:00 2001
From: Joey Hess
Date: Sat, 12 Nov 2011 14:24:14 -0400
Subject: add

---
 doc/todo/optimise_git-annex_merge.mdwn | 13 +++++++++++++
 1 file changed, 13 insertions(+)
 create mode 100644 doc/todo/optimise_git-annex_merge.mdwn

(limited to 'doc')

diff --git a/doc/todo/optimise_git-annex_merge.mdwn b/doc/todo/optimise_git-annex_merge.mdwn
new file mode 100644
index 000000000..a2cdfb15f
--- /dev/null
+++ b/doc/todo/optimise_git-annex_merge.mdwn
@@ -0,0 +1,13 @@
+Typically `git-annex merge` is fast, but it could still be sped up.
+
+`git-annex merge` runs `git-hash-object` once per file that needs to be
+merged. Elsewhere in git-annex, `git-hash-object` is used in a faster mode,
+reading files from disk via `--stdin-paths`. But here, the data is not
+in raw files on disk, and I doubt writing them out is the best approach.
+Instead, I'd like a way to stream multiple objects into git using stdin.
+Sometime, should look at either extending git-hash-object to support that,
+or using git-fast-import instead.
+
+`git-annex merge` also runs `git show` once per file that needs to be
+merged. This could be reduced to a single call to `git-cat-file --batch`.
+There is already a Git.CatFile library that can do this easily. --[[Joey]]

-- 
cgit v1.2.3
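The two batch modes the note refers to can be sketched in shell against a throwaway repository. The file names and object list here are hypothetical illustrations, not git-annex code; this only shows the git plumbing behaviour being discussed.

```shell
#!/bin/sh
# Sketch of the batch plumbing modes mentioned above, in a scratch repo.
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q

printf 'one\n' > a.txt
printf 'two\n' > b.txt

# The faster hash-object mode already used elsewhere in git-annex:
# a single git-hash-object process hashes (and with -w, stores) many
# files, reading their paths from stdin -- but it only takes *paths*,
# not object contents, which is the limitation the note complains about.
printf '%s\n' a.txt b.txt | git hash-object -w --stdin-paths

# Batch reads: one git-cat-file process serves many object requests.
# For each sha fed on stdin it prints "<sha> <type> <size>" and then
# the raw object content, avoiding one `git show` fork per file.
sha=$(git hash-object a.txt)
printf '%s\n' "$sha" | git cat-file --batch
```

`git cat-file --batch` keeps a single long-running process reading requests from stdin, which is what the Git.CatFile library wraps.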