[[!comment format=mdwn
 username="joey"
 subject="""comment 1"""
 date="2016-10-05T17:02:12Z"
 content="""
By using `git commit -a` to commit changes to a large annexed file, you
are causing git to first add the file's content to the git repository, and
then git-annex has to go fix that up and convert it back to an annexed file.

So, you probably don't want to be doing that, irrespective of this bug.
Storing the file content in the git repository will waste disk space until
`git gc` gets around to cleaning it up.
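
You can see the wasted space with standard git plumbing (a quick
illustration, nothing git-annex specific):

    # The mistakenly-added content sits in .git/objects as an
    # unreferenced loose object until git gc prunes it.
    git count-objects -v     # "size" is loose object disk usage, in KiB
    git gc --prune=now       # force immediate pruning instead of waiting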

Instead, use `git annex add` on the file after editing it, and then commit
the result. As well as not cluttering up git with large unused objects,
that will generally be faster, and will probably avoid this bug.
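
Concretely, the recommended sequence looks like this (`bigfile` is just a
placeholder name):

    # Re-annex the edited file, then commit; only the annex pointer is
    # staged, so no large content enters the git repository.
    git annex add bigfile
    git commit -m 'update bigfile'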

----

I've reproduced on Linux a behavior that probably has the same root cause.
It looks like the git-annex pre-commit hook is reading the whole content of
the large file from `git cat-file`, and buffering it in memory. Of course
this uses a lot of memory and will fail for sufficiently large files, and it
should definitely not be doing this.
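
Roughly, the reproduction looks like this (a sketch, not the exact test
case; the file name and size are placeholders, and the file has to be
large enough to exhaust available memory):

    # Edit an unlocked annexed file, then commit with -a so the
    # pre-commit hook has to convert it back to an annexed file.
    dd if=/dev/zero of=bigfile bs=1M count=2048   # 2 GiB test file
    git annex add bigfile
    git commit -m 'add bigfile'
    git annex unlock bigfile                      # make it editable in place
    echo change >> bigfile
    git commit -a -m 'edit bigfile'               # pre-commit buffers the file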

I suspect this is a regression caused by the changes to support v6 mode.
"""]]