author | abeer@84e240dd227ea2c30fbdb4ad3c55439ccd54d390 <abeer@web> | 2016-09-03 00:57:40 +0000
---|---|---
committer | admin <admin@branchable.com> | 2016-09-03 00:57:40 +0000
commit | 2f1e6d34d78d3e66af6dd47cfb00be8e32d8233c (patch)
tree | 95c23faab561408c08480ae37fa366df1dae2b3e
parent | 75a09069cf527915c6102f7d69bbd31450e2aa1c (diff)
-rw-r--r-- | doc/forum/Large_Uploads_to_S3__63__.mdwn | 12
1 file changed, 12 insertions, 0 deletions
diff --git a/doc/forum/Large_Uploads_to_S3__63__.mdwn b/doc/forum/Large_Uploads_to_S3__63__.mdwn
new file mode 100644
index 000000000..a72df425f
--- /dev/null
+++ b/doc/forum/Large_Uploads_to_S3__63__.mdwn
@@ -0,0 +1,12 @@
+I set up a new git annex repo with an S3 remote. Uploading small files works fine, but the process fails on larger files (>1 GB) with the following error.
+
+    copy prosper/loaninfo.p (checking s3...) (to s3...)
+    99% 10.7MB/s 0s
+    S3Error {s3StatusCode = Status {statusCode = 400, statusMessage = "Bad Request"}, s3ErrorCode = "RequestTimeout", s3ErrorMessage = "Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed.", s3ErrorResource = Nothing, s3ErrorHostId = Just "< a base64 encoded string>", s3ErrorAccessKeyId = Nothing, s3ErrorStringToSign = Nothing, s3ErrorBucket = Nothing, s3ErrorEndpointRaw = Nothing, s3ErrorEndpoint = Nothing}
+
+I tried these different options while setting up the remote, but nothing worked:
+partsize=1GiB
+partsize=400MiB
+chunk=100MiB
+
+What am I doing wrong? Should I try an even smaller chunk size?
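For context, a minimal sketch of how the `chunk` and `partsize` options in the post are passed when setting up a git-annex S3 special remote. The remote name `s3`, the bucket name, and the 50MiB chunk size are assumptions for illustration, not values from the post; the credential values are placeholders.

```shell
# Placeholders -- substitute real AWS credentials before running.
export AWS_ACCESS_KEY_ID="<access-key-id>"
export AWS_SECRET_ACCESS_KEY="<secret-access-key>"

# Create an S3 special remote with chunking enabled. chunk= splits each
# annexed file into pieces that are uploaded independently, so a network
# timeout only forces a retry of one chunk rather than the whole file.
git annex initremote s3 type=S3 encryption=none bucket=my-annex-bucket chunk=50MiB

# Chunking can also be changed on an already-configured remote:
git annex enableremote s3 chunk=50MiB
```

Note that `chunk` and `partsize` are different knobs: `partsize` controls S3 multipart uploads of a single object, while `chunk` splits the file into separate objects on the git-annex side.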