author     Joey Hess <joeyh@joeyh.name>  2015-04-23 15:11:08 -0400
committer  Joey Hess <joeyh@joeyh.name>  2015-04-23 15:11:08 -0400
commit     814d0db290b7c50b8ef281e0940eb3e6787c9e48 (patch)
tree       f67f3fe94a792aebb1b0c09b727dc78732cf7fda
parent     3af7d8c2b67b1929eea35cbee9fb3498217ac047 (diff)
parent     4bbadf8d789c9b4d060e590e427d5b0f3fe78923 (diff)
Merge branch 'master' of ssh://git-annex.branchable.com
-rw-r--r--  doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_8_13f862524d4aa503fc998ede41617942._comment  12
-rw-r--r--  doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_9_769de1e47221dfb6c810665e3704bbb2._comment  10
2 files changed, 22 insertions, 0 deletions
diff --git a/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_8_13f862524d4aa503fc998ede41617942._comment b/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_8_13f862524d4aa503fc998ede41617942._comment
new file mode 100644
index 000000000..531e8c79d
--- /dev/null
+++ b/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_8_13f862524d4aa503fc998ede41617942._comment
@@ -0,0 +1,12 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnSenxKyE_2Z6Wb-EBMO8FciyRywjx1ZiQ"
+ nickname="Walter"
+ subject="comment 8"
+ date="2015-04-23T19:02:18Z"
+ content="""
+OK, so I did ``git annex enableremote cloud datacenter=ap-southeast-2``, and can now get files properly. So from that point of view it now works. And I guess that provides an easy way to reproduce the bug: just set a working S3 remote to the wrong datacenter. I'm prepared to accept that this was something I did somehow; at some point I manually moved files from one S3 bucket (actually account) to another. But it seems that git-annex would have created the bucket, so I'm not sure how the datacenter could be wrong.
+
+In any case, I'm now not sure exactly which files did get uploaded properly, so I will run a ``fsck``. I guess it would be good to either return an error when this happens, or follow the redirect.
+
+Also, I really appreciate the quick response you have to bugs!
+"""]]
diff --git a/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_9_769de1e47221dfb6c810665e3704bbb2._comment b/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_9_769de1e47221dfb6c810665e3704bbb2._comment
new file mode 100644
index 000000000..788ccf7ed
--- /dev/null
+++ b/doc/bugs/Can__39__t_get_content_from_S3_with_s3-aws_library/comment_9_769de1e47221dfb6c810665e3704bbb2._comment
@@ -0,0 +1,10 @@
+[[!comment format=mdwn
+ username="https://www.google.com/accounts/o8/id?id=AItOawnSenxKyE_2Z6Wb-EBMO8FciyRywjx1ZiQ"
+ nickname="Walter"
+ subject="comment 9"
+ date="2015-04-23T19:07:56Z"
+ content="""
+Is it possible to do a fast ``fsck`` on an S3 remote? Since I don't want to download all the files again, it would be nice to have the option to just check whether each file exists.
+
+I get a ``failed to download file from remote`` error when I try it.
+"""]]