author     https://id.koumbit.net/anarcat <https://id.koumbit.net/anarcat@web>  2015-06-11 17:16:54 +0000
committer  admin <admin@branchable.com>  2015-06-11 17:16:54 +0000
commit     ba8639979f6bb89b387eb2438603d086ca3659f6 (patch)
tree       e36b60a7b2764568b3011ceebb494d75e278592d
parent     bd684fccffe82574d3d8df935a4ee253f848a691 (diff)
bandwidth limits on s3
-rw-r--r--  doc/forum/s3_bandwidth_limitations_and_next_release.mdwn  7
1 files changed, 7 insertions, 0 deletions
diff --git a/doc/forum/s3_bandwidth_limitations_and_next_release.mdwn b/doc/forum/s3_bandwidth_limitations_and_next_release.mdwn
new file mode 100644
index 000000000..c1e67a8c0
--- /dev/null
+++ b/doc/forum/s3_bandwidth_limitations_and_next_release.mdwn
@@ -0,0 +1,7 @@
+Is there a way to set bandwidth limits for [[special_remotes/s3]]?
+
+From what I can see in the [[todo/credentials-less_access_to_s3]] patch, the `downloadUrl` function is used. Does that mean `annex.web-download-command` is used as well? If so, that's great, because it means we can use wget's `--limit-rate` option to throttle transfers (see the sketch below the diff).
+
+But what about uploads? Is there a way to limit their bandwidth as well?
+
+I'll also abuse this forum post to ask if/when we can expect a shiny new release to ship that amazing new feature. There seems to be enough piled up in the unreleased changelog to warrant a release, no? :) --[[anarcat]]
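
A minimal sketch of the throttling idea raised above, assuming downloads really do go through `annex.web-download-command` (which is exactly the open question in the post) and that the command string uses the `%url`/`%file` placeholders; the `--limit-rate` value is illustrative:

```sh
# Hypothetical config: route web downloads through wget capped at ~1 MB/s.
# %url and %file are placeholders git-annex is expected to substitute;
# whether S3 downloads honour this setting is what the post asks.
git config annex.web-download-command 'wget --limit-rate=1m -c -O %file %url'
```

Note that wget's option is `--limit-rate` (with `k`/`m` suffixes), and it only affects downloads; it says nothing about upload throttling, which is why the second question stands on its own.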