author Joey Hess <joeyh@joeyh.name> 2016-02-08 11:00:58 -0400
committer Joey Hess <joeyh@joeyh.name> 2016-02-08 11:00:58 -0400
commit bb214e6ca1ecbeaab8280c20c48ac7c6fcd13770 (patch)
tree 07d9347eb2de908f6e2e05ef5e140097c3623f01 /doc
parent e7fe28f47997ad098de5489496cf866f3c644fa8 (diff)
parent 35abf4c5c74ab2358a0e1cf9a21eaca97da00a3b (diff)
Merge branch 'master' of ssh://git-annex.branchable.com
Diffstat (limited to 'doc')
-rw-r--r--  doc/bugs/Adding_torrent_via_addurl_fails.mdwn  54
-rw-r--r--  doc/devblog/day_360__results_of_2015_user_survey/comment_1_5f30c583cd20c33e2bcd386f674d6f3b._comment  9
-rw-r--r--  doc/devblog/day_360__results_of_2015_user_survey/comment_2_68479af6dda2f732436b19160297aacd._comment  7
-rw-r--r--  doc/forum/Annex_keeps_dropping_content.mdwn  9
-rw-r--r--  doc/forum/Backing_up_photos_to_the_cloud.mdwn  13
5 files changed, 92 insertions, 0 deletions
diff --git a/doc/bugs/Adding_torrent_via_addurl_fails.mdwn b/doc/bugs/Adding_torrent_via_addurl_fails.mdwn
new file mode 100644
index 000000000..2a6b3f442
--- /dev/null
+++ b/doc/bugs/Adding_torrent_via_addurl_fails.mdwn
@@ -0,0 +1,54 @@
+### Please describe the problem.
+Adding a magnet link via addurl fails after downloading the torrent metadata if the torrent's "announce" field is missing.
+
+### What steps will reproduce the problem?
+ git annex addurl "magnet:?xt=urn:btih:88066b90278f2de655ee2dd44e784c340b54e45c"
+
+
+### What version of git-annex are you using? On what operating system?
+git-annex version: 6.20160126
+Arch Linux
+
+### Please provide any additional information below.
+I have traced the problem back to the parsing of the torrent metadata.
+Since you also seem to be the author of the Haskell torrent parser, I felt it was appropriate to post here.
+
+The above magnet link (an Arch Linux ISO) results in a .torrent file that has no "announce" entry. Instead it only has an "announce-list" entry with multiple URLs.
+This causes the parser to fail.
+I don't know whether having only "announce-list" violates some standard (it appears to be the BEP 12 extension), but a second magnet link that I tried showed the same behaviour, so this may not be an unusual case.
+
+I was able to put a workaround into btshowmetainfo.py that sets "announce" to the first entry from "announce-list" if it isn't defined.
+My git-annex binary is compiled with the Haskell parser enabled, so this doesn't change git-annex's behaviour.
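+
+Roughly, the workaround looks like this (a sketch, not the exact patch; it assumes the bdecode helper that btshowmetainfo.py already imports):
+
+[[!format python """
+from BitTorrent.bencode import bdecode  # the bencode module btshowmetainfo.py uses
+
+def load_metainfo(path):
+    with open(path, 'rb') as f:
+        metainfo = bdecode(f.read())
+    # BEP 12: "announce-list" is a list of tiers, each tier a list of URLs.
+    # If "announce" is missing, fall back to the first URL of the first tier.
+    if 'announce' not in metainfo and 'announce-list' in metainfo:
+        metainfo['announce'] = metainfo['announce-list'][0][0]
+    return metainfo
+"""]]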
+
+It's not a dealbreaker for me; I'm just playing around with the torrent feature for now.
+
+[[!format sh """
+# If you can, paste a complete transcript of the problem occurring here.
+# If the problem is with the git-annex assistant, paste in .git/annex/daemon.log
+git annex addurl "magnet:?xt=urn:btih:88066b90278f2de655ee2dd44e784c340b54e45c"
+(downloading torrent file...)
+
+02/07 16:42:13 [NOTICE] IPv4 DHT: listening on UDP port 6964
+
+02/07 16:42:13 [NOTICE] IPv4 BitTorrent: listening on TCP port 6927
+
+02/07 16:42:13 [NOTICE] IPv6 BitTorrent: listening on TCP port 6927
+[#96c5b2 27KiB/27KiB(100%) CN:11 SD:2]
+02/07 16:42:32 [NOTICE] Download complete: [METADATA]88066b90278f2de655ee2dd44e784c340b54e45c
+
+02/07 16:42:32 [NOTICE] Saved metadata as ../../.git/annex/misctmp/URL--magnet&c,63xt,61urn&cbtih&c88066b90278f2de655ee2dd44e784c340b54e45c/meta/88066b90278f2de655ee2dd44e784c340b54e45c.torrent.
+
+Download Results:
+gid |stat|avg speed |path/URI
+======+====+===========+=======================================================
+96c5b2|OK | 0B/s|[MEMORY][METADATA]88066b90278f2de655ee2dd44e784c340b54e45c
+
+Status Legend:
+(OK):download completed.
+git-annex: failed to parse torrent: Name not found in dictionary: announce
+# End of transcript or log.
+"""]]
+
+### Have you had any luck using git-annex before? (Sometimes we get tired of reading bug reports all day and a lil' positive end note does wonders)
+
+
diff --git a/doc/devblog/day_360__results_of_2015_user_survey/comment_1_5f30c583cd20c33e2bcd386f674d6f3b._comment b/doc/devblog/day_360__results_of_2015_user_survey/comment_1_5f30c583cd20c33e2bcd386f674d6f3b._comment
new file mode 100644
index 000000000..4c53ea41b
--- /dev/null
+++ b/doc/devblog/day_360__results_of_2015_user_survey/comment_1_5f30c583cd20c33e2bcd386f674d6f3b._comment
@@ -0,0 +1,9 @@
+[[!comment format=mdwn
+ username="anarcat"
+ subject="comment 1"
+ date="2016-02-05T23:06:11Z"
+ content="""
+Thanks for the review! Great to see those surveys every once in a while...
+
+I wonder if [[todo/build_a_user_guide/]] could help with the documentation. The [[todo]] and [[forum]] pages are useful, but they are more \"search\"-based than a handbook would be. There is the [[walkthrough]], but it could cover a *lot* more ground, and I am not sure the wiki format is appropriate for this...
+"""]]
diff --git a/doc/devblog/day_360__results_of_2015_user_survey/comment_2_68479af6dda2f732436b19160297aacd._comment b/doc/devblog/day_360__results_of_2015_user_survey/comment_2_68479af6dda2f732436b19160297aacd._comment
new file mode 100644
index 000000000..501e7b455
--- /dev/null
+++ b/doc/devblog/day_360__results_of_2015_user_survey/comment_2_68479af6dda2f732436b19160297aacd._comment
@@ -0,0 +1,7 @@
+[[!comment format=mdwn
+ username="CandyAngel"
+ subject="comment 2"
+ date="2016-02-08T13:08:30Z"
+ content="""
+I already have a cookbook of sorts (written in POD), but it is a set of notes based on my own usage. I'd be happy to expand it if I can get some feedback from newer/Windows users, since my usage is unusual and focused (e.g. no direct mode, limited use of the assistant).
+"""]]
diff --git a/doc/forum/Annex_keeps_dropping_content.mdwn b/doc/forum/Annex_keeps_dropping_content.mdwn
new file mode 100644
index 000000000..d3ff813f8
--- /dev/null
+++ b/doc/forum/Annex_keeps_dropping_content.mdwn
@@ -0,0 +1,9 @@
+I'm experiencing some strange behaviour...
+
+I have a few annexes (local, ssh, gitolite) in my group; some of them are set to the "manual" group with the "standard" preferred content expression, and some are "backup".
+
+If I 'get' files in my laptop annex and then 'sync --content', the locally present files are dropped! The local repo is "manual standard".
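+
+Concretely, something like this (the file name is made up):
+
+    git annex group here manual      # the laptop repo
+    git annex wanted here standard
+    git annex get photos/2016.jpg
+    git annex sync --content         # ...and the content I just got is dropped again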
+
+Am I doing something wrong?
+
+Thanks
diff --git a/doc/forum/Backing_up_photos_to_the_cloud.mdwn b/doc/forum/Backing_up_photos_to_the_cloud.mdwn
new file mode 100644
index 000000000..0af14cbb8
--- /dev/null
+++ b/doc/forum/Backing_up_photos_to_the_cloud.mdwn
@@ -0,0 +1,13 @@
+I'm using git annex to manage my photo collection. The main reason is that my laptop doesn't have enough space to store all my photos, so I'm using git annex to create a sort of split repository between my laptop (which has some photos) and an external drive (which has everything). So far this has worked well; I have around 15,000 photos, which is around 40GB.
+
+Now I also want to see if I can use git annex to improve my backup workflow. Previously I've just exported albums from my photo manager (iPhoto on OS X), zipped them up, and uploaded them to S3. I have lifecycle rules set up so that they are automatically replicated to a different region and archived to Glacier (it's a lot easier than dealing with Glacier directly). I am using this as a last-resort backup in case everything else is lost, so it doesn't matter if it takes a while to access. This works well, except that on its own I don't really know what photos are stored where, which is where I'm hoping git annex can help.
+
+I've tried using the S3 remote, but there are a few things which I don't like:
+
+1) If the git repository is lost I can't recover the original paths, so I won't know which photo belongs in which album. As this is a last-resort backup, if I ever need to get anything from here it's likely that the git repository is also lost. [JGit supports storing Git repositories in S3](http://www.fancybeans.com/blog/2012/08/24/how-to-use-s3-as-a-private-git-repository/), but that seems like the wrong way to solve this; I'd prefer just to have the original folder structure maintained.
+
+2) As there are 15,000 photos, that means 15,000 requests to S3 to upload them and another 15,000 each time I check them. On my connection I can upload to AWS at around 5MB/s, but due to latency that only means one or two photos per second. I'd prefer to just upload archives.
+
+As I understand it, encryption + chunking with a sufficiently large chunk size (say 100MB) would help solve the second problem, but as this is a last-resort backup I don't want to have to worry about encryption keys or passphrases.
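+
+Something like this untested sketch is what I have in mind (the remote and bucket names are made up):
+
+    git annex initremote s3backup type=S3 encryption=none chunk=100MiB bucket=my-photo-backup
+
+With 100MiB chunks, 40GB works out to roughly 400 objects to upload instead of 15,000.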
+
+It looks like a wrapper around the [archivedrive feature](https://git-annex.branchable.com/tips/offline_archive_drives/) (one that archives and zips the files, then uploads the result to S3) would do what I want, but I'm wondering if there is a better way?