# Creating a special S3 remote to hold files shareable by URL

In this example, I'll assume you'll be creating a bucket in Amazon S3 named $BUCKET and a special remote named public-s3. Be sure to replace $BUCKET with something like "public-bucket-joey" when you follow along in your shell.

Set up your special [[S3 remote|special_remotes/S3]] with (at least) these options:

    git annex initremote public-s3 type=S3 encryption=none bucket=$BUCKET exporttree=yes public=yes

Then export the files in the master branch to the remote:

    git annex export master --to public-s3

You can run that command again to update the export. See [[git-annex-export]] for details.

Each exported file will be available to the public from `http://$BUCKET.s3.amazonaws.com/$file`. (A quick way to check this is sketched at the end of this page.)

Note: Bear in mind that Amazon will charge the owner of the bucket for public downloads from that bucket.

# Indexes

By default, there is no index.html file exported, so if you open `http://$BUCKET.s3.amazonaws.com/` in a web browser, you'll see an XML document listing the files.

For a nicer list of files, you can make an index.html file, check it into git, and export it to the bucket. You'll need to configure the bucket to use index.html as its index document, as [explained here](https://stackoverflow.com/questions/27899/is-there-a-way-to-have-index-html-functionality-with-content-hosted-on-s3). A command-line sketch of this is also at the end of this page.

# Old method

To use `git annex export`, you need git-annex version 6.20170909 or newer. Before we had `git annex export`, an [[old_method]] was used instead.
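# Verifying the public URL

A minimal check, assuming curl is installed; `photo.jpg` is a hypothetical filename standing in for any file you have exported, and $BUCKET is your bucket name as above:

    # Fetch one exported file over plain HTTP, without any AWS credentials.
    # photo.jpg is only an example; use a file that exists in your export.
    curl -O "http://$BUCKET.s3.amazonaws.com/photo.jpg"

If the download succeeds without credentials, the public=yes setting is working; a 403 response generally means the bucket or object is not publicly readable.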
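# Setting the index document from the command line

The Stack Overflow answer linked in the Indexes section walks through the S3 web console. If you prefer the command line, a sketch using the AWS CLI (assuming it is installed and configured for the bucket's account) looks like this:

    # Enable static website hosting on the bucket and serve the exported
    # index.html as its index document.
    aws s3 website "s3://$BUCKET/" --index-document index.html

Note that the index document is served from the bucket's website endpoint (a region-specific hostname of the form $BUCKET.s3-website-$REGION.amazonaws.com), not from the plain `http://$BUCKET.s3.amazonaws.com/` URL.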