This tutorial makes use of Amazon's S3 (Storage) service and s3cmd (Amazon S3 Tools, a Linux client). A bash script will be created to automate the backup of files to S3.

1. Sign into Amazon AWS Console

2. Create a new user for Amazon S3 (Identity and Access Management)

3. Click on the newly created username > Permissions > Attach Policy – AmazonS3FullAccess > Download the credentials

4. Install Amazon S3 Tools (s3cmd) on the Linux server:

apt-get update
apt-get install python-setuptools
cd /home
# download s3cmd-1.6.1.tar.gz from the s3cmd project site into /home first
tar -xvf s3cmd-1.6.1.tar.gz
cd s3cmd-1.6.1
python setup.py install

5. Now configure s3cmd with your AWS credentials (run from the /home/s3cmd-1.6.1 directory):

s3cmd --configure

  • Enter the Access Key (from the credentials downloaded above)
  • Enter the Secret Key (downloaded above)
  • Enter an encryption password – optional (recommended)
  • Path to GPG program – leave as default
  • Use HTTPS – again optional
  • HTTP Proxy – enter if required by your network
  • Test to confirm the settings work
  • Save
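Once saved, s3cmd writes your answers to ~/.s3cfg. A minimal sketch of what that file might contain (all values below are placeholders, not real credentials):

```ini
[default]
access_key = AKIAEXAMPLEKEY
secret_key = examplesecretkeyexamplesecretkey
gpg_passphrase = your-encryption-password
use_https = True
proxy_host =
proxy_port = 0
```

If you ever rotate your AWS keys, you can edit this file directly rather than re-running the configuration wizard.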

6. Create a new bucket on S3 to house the backups (bucket-name below is a placeholder – choose your own globally unique name):

/home/s3cmd-1.6.1/s3cmd mb s3://bucket-name

7. Create a script to sync data to S3, which cron will automate. Enter the following:

#!/bin/bash
# bucket-name, servername and admin@example.com are placeholders – substitute your own
echo "" >> backup.log
/home/s3cmd-1.6.1/s3cmd sync --skip-existing --recursive /path/to/backups/ s3://bucket-name/servername/$(date +%F--%T)/ >> backup.log
mail -s "Server Name Backup Log" admin@example.com < backup.log
rm -f backup.log

*The bash script syncs the files inside the directory /path/to/backups to the S3 bucket.
**/servername/ is the destination directory inside the bucket.
***$(date +%F--%T) creates a new directory inside /servername/ named with a date/timestamp.
****A log of the data transfer is then emailed to the specified email address.
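The timestamped destination path can be seen in isolation below – a minimal sketch, assuming a hypothetical bucket my-backups and server name servername:

```shell
# Hypothetical names – substitute your own bucket and server
BUCKET="my-backups"
SERVER="servername"
STAMP=$(date +%F--%T)   # %F = YYYY-MM-DD, %T = HH:MM:SS
DEST="s3://${BUCKET}/${SERVER}/${STAMP}/"
echo "$DEST"
```

Because each run produces a new timestamped directory, earlier backups are never overwritten by later ones.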

Make the script executable (backup.sh is used here as a placeholder for your script's file name):

chmod +x backup.sh

8. Now add a cronjob to automate it, e.g.

21 11 * * * /path/to/backup.sh
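For reference, the five cron fields are minute, hour, day of month, month, and day of week, so the entry above runs the backup at 11:21 every day. A commented sketch (backup.sh again being a placeholder name):

```
# m   h   dom  mon  dow   command
  21  11  *    *    *     /path/to/backup.sh
```

Adjust the first two fields to schedule the backup for a quiet period on your server.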

Useful S3cmd commands:

  • s3cmd ls – lists all buckets
  • s3cmd ls s3://bucket-name – lists the contents of a bucket
  • s3cmd del s3://bucket-name/file – deletes a file
  • s3cmd del s3://bucket-name/directory – deletes a directory (the directory must be empty)
  • s3cmd get --recursive s3://bucket-name/directory – downloads a directory from S3

Credit goes to the s3cmd tools website.

Written by Matt Cooper
Hi, I'm Matt Cooper. I started this blog to pretty much act as a brain dump area for things I learn from day to day. You can contact me at: