
Backing up your website or blog can be an expensive and arduous task, requiring a variety of plugins or additional plans from your hosting provider, but it needn't be. If you have SSH access to your website host (generally you would need at least a virtual private server for this), then it's easy to back up, restore, and migrate your entire website with only a few commands. Let me show you how.

What is SSH Command Line?

SSH gives you the ability to talk directly to your web-server. It doesn’t give a pretty interface, or a nice GUI, just a straight-up powerful command line. This can be daunting to some people, but the sheer power, speed, and level of automation it provides can be an absolute life-saver and makes the process of migrating sites incredibly easy.

Unfortunately, most shared hosts don't allow SSH access to your account, at least not by default. If you're hosting with GoDaddy, though, you can enable it, so be sure to check first.

To log in via SSH, open up the Terminal in OS X (or get some free SSH software for Windows, such as PuTTY) and type in the following:

ssh username@yourdomain.com

You'll be prompted for your password. If you've never used SSH before, you might be surprised when typing in your password doesn't show anything on screen. Don't worry, that's for security.

Once logged in, you’ll be presented with a command prompt, similar to the following:

-bash-3.2$

This means everything is fine, so go ahead and continue with these commands.

Start by taking a look around and trying to navigate to your web directory. Type:

ls

To ‘list’ the current files and folders.

cd directoryname

to change to a directory. In this case, I'm going to navigate to the httpdocs directory, which is the root of my website (where all my WordPress files are stored). You can then 'ls' again, just to be sure.
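Putting those together, a first session might look something like this (the directory names are only examples; substitute whatever 'ls' shows you):

```shell
pwd          # print which directory you're in right now
ls           # 'list' the files and folders here
cd /tmp      # 'change directory' -- substitute your web root, e.g. cd httpdocs
ls -a        # list again, this time including hidden files such as .htaccess
```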


At this point, we’re ready to begin the SSH backup process.

Backing up the Database:

Since the majority of readers will be doing this with a WordPress install, you will almost certainly have a database to back up in addition to any files stored on the site. First off, you'll need three pieces of information to back up your database, all of which can be found within wp-config.php (if you're running WordPress, that is):

  • Database name
  • Database user
  • Database password
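You can pull those three values straight out of wp-config.php without opening an editor. This is just a convenience sketch; it assumes you're in your web root and that wp-config.php uses the standard define() lines:

```shell
# Print just the database settings from wp-config.php
# (run this from your web root, where wp-config.php lives)
grep -E "DB_NAME|DB_USER|DB_PASSWORD" wp-config.php || echo "no wp-config.php in this directory"
```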

Then, issue this simple command, being sure to replace the username, database name, and backup filename where necessary:

mysqldump --add-drop-table -u username -p databasename > backupfilename.sql

Hit Enter, and type your database password when prompted. Once it's run, you can issue another 'ls' command to check that the file has been created. Congratulations: this is all the information in your database as a single SQL file, ready to back up or import somewhere else.
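Before trusting the dump, it's worth a quick sanity check. This sketch assumes the backupfilename.sql name from the command above:

```shell
# Quick sanity check on the dump: does it exist, and does it look like SQL?
dump="backupfilename.sql"
if [ -s "$dump" ]; then
    ls -lh "$dump"     # size at a glance
    head -n 3 "$dump"  # should begin with mysqldump's comment header
else
    echo "$dump is missing or empty -- the dump probably failed"
fi
```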

Note: I've assumed that your database server is running on the same server on which your site is hosted. On a GoDaddy host, however, the MySQL database is actually stored remotely on a separate server to which you don't have SSH access. In cases like these, you will need to access phpMyAdmin via the hosting control panel, but that is outside the scope of this tutorial.

Backing Up Files:

Now that we have the database stored in a single file on the server, we can go ahead and bundle both that and your website files into a single backup archive. To do this, we are going to issue one simple command. You need only replace yourbackupfilename with whatever you want it to be called.

tar -vcf yourbackupfilename.tar .

Let me break that down. Tar is a common Linux archive format that bundles many files into one; on its own it doesn't compress anything, though adding a -z switch (making it -vzcf) will gzip the result. -vcf are simply options that say "make a new archive, and tell me what you're doing". Next is the name of the file we want to create, and finally a single period mark tells it to include everything. We could have written * instead, but this would miss any hidden files such as .htaccess, which is essential for WordPress.

That's it. Once that's run, you will have a single .tar file containing every file on your site. You could log in via FTP at this point and download it, but let me show you one final step that allows you to restore all these files.
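If you'd prefer the archive compressed as well, the -z switch gzips it as it's written. Here's a sketch with a date-stamped filename; the name and the /tmp location are just examples (writing the archive outside the directory being archived stops tar from trying to include the archive in itself):

```shell
# Same backup as above, but gzip-compressed and stamped with today's date
backup="/tmp/site-backup-$(date +%Y-%m-%d).tar.gz"
tar -zcf "$backup" .
ls -lh "$backup"   # confirm it was created, and note how much smaller it is
```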

Restoring Everything:

Let's say the worst has happened and something has gone horribly wrong with your site. You've got a tar file of everything you backed up last week, so now you'd like to restore from it. First off, log in via FTP and upload the backup file to your server. Perhaps you've been storing your backups in a special directory; either way, move the latest complete backup file into the root of your site, and we'll begin.

Start by unpacking all the files, the reverse of what we did to back them up:

tar -vxf yourbackupfilename.tar

The crucial difference here is in the -vxf switch, which tells tar to extract the files instead of creating a new archive. Also, there is no period at the end of the command this time.
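If you'd like to check what's inside the archive before extracting over a live site, the -t switch lists the contents without writing anything. The filename here is the same example as above:

```shell
# Peek inside an archive before extracting; -t lists contents without touching anything
archive="yourbackupfilename.tar"
if [ -f "$archive" ]; then
    tar -tf "$archive"
else
    echo "no $archive in this directory"
fi
```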

The last step is to suck your database back in to where it was before. Make sure you have a blank database set up with the same name, username, and password as before, or you'll need to change your site's configuration settings too. To suck the data back in, issue this command:

mysql -u username -p databasename < backupfilename.sql

Next week: Automating Your Backups

That's enough to get you started with SSH backups for now; next week I'll show you how to automate the task with a simple shell script and a cron job. If you have some Amazon S3 storage space, I'll even show you how to automatically upload your backup files to a storage bucket once they're done.

One last tip, and when I first began to use the command line this one really impressed me: try pressing the Tab key when you're typing in a long filename, and if the name is unique enough the shell will attempt to autocomplete the rest of the filename!

  1. Wolfgang
    August 26, 2016 at 9:03 pm

    You note that, "Tar is a common linux compression format....". However, tar does not compress, it only concatenates your files into one file. By adding the -z (gzip) or -j (bzip2) switches will you then compress the file. So to compress the command should be -vzcf.

  2. Karim Bya
    August 7, 2015 at 7:11 pm

    hello i have newbie question :)
    is this method will slow the time of loading my website ?
    if not, then even if i do live streaming ??

  3. Matthias
    February 2, 2015 at 10:17 am

    Thanks for this nice article. One little question: when I backup files, how can I skip folders from the tar-package?

    thanks!

    • James Bruce
      February 2, 2015 at 10:27 am

tar --exclude='./folder' --exclude='./upload/folder2' -zcvf /backup/filename.tgz .

      (exclude must be at the start of the command)

    • Matthias
      February 2, 2015 at 10:33 am

      Great! Saved my day. Thank you!

  4. Hemant
    December 12, 2014 at 4:36 am

    Thank James Bruce, for this simple and beautiful article

  5. Sil
    May 5, 2011 at 7:55 pm

    Thanks James, seriously appreciate it!

  6. Sil
    May 5, 2011 at 3:50 am

    Heya James, did you end up adding the Automated tutorial? Thanks for this one!

    • James Bruce
      May 5, 2011 at 7:47 am

      Not yet, there were a number of stories I wanted to get out first. Thanks for reminding me, I'll get on it today, so expect to be published next week sometime.

    • James Bruce
      May 5, 2011 at 9:07 am

Whoops, my memory is seriously failing me. Actually Sil, I have written it, and it's scheduled to be published tomorrow. I did in fact write it a while ago, but I've been pushing other stories in front of it in order to provide up to date info on a few pertinent issues. So anyway, watch out for it tmrw!

  7. James
    April 27, 2011 at 4:08 pm

    Guys you are the best, thank you very much for these tutorials!

  8. Joe
    April 19, 2011 at 4:11 am

    Thank you for this tutorial, James! I have struggled to backup my oversized WP db using phpmyadmin ... kept timing out ... thankfully PUTTY and your tutorial to the rescue! I can sleep now. Thank you so much.

    • James Bruce
      April 19, 2011 at 8:52 am

      You're welcome Joe, you're well on the way to super-high geekdom now!

      Last thing to do is ditch windows!

  9. Steve Mayne
    April 12, 2011 at 1:37 pm

    I have wrapped up similar techniques into a simple-to-use cloud service:
    http://www.backupmachine.com/
    There are free packages for those of you with smaller sites.


  12. dogbert
    April 10, 2011 at 11:15 pm

    The > in the last line should be a <, like so:
    mysql -u username -p tablename < databasebackupfilename.sql

    • James Bruce
      April 11, 2011 at 1:59 pm

      Thanks dogbert, I'll correct the mistake in the post now.

  13. pceasies
    April 10, 2011 at 9:29 pm

If you want to easily back up your website I'd recommend rsync. If your host has SSH set up they probably have rsync set up as well. If you're on a Windows machine you'll need Cygwin and an rsync package to get it working.

    Cygwin
    Once you have it installed, open a cygwin command window
    Your drives are all under /cygdrive/ so /cygdrive/c/ is the same thing as C:
This will backup everything on your server (in your home directory) to Q:\WebsiteBackups\
    rsync -avze ssh USERNAME@SSH-SERVER.com:~/ "/cygdrive/q/WebsiteBackups/"

    If you delete a file on your server and run that again, the file will still be backed up on your computer. You could make a shell script that automatically runs and makes a new folder based on the date and backs up the website to that (maybe once every 2 weeks if it's large, or once a week if it's smaller)

    Here's a sample that you could schedule to run every 2 weeks (cronjob or scheduled task):

    #!/bin/bash

    LOCATION='/cygdrive/c/backupDirectory'
    USERNAME='YourLoginName'
    SITE='YourSSHServer'

    year=`date +%Y`
    month=`date +%m`
    day=`date +%d`
    FOLDER="$LOCATION/$year/$month/$day/"
    mkdir -p $FOLDER
    rsync -avze ssh $USERNAME@$SITE:~/ "$FOLDER"

    Open up nano/joe/vi in cygwin and paste/type that in. Save it as "backup-site" or something similar
    Run chmod +x backup-site (or whatever you named it)

    Open up Task Scheduler
    Create a new task and select date/time
For action use "C:\cygwin\bin\bash.exe --login -i -c ~/backup-site"

Now your website should be automatically backed up. You may want to create a script that automatically deletes older versions so it won't take up too much space. Another alternative would be to use the same folder and empty it every time before it backs it up again, then use Acronis, or Norton Ghost to include that folder in incremental backups (this would save space).

    • James Bruce
      April 11, 2011 at 2:03 pm

thanks pceasies, that's a fantastic tutorial and something I was hoping to cover at a later date. I'll be sure to integrate your instructions for Windows users, as the process is a whole lot easier on OS X or to another hosting server.

  14. Kim Hjortholm
    April 10, 2011 at 6:21 pm

    For a joomla site Akeebabackup https://www.akeebabackup.com/ is an excellent solution,very simple setup and easy to use.
