Do not neglect website backups
Whether you are a blogger making some money through ads and affiliate links, a distributed development team using a hosted server for your code repository and your app’s client data, or an e-commerce company running WooCommerce on your WordPress site, the data on your server is your most precious commodity.
If it’s gone, you’re gone. The likelihood of serious harm to your reputation, your search engine ranking, or your financial health increases with every hour that your site can’t be reached.
In their State of the Channel Ransomware Report, Datto found that 96% of businesses with a backup plan in place fully recover from ransomware attacks. So don’t be one of the 58% of small businesses that have no data-recovery plan. Below, we will look at some of the options.
Not all data losses are the same
It is important to keep in mind the different kinds of trouble that can crop up on a webserver. The types of backups you set up will vary to meet these challenges. Data center outages can be attributed in roughly equal parts to physical failures, cybercrime, and human error.
Physical failure is hard to fathom in 2020, but it still happens: by some estimates, 140,000 hard drives fail every week in the United States. Bad weather, fires, and floods still bring data centers down. A cloud backup service with strong redundancy is an effective way of spreading your eggs between several baskets. Check the offerings carefully for details on their data centers.
While ransomware and many other kinds of malware are often thought of as a risk on workstations, servers can be hijacked to serve as distribution hubs for nasty code, or be loaded with surreptitious cryptocurrency mining programs, or be defaced just for the hell of it.
Recovering from such an attack can involve reinstalling the OS and all your software from scratch. It will speed things up if you have a disk image of the clean install from before you added files and data. The OS and software shouldn’t be part of your regular backups.
Common human-error disasters include misconfiguring databases, uploading buggy code, or changing to an incompatible template in your content management system. It is a good idea to make a manual backup that is easy to restore before any of the following:
- major system reconfigurations
- testing new code
- running a new program or library
- switching to a new template
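Before any of those steps, a pre-change snapshot can be as simple as a timestamped archive of the site files plus a database dump. The sketch below is a minimal example; `SITE_DIR`, `BACKUP_DIR`, and `DB_NAME` are placeholder paths and names you would point at your own setup.

```shell
#!/bin/sh
# Quick pre-change snapshot. SITE_DIR, BACKUP_DIR, and DB_NAME are
# illustrative defaults -- point them at your real docroot and database.
SITE_DIR="${SITE_DIR:-./site}"
BACKUP_DIR="${BACKUP_DIR:-./pre-change-backups}"
STAMP=$(date +%Y%m%d-%H%M%S)

mkdir -p "$SITE_DIR" "$BACKUP_DIR"   # demo-safe: creates dirs if missing

# 1. Archive the site files with a timestamp so older snapshots are kept.
tar -czf "$BACKUP_DIR/site-$STAMP.tar.gz" -C "$SITE_DIR" .

# 2. Dump the database too, if DB_NAME is set and mysqldump is available.
if [ -n "$DB_NAME" ] && command -v mysqldump >/dev/null 2>&1; then
  mysqldump --single-transaction "$DB_NAME" > "$BACKUP_DIR/db-$STAMP.sql"
fi

echo "Snapshot stored in $BACKUP_DIR"
```

Because the archive name carries a timestamp, running it before each risky change leaves a trail of restore points rather than overwriting the last one.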
The old backup 3-2-1: using copies, formats, and locations
An old rule of thumb in the backup strategy is 3-2-1:
- 3 Copies
- 2 Formats
- 1 Off-Site Location
The two on-site copies are the active primary file and one copy deposited nightly on a separate disk drive, a setup that doesn’t map neatly onto data living on a webserver.
The off-site copy is replaced less frequently and might be on USBs or DVDs stored in someone’s home safe. Or it might be a very regularly updated backup in the cloud. In this case, your second local copy and the cloud copy may technically be in the same format, typically a hard drive or solid-state drive.
Does this violation of the ‘2 Formats’ rule really matter? The rule arose from the need to protect the copy from particularly pernicious worms that can travel from media to media in the same format but are incapable of jumping across the gap to a different format.
It also helps ensure that a catastrophic environmental event taking out one form of storage leaves another unaffected: a magnetic pulse or a flood might fry disk drives but spare optical media such as CDs and DVDs.
With your data backed up in the cloud, the vulnerabilities shift to passwords that are too easy to guess, faulty configuration on your end, or insufficient security from the cloud service. The latter can be tricky to forecast. A service might seem to be sturdy and reputable until the first breach.
The modern dilemma: server vs. services
In a pragmatic, modern approach to organizing your backups, there are two fundamental options: you can back up the server as a whole, or back up through individual services.
Should you consider the server as a whole, as a single entity that needs to be backed up? You can create automated, regular backups that are duplicates of all the files in certain directories, hopefully covering all the important types of files saved on the machine: mailboxes, websites, code repositories, database files.
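A whole-server approach often boils down to mirroring a handful of directories to one backup tree on a schedule. The sketch below is one way to do it; the directory list and `DEST` path are assumptions to adapt to your own layout.

```shell
#!/bin/sh
# Whole-server sketch: mirror the directories holding mailboxes, websites,
# code repositories, and database dumps into one backup tree. The directory
# list and DEST are assumptions -- adjust them to your own layout.
DEST="${DEST:-./server-backup}"
mkdir -p "$DEST"

for dir in /var/mail /var/www /srv/git /var/backups/db; do
  [ -d "$dir" ] || continue                 # skip directories this box lacks
  if command -v rsync >/dev/null 2>&1; then
    rsync -a --delete "$dir" "$DEST/"       # -a preserves permissions and times
  else
    cp -a "$dir" "$DEST/"                   # portable fallback
  fi
done
echo "Server files mirrored to $DEST"
```

Run from cron, this covers static files well, but note the caveat below about files that are open for writing.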
How do you know which files you don’t need to include? This selection is easier to make on Linux than on Windows thanks to a clearer separation of user data and system files. Also, read the fine print on any system you intend to use. Some will skip files that are “in use.” What does this mean on a server?
It definitely includes files that are open for writing. That is not the case for most of your mailboxes or your images or your static web pages, but it can be the case for database files, including the MySQL data files where your WordPress or Drupal content lives. So be careful: your backups might silently be missing essential business data.
Do you take a subsystem by subsystem approach? Take the mailbox, figure out how big you need it to be, and then look around at free cloud-based email solutions. Some of them have free plans that might be enough to store a few years’ worth of email for a couple of users. You could farm out email archives in this way.
Next, you would look at all the media files you generate. Some very big companies give you the first 10 GB of online drive storage for free. Glued together with gallery scripts or WordPress extensions, that could cover your image backup needs.
Backups for WordPress
If WordPress is your blogging platform, you can choose from numerous backup plugins. Recovery is handled by their scripts as well and generally just works. Wix and Squarespace do not offer nearly the same convenience or flexibility. Both require hard choices and awkward manual steps.
The next step to tackle is backing up your databases. Database engines usually include utilities to create a snapshot of the data. You can set it to run nightly and store the snapshot on a local hard drive.
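For MySQL or MariaDB, that snapshot utility is `mysqldump`, and its `--single-transaction` flag takes a consistent InnoDB snapshot without locking the live site. The database name `wordpress` and the output location below are assumptions; credentials are assumed to live in `~/.my.cnf`.

```shell
#!/bin/sh
# Nightly snapshot for a MySQL/MariaDB database. The database name
# "wordpress" and the credentials in ~/.my.cnf are assumptions.
# Schedule the script from cron, e.g. via `crontab -e`:
#
#   15 2 * * * /usr/local/bin/db-snapshot.sh
#
OUT_DIR="${OUT_DIR:-./db-snapshots}"
mkdir -p "$OUT_DIR"

if command -v mysqldump >/dev/null 2>&1; then
  # --single-transaction gives a consistent snapshot of InnoDB tables
  # without locking them while the site stays live.
  mysqldump --single-transaction --routines wordpress \
    > "$OUT_DIR/wordpress-$(date +%F).sql" \
    || echo "dump failed: is the MySQL server running?"
else
  echo "mysqldump not installed; nothing to snapshot"
fi
```

The dated filename means each night’s dump lands beside the previous ones, so a bad change discovered days later can still be rolled back.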
And on down the line, until you have cobbled together many solutions into a hairball of thrifty craftiness. At least you are not at the mercy of one mega-restore process. But that’s a lot of accounts and logins and potential procedures to keep track of.
The advantage is that each service does the one thing it addresses very well. Image files don’t change in the constant streams of small updates that database files do, so they can be backed up less often. With email, it’s comforting to know that your backup can be made operational at the flip of a software switch.
The classic approach
There is a school of thought that strongly believes backup shouldn’t be parcelled out to various outside services. Instead, it needs to be part of the infrastructural bedrock, woven into a daily routine through strict automation. This is exactly what Namecheap’s AutoBackup can provide, and it is free with the premium shared hosting plans, Stellar Plus and Stellar Business.
Ideally, with this kind of backup you should end up with functionality similar to Time Machine on macOS: a file explorer through time. It should give you the ability to compare past states of the system, to trace versions of the files back through time, and to recover either lost files individually or snapshots of the whole directory tree. With Namecheap’s AutoBackup, you can restore to any day of the last week, any week of the last month, and any month of the last year.
If your system relies on physical media, then you need a dependable person making sure the disks or sticks are handled properly, week in, week out. Taking the media offsite is overkill if you also have a cloud backup. But keeping your disk drive or your USB stick in a fireproof safe is not a bad idea. If you don’t have one yet, securing your backups is an extra argument in favor of making the outlay.
Getting your site back up
Did you know that 60% of backups are incomplete and 50% of restores fail? Practice restoring your server. The gold standard for restoration testing is four times per year. That requires some serious planning and a team dedicated to making it as automatic as possible. If you go through the whole process once as a trial run, you are already ahead of most people.
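A trial run doesn’t have to mean touching production. One low-risk drill is to restore an archive into a scratch directory and diff it against the live tree; the paths and names below are examples only.

```shell
#!/bin/sh
# Minimal restore drill: unpack an archive into a scratch directory and
# diff it against the live tree. Paths and names are examples.
SRC="${SRC:-./site}"
mkdir -p "$SRC"
echo "<h1>hello</h1>" > "$SRC/index.html"      # demo content

tar -czf drill-backup.tar.gz -C "$SRC" .       # 1. take the backup

SCRATCH=$(mktemp -d)                           # 2. restore somewhere fresh,
tar -xzf drill-backup.tar.gz -C "$SCRATCH"     #    never over the live site

if diff -r "$SRC" "$SCRATCH" >/dev/null; then  # 3. verify every file matches
  echo "restore drill passed"
else
  echo "restore drill FAILED: backup differs from live data"
fi
```

Restoring into a fresh directory and comparing, rather than overwriting the live site, is what makes this safe to repeat as often as you like.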
Make sure you understand what the restore options are from the different providers.
Restoring from the cloud can take much longer than you think.
Typical tests show that downloading a gigabyte takes on the order of 10 minutes, roughly as long as it took to upload. Speeds vary from provider to provider, and providers should be forthcoming with that information in the form of some kind of guarantee.
But that’s not the whole story when it comes to how fast you can restore from backup. To restore your 50 GB of email, artwork, clips, web content, and database files, you may have to download 500 GB of incremental backups, the change instructions for every save or delete on disk. At that rate, the effective time to restore 50 GB can stretch from hours into days.
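The arithmetic is worth running for your own numbers. This back-of-envelope sketch uses the 10 minutes-per-gigabyte download figure mentioned above; the 50 GB of live data and the 10x incremental-chain multiplier are illustrative assumptions.

```shell
#!/bin/sh
# Back-of-envelope restore estimate. MIN_PER_GB matches the download
# figure above; DATA_GB and CHAIN_FACTOR are illustrative assumptions.
DATA_GB=50        # live data you actually need back
CHAIN_FACTOR=10   # the incremental chain can be many times the live size
MIN_PER_GB=10     # observed cloud download rate, minutes per gigabyte

total_gb=$((DATA_GB * CHAIN_FACTOR))    # 500 GB to pull down
minutes=$((total_gb * MIN_PER_GB))      # 5000 minutes of downloading
echo "Restoring $DATA_GB GB means downloading $total_gb GB: ~$((minutes / 60)) hours"
```

With these assumptions the download alone runs past three days, which is why restore speed deserves as much scrutiny as backup frequency.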
If you sell things online, chances are you manage many aspects of your e-commerce shop on your server: billing, receipts, shipping, inventory, customer relations. These all create database tables, configuration files, and critical reports. Lose these and you could end up in legal trouble too.
If you run a little side business hosting sites for small businesses or friendly organizations, you can get by with a mid-range Linux VPS. But all your development work is on there, in addition to your clients’ lifeline to the world. Losing it could be devastating.
Look at the cost of backup solutions versus the risk of being hacked or of other unforeseen events. For a few dollars a month you can ensure your website will be back online quickly in case of trouble. That might be worth it for the peace of mind alone.