Your startup or company, like many others, already faces plenty of difficult moments, and data loss should not be one of them. Any modern storage medium is imperfect: depending on its type, each has a finite service life before failure, and the only question is when that failure happens. Other factors can disable your storage as well, and all of this often leads to irreversible loss of valuable information. Imagine your cherished project ceasing to exist because of one annoying mistake and a lack of backups. The way out is to create backups in advance, following a number of simple rules and recommendations. We will cover them in this article.
So, what are the causes of data loss? There are actually a lot of them, but the most common are the following:
- Computer hard drive failure.
- Accidental deletion of data (human factor).
- Infection with malware (for example, ransomware), or a targeted hacker attack.
- Compromised employee credentials.
- Phishing attacks.
- Physical theft of equipment.
- Natural disasters, fires, other force majeure circumstances.
At the same time, statistics show that 44% of companies that experience data loss close within 6 months. This can be avoided by keeping a backup and choosing an appropriate place to store it. Most backups are stored:
- On physical media. This can be anything from a flash drive to a physically isolated storage appliance; it all depends on the importance of the information and its volume.
- In the cloud. A relatively newer approach. Cloud technologies let you stop worrying about data safety, because everything is stored on the servers of cloud providers, which are usually well protected and reliable. Examples: Amazon S3, Google Drive, Dropbox.
Best Data Backup Strategy
Copying physical or virtual files or databases to an additional location for recovery in the event of a failure will get you out of difficult situations. Ideally, data is already being copied during development, just in case. Best practice, however, is the 3-2-1 strategy: organizations should keep THREE copies of their data on TWO different types of media, with ONE copy stored off-site (for example, in the cloud).
- Copy No. 1. Local copy. Employees may continue to use these files for regular work.
- Copy No. 2. Local backup. Stored data can be physically accessed locally.
- Copy No. 3. A copy stored outside the office, such as in the cloud or remote storage. If something happens to the office premises or equipment where the backup files are located, they will remain safe.
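The 3-2-1 idea can be sketched in a few lines of Python. This is a minimal illustration, not a production tool; the destination paths are placeholders for a second local medium and an off-site mount (e.g. a mounted cloud bucket):

```python
import shutil
from pathlib import Path

def backup_321(source: str, local_copy: str, offsite_copy: str) -> None:
    """Copy `source` to a second local medium and to an off-site location.

    Together with the working original this yields three copies on two
    different media, one of them off-site (the 3-2-1 rule). The path
    arguments here are hypothetical; the off-site target could be any
    mounted cloud bucket or remote filesystem.
    """
    src = Path(source)
    for dest in (local_copy, offsite_copy):
        target = Path(dest) / src.name
        target.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, target)  # copy2 preserves timestamps and permissions
```

In a real deployment the "off-site" path would be replaced by an upload to object storage or a replication job, but the invariant is the same: three copies, two media, one elsewhere.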
How to make a good data backup
When building a backup strategy, you need to pay attention to two important points:
- Correctly determine which information needs to be saved. Back up data whose loss would genuinely hurt the business. Reinstalling a program is easy; restoring transaction details or business correspondence without a backup is almost impossible.
- Back up data on a regular basis and test disaster recovery. Systematically back up every file you create or modify. It is best to set up automatic backups that run on a schedule and to periodically test your disaster recovery plan; a good system administrator will help you with this. It is also best to grant the ability to change these copies only to technical specialists.
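A regular, automatic backup can be as simple as a timestamped archive written by a script that a scheduler (cron, a systemd timer, Task Scheduler) invokes. A minimal sketch, with hypothetical directory names:

```python
import shutil
from datetime import datetime
from pathlib import Path

def run_scheduled_backup(data_dir: str, backup_dir: str) -> str:
    """Create a timestamped zip archive of `data_dir` inside `backup_dir`.

    Intended to be called from a scheduler so backups happen regularly
    without human action. Returns the path of the archive written.
    """
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    archive_base = str(Path(backup_dir) / f"backup-{stamp}")
    # make_archive appends the .zip extension itself
    return shutil.make_archive(archive_base, "zip", root_dir=data_dir)
```

The timestamp in the name means successive runs never overwrite each other, which also makes later retention pruning straightforward.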
Practical tips for creating data backup
1. A data backup should always be made
No matter how technology evolves, the good old backup will never lose its value, saving us nerves, time, salary, and sedatives at difficult moments. It lets us avoid panicking and act deliberately, taking only reasonable risks. Even if every component in your server is duplicated and the data sits on an expensive array with redundancy, shake off any false sense of security: no one is immune to logical errors and the human factor.
Imagine a negligent programmer accidentally running the rm -rf command in the root directory. In case you didn't know, this deletes absolutely every file stored there. How incompetent does one have to be to make such a mistake? Ask the staff at Pixar Animation Studios: their famous film "Toy Story 2" was deleted this way. Twice.
2. The data backup should be automatic
Only an automated backup that runs on a schedule gives you a chance to recover reasonably recent data, say from yesterday rather than from March. A backup should also be made before any potentially dangerous operation: hardware upgrades, microcode updates, installing patches, data migration, and so on.
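The pre-operation backup mentioned above can be a small one-off helper that copies the affected file aside before a patch or migration touches it. A sketch with an assumed default directory name:

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_before(path: str, backup_dir: str = "pre-change-backups") -> str:
    """Copy `path` aside before a risky operation (patching, migration).

    The copy gets a timestamp suffix so repeated runs never overwrite an
    earlier pre-change backup. Returns the path of the copy, so the
    calling script can log exactly what to roll back to.
    """
    src = Path(path)
    dest_dir = Path(backup_dir)
    dest_dir.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    dest = dest_dir / f"{src.name}.{stamp}.bak"
    shutil.copy2(src, dest)
    return str(dest)
```

Calling this at the top of an upgrade script costs seconds and turns "we broke the config" into "we restored the .bak file".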
3. Recovery from a backup is a last resort
Resort to recovery only when there is no other option, because it always happens slowly, in a hurry, and under considerable stress. It is better not to restore to the original location on disk, so as not to overwrite the original data.
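Restoring to a separate staging directory instead of the live path can be sketched like this (the archive layout assumed here is a plain zip, as a scheduled zip backup would produce):

```python
import zipfile
from pathlib import Path

def restore_to_staging(archive: str, staging_dir: str) -> None:
    """Unpack a backup archive into a staging directory, NOT the live path.

    Restoring next to, rather than over, the original data lets you
    inspect the restored files and copy back only what you need, instead
    of overwriting whatever is left of the originals.
    """
    dest = Path(staging_dir)
    dest.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(archive) as zf:
        zf.extractall(dest)
```

Once the staged files are verified, moving them into place is a deliberate, reviewable step rather than an irreversible one.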
4. The backup must be stored separately from the data and for at least 2 weeks
This period gives even a sluggish accountant time to realize that something is missing or corrupted. You can store backups longer if space allows. Traditionally, magnetic tape was used to store backups because it was cheap, but nowadays disk and cloud storage are in full use.
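A retention rule like "keep at least two weeks" is easy to enforce with a small pruning pass over the backup directory. This sketch assumes the timestamped `backup-*.zip` naming used earlier; the names are illustrative:

```python
import time
from pathlib import Path

def prune_old_backups(backup_dir: str, keep_days: int = 14) -> list:
    """Delete backup archives older than `keep_days` days.

    Fourteen days matches the rule of thumb above; raise `keep_days` if
    storage allows. Returns the paths that were removed, for logging.
    """
    cutoff = time.time() - keep_days * 86400
    removed = []
    for f in Path(backup_dir).glob("backup-*.zip"):
        if f.stat().st_mtime < cutoff:  # older than the retention window
            f.unlink()
            removed.append(str(f))
    return removed
```

Running this right after each scheduled backup keeps the directory bounded without ever dipping below the retention window.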
5. It is useful to duplicate backups to a remote site
Man-made disasters, city-wide blackouts, a zombie invasion, and other troubles lie in wait for business around every corner. Zombies remain especially poorly understood. It is therefore good practice to have a remote site where, one way or another, backups end up.
How can this be organized? You can simply store the backup somewhere in a desk drawer, or build or outsource a full-fledged backup data center with a good network channel; it all depends on your willingness and budget. In the latter case, backups can be duplicated automatically using modern backup tools. For example, with a Symantec NetBackup Appliance at each site, backups from the main site can be replicated to the second one via Automatic Image Replication (AIR) technology.
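In the simplest setups, off-site duplication is just mirroring the backup directory to wherever the second site is reachable. This sketch treats the remote site as a mounted path (NFS, SSHFS, a synced cloud bucket); real deployments would use rsync or the backup product's own replication, but the invariant is the same, every backup exists at two sites:

```python
import shutil
from pathlib import Path

def mirror_to_remote(backup_dir: str, remote_mount: str) -> int:
    """Copy backup archives missing from `remote_mount`; return the count.

    `remote_mount` is a stand-in for however the remote site is mounted
    locally. The copy is incremental: archives already present on the
    remote side are skipped.
    """
    remote = Path(remote_mount)
    remote.mkdir(parents=True, exist_ok=True)
    copied = 0
    for f in Path(backup_dir).glob("backup-*.zip"):
        target = remote / f.name
        if not target.exists():
            shutil.copy2(f, target)
            copied += 1
    return copied
```

Because the function is idempotent, it can run after every local backup without re-sending archives the remote site already has.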
6. The backup must be consistent
It is impossible to create a consistent copy of data without pausing writes: information changes constantly, part of it sits in buffers, part in RAM. Serious industrial-grade products such as Symantec NetBackup, EMC NetWorker, CommVault Simpana, and others ship with a wide range of agents for working with various business applications. These agents can put an application into a mode in which its buffers are flushed to disk and its data files temporarily stop changing. Without them, you would have to stop the application outright, leaving the server idle, which is rather expensive.
7. Minimize the load on the main system
To avoid keeping the application in this mode for long, the technique can be combined with instant snapshots of the data. A snapshot is created quickly, so the application can be "released" almost immediately, and the consistent data can then be copied from the snapshot. Snapshots are created by dedicated agents, which may also be part of the backup software.
If you create not just a snapshot but a full clone, you can detach it from the original data disk. On another host, the backup program will see this data and transfer it to backup storage. This technique is called off-host backup.
There are many other nuances; we cannot cover everything in one article. The main commandment: give data backup enough attention and don't put it off for later. And try to make sure it is never actually needed. Remember, responsibility for these preparations lies equally with the company's owner and with the developers who implement them. But this is far from the only thing worth remembering, and if it doesn't all fit in your head, perhaps you should contact specialists for detailed consulting.