When you think of disaster recovery, data quality is likely not the first thing that comes to mind. But data quality should factor prominently into your disaster recovery plan.
Forming a Disaster Recovery Plan
Your plan should:
- Identify all data sources that need to be backed up so that they can be recovered in the event of a disaster.
- Specify a method or methods for backing up the data.
- Identify how frequently backups should occur.
- Determine whether on-site data backups are sufficient for your needs, or if you should back up data to a remote site (in case your local infrastructure is destroyed during a disaster).
- Specify who is responsible for performing backups, who will verify that backups were completed successfully, and who will restore data after a disaster.
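The plan elements above can be captured in a machine-readable form so they can be reviewed and checked for completeness automatically. The following is a minimal sketch; all of the names (data sources, schedules, team names) are hypothetical examples, not part of any particular tool.

```python
# A hedged sketch of a disaster recovery plan as a manifest.
# Every value here is a hypothetical example.
backup_plan = {
    "sources": ["orders_db", "customer_files", "app_configs"],  # data to back up
    "method": "incremental",     # how backups are taken
    "frequency_hours": 24,       # how often backups occur
    "remote_copy": True,         # off-site copy in case local infrastructure is lost
    "roles": {
        "perform": "ops-team",         # who runs the backups
        "verify": "data-quality-team", # who confirms they completed successfully
        "restore": "ops-team",         # who restores data after a disaster
    },
}

def plan_is_complete(plan: dict) -> bool:
    """Check that the plan specifies every element listed above."""
    required = {"sources", "method", "frequency_hours", "remote_copy", "roles"}
    required_roles = {"perform", "verify", "restore"}
    return required <= plan.keys() and required_roles <= plan["roles"].keys()
```

A check like `plan_is_complete` can run as part of a periodic audit, so a plan that silently loses a section (for example, after a reorganization changes team ownership) is flagged before a disaster exposes the gap.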
If you need help building and implementing a disaster recovery plan, you can find entire companies dedicated to the purpose. With the right planning and skills, however, there is no reason that you cannot maintain an effective disaster recovery plan yourself. Regardless of whether you outsource disaster recovery or not, the most important thing is simply to have a plan in place. (See also: 5 Tips for Developing a Disaster Recovery Plan)
Data Quality and Disaster Recovery
Now that we’ve covered the basics of disaster recovery, let’s discuss where data quality fits in.
Put simply, data quality matters in this context because every backup and every restore is an opportunity to degrade your data. Since backups and restores are at the center of disaster recovery, data quality should be factored into every phase of your disaster recovery plan.
After all, when you’re copying data from one location to another to perform backups, data quality errors are easy to introduce for a variety of reasons. You might run into formatting issues when copying files between operating systems that use different encoding standards. Data could become corrupted in transit. Backups could be incomplete because you run out of space on the backup destination. The list could go on.
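One simple defense against silent corruption in transit or truncated copies is to compare a cryptographic checksum of the backup against the original. Here is a minimal sketch using Python's standard library; the file names are hypothetical examples.

```python
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks
    so large backups do not need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def backup_and_verify(source: Path, destination: Path) -> bool:
    """Copy source to destination, then confirm the copy is byte-identical.

    A False return means the backup is corrupt or incomplete and
    should be retried before it is trusted for recovery.
    """
    shutil.copy2(source, destination)  # copy2 also preserves metadata
    return sha256_of(source) == sha256_of(destination)
```

The same `sha256_of` check can be run again at restore time, against checksums recorded when the backup was taken, to confirm that what you are recovering matches what you backed up.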
It’s even easier to make data quality mistakes when you’re recovering data after a disaster. Even the most prepared organization will be working under stress when it’s struggling to recover data after a disaster. The personnel performing data recoveries may not be familiar with all the data sources and formats they are restoring. In the interest of getting things up and running again quickly – a noble goal when business viability is at stake – they may take shortcuts that leave data missing, corrupted or inconsistent.
All of the above are reasons why data quality tools should be used to verify the integrity of backed-up data, as well as data that is recovered after a disaster. It’s not enough to check the quality of your original data sources, then assume that your backups and the data recovered from those backups will also be accurate. They might not be, for all the reasons outlined above and many more.
The last thing your business needs after it has suffered through and recovered from a disaster is lasting problems with its data. To prevent a disaster from having a lasting effect on your business, you must ensure that the data you’ve recovered is as reliable as your original data.
Syncsort’s data quality software and disaster recovery solutions can help you build your disaster recovery plan. Learn about why Syncsort is a leader for the 12th consecutive year in Gartner’s Magic Quadrant for Data Quality Tools report.