A little more than 10 years ago, I wrote an article for InfoStor magazine exposing IT’s Dirty Little Secret: Backup was grossly inefficient – and IT knew it.
Back then, nobody talked about the backup problem because there was little that could be done about it. Backup teams either lacked the tools to determine whether they were backing up everything they were supposed to, or they couldn't extract the information they needed with the tools they did have. And so, discussion in IT shops centered on backup speeds and feeds (of tape devices, mind you)… there was little talk of recovery. Scary.
Five years later, disk-based backup and the increasing adoption of deduplication technology revolutionized the way everyone thought about and did backup. IT conversations shifted from “Is my data protected?” to “How fast can I recover my data in the event I need to?” Discussions focused on improving RTOs and RPOs and on continuing to reduce backup windows. All was good.
Today, while much of the conversation in IT shops still centers on backing up and recovering faster and more efficiently, there are new concerns about backup’s ability to keep pace. Once again, issues of trust, albeit different ones, are surfacing; only this time the stakes are much, much higher. If you’re thinking in terms of downtime costs, think again. There’s a direct link between backup and application deployment, productivity (business and IT), innovation and revenue.
At EMC World last week, Guy Churchward, president of EMC Backup Recovery Systems, talked about the new strategic relevance of backup to an organization in a Cube interview. Churchward compared backup to the plumbing inside a house: “If your house doesn’t have a good infrastructure, it doesn’t matter what the drapes look like,” he said.
So, how’s your plumbing?