“You used to be much more… ‘muchier.’
You’ve lost your muchness.”
– The Mad Hatter to Alice in the film Alice in Wonderland.
Really, what could be worse than losing your muchness? Your mojo?
In the IT world, it can be the death knell of your business, signifying an impending downward spiral or deceleration.
There’s healthy “stick-to-it-iveness” and, well, then there’s, as Rich Castagna, editorial director of Storage magazine puts it in his May editorial, doing something [the same old way] “because that’s the way we do it.”
One’s okay… the other not so much. One keeps your business moving forward and in new directions… the other slows it down, and limits opportunities.
The good news is research data continues to show that more of you are rejecting the notion that doing nothing or the same-old-thing is a good thing … for your organization or for your own professional well-being, for that matter. You see the bigger picture (pun intended!), and you are transforming your backup environments and your thinking.
#1: Your company still views backup as a cost center
As a barometer (of the change that’s occurring in the market today), we polled the live Backup Game Day audience at EMC World last month to see what’s really going on in customer environments. It’s a good reality check for us, and a good opportunity for users to see where their peers are. Plus, it’s just a fun way to engage with you.
Anyway, we asked a bunch of questions, but one that stood out was, “How does your company view backup: as a cost center; a tactical necessity (i.e., protection against ‘what if’ scenarios); a game-changer (i.e., a strategic investment in your company’s future); or evolving (i.e., somewhere in between a tactical necessity and a game-changer)?”
Not (too) surprisingly (after all, change is tough, particularly in backup circles), the majority of Game Day respondents said backup was still viewed as a tactical necessity. However, 26% said it was a game changer, and only 1% said it was nothing more than a cost center. So, I’m curious: where does your organization fall within this spectrum? Comment below, and let us and your fellow TBW readers know. And if your organization is a straggler, what’s holding you back?
#2: You’ve never heard of PBBA
Short for Purpose-Built Backup Appliance, PBBA adoption is another very good barometer of change, and according to the just-released IDC Quarterly Worldwide Purpose-Built Backup Appliance Tracker, the market is still red hot! Revenue, capacity and shipments are all up significantly year-over-year, and EMC is still driving the market, with more than 4x the market share of our nearest competitor.
So, what about you? Got PBBA? If you don’t, the following IDC paper is a great level-setting resource: Backup and Recovery Changes Drive IT Infrastructure and Business Transformation. Also, be sure to read Stephen Manley’s blog series: The Right Architecture Is Priceless. In that series, Stephen talks about the role Protection Storage (a.k.a., PBBA) plays in a transformed backup environment.
#3: Your CIO is hyper-focused on saving a buck
The times they are a-changin’, and so must CIOs.
If you’re a regular follower of The Backup Window and my posts, then this may sound familiar. Click on this link, and it’ll bring you to a post I wrote back in the spring.
The bottom line is that for transformation to be successful, it’s got to start at the top – and extend down and across the organization. This means CIOs need to be 100% onboard with the need for and reality of transformation, and it means re-setting priorities so they’re in line with a services-oriented mindset.
And this is happening…
Of the more than 1,500 CIOs and other IT leaders IDG surveyed as part of its recent CIO Research study, 49% ranked improving IT productivity as their number-one goal for 2013, followed by better, faster decision-making; improving service levels; protecting corporate data; and increasing agility. As for reducing costs – again, the traditional front-runner – it ranked a distant eighth in the survey.
So, what about your organization? Has it lost its muchness? And what about you?
Representing EMC Backup Recovery Systems at Cisco Live, I witnessed first-hand exceptional collaboration between EMC partners around booth activity, social media promotion, and product expertise. VCE is the perfect example. How many VCE Vblock Systems did you notice on the show floor? Check out my Cisco Data Center blog to learn more!
You may have recently read The Backup Window’s May 17 post “Forget the Drapes…How’s Your Plumbing?”, in which Heidi Biggar talks about the important relationship between backup architecture and application deployment, productivity, innovation and ultimately revenue.
Also in that article, Heidi shared a video of Guy Churchward, president of EMC’s Backup Recovery Systems division, at EMC World last month. In this video, Guy compares backup to the plumbing of a house: without solid plumbing, it doesn’t matter what the rest of your environment looks like, because you won’t be able to scale to address the exponential growth that big data brings.
I’m going to take that argument one step further and tell you that while having a good backup and recovery infrastructure (a.k.a. the plumbing) IS important, effective management of that infrastructure may require you to mask it. Let me explain.
Modern, unified, non-disruptive data protection infrastructures are complex, though you might not have all of the components described here in play today. It really depends on what your business actually needs.
Starting on the left side, we see some virtualized hosts with applications, some physical hosts and primary storage. You may have some particularly challenging mission-critical applications with aggressive RTOs and RPOs, and you may be using replication for those. But all of this needs to be backed up and protected. You’re likely using your backup manager of choice, which may also be backing up your VMs directly. And eventually those save sets of data are going to make their way down to an archive device.
That’s the infrastructure Guy spoke of – it’s important and it performs a vital task for your business. However, as I mentioned, it’s complex. It’s simply not possible to effectively monitor each data protection component individually, particularly if there are multiple backup applications or many archive devices. Visibility is crucial, and in order to get a holistic, end-to-end view of the environment you need to mask the complexity. That’s where data protection management software like Data Protection Advisor can help.
Case in point: in speaking with many customers over the past few years, we’ve learned that the SLAs they were being asked to meet as part of their organizations’ transformations weren’t focused on the success of any individual backup (i.e., they didn’t care whether a backup succeeded on the first, second or nth attempt) but rather on the speed and precision of the overall process. Customers really wanted to know that their data was being protected within specified time windows and that it had reached the designated vaulting location or device.
And to be able to do this, you need to be able to see and manage the entire environment.
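To make that SLA-centric view concrete, here’s a minimal sketch of the kind of check a data protection management tool performs. The job records, field names and cutoff are all hypothetical, purely for illustration – this is not the Data Protection Advisor API:

```python
from datetime import datetime

# Hypothetical backup-job records; field names are illustrative only.
jobs = [
    {"client": "db01",  "finished": datetime(2013, 6, 10, 2, 40), "vaulted": True},
    {"client": "web01", "finished": datetime(2013, 6, 10, 9, 15), "vaulted": True},
    {"client": "app01", "finished": datetime(2013, 6, 10, 3, 5),  "vaulted": False},
]

# The SLA cares about the overall process: data protected within the window
# AND landed at the vaulting target -- not how many retries it took.
window_end = datetime(2013, 6, 10, 6, 0)  # e.g., backups must complete by 6 a.m.

def sla_breaches(jobs, window_end):
    """Flag clients that missed the window or never reached the vault."""
    return [
        j["client"]
        for j in jobs
        if j["finished"] > window_end or not j["vaulted"]
    ]

print(sla_breaches(jobs, window_end))  # → ['web01', 'app01']
```

Note the design point: the check never asks which backup application or appliance did the work – only whether the end-to-end process met its window and destination.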
Abstracting Management as a Change-Enabler
There is another important capability these tools bring: by separating the management view of the protection infrastructure from the various technologies deployed, IT is empowered to make operational changes to the environment. (I’ll explain this, too.)
Service providers and enterprise IT shops alike are looking for ways to beat out the competition by investing in new technologies that will help them differentiate on cost or performance. But swapping one technology for another breaks, or at least disrupts, management and visibility of the protection environment. Each new technology brings its own variation on ‘how things should be done.’
However, by abstracting management views of the entire environment away from the underlying technology, the service provider’s management view and control of end-to-end protection processes are buffered from any change in the data protection ‘plumbing’. These management tools become a change enabler (or transformation enabler) by simplifying the environment and removing the worry and hassle that often accompany transformation. In other words, your management tools can become a change enabler independent from your underlying data protection technology.
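As a rough sketch of what that abstraction looks like in practice, the management layer can code against a common interface while each piece of ‘plumbing’ gets its own adapter. The class and field names below are invented for illustration; real appliances would be queried through their own APIs:

```python
from abc import ABC, abstractmethod

class ProtectionTarget(ABC):
    """Abstract view of any backup target; management code depends only on this."""
    @abstractmethod
    def utilization(self) -> float:
        ...

class ApplianceTarget(ProtectionTarget):
    # Hypothetical adapter for a disk-based backup appliance.
    def __init__(self, used_gb: float, total_gb: float):
        self.used_gb, self.total_gb = used_gb, total_gb
    def utilization(self) -> float:
        return self.used_gb / self.total_gb

class TapeLibraryTarget(ProtectionTarget):
    # Hypothetical adapter for a tape library.
    def __init__(self, full_slots: int, slots: int):
        self.full_slots, self.slots = full_slots, slots
    def utilization(self) -> float:
        return self.full_slots / self.slots

def capacity_report(targets):
    """One management view across heterogeneous plumbing."""
    return {type(t).__name__: round(t.utilization(), 2) for t in targets}

print(capacity_report([ApplianceTarget(600, 1000), TapeLibraryTarget(30, 40)]))
# → {'ApplianceTarget': 0.6, 'TapeLibraryTarget': 0.75}
```

Swap a tape library for a new appliance and only the adapter changes; the management view, and the processes built on it, carry on undisturbed – which is the buffering effect described above.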
Somewhat related to this is EMC’s recent announcement of ViPR Software-Defined Storage. You’ve probably heard how ViPR can “Virtualize Everything. Compromise Nothing.”
ViPR provides a revolutionary approach to storage automation and management, transforming existing heterogeneous physical storage into a simple, extensible and open virtual storage platform. This means organizations don’t have to give up choice as they grow, and management costs don’t have to go through the roof, either.
With ViPR, organizations get a simple, unified way to manage virtual and physical storage that not only protects their investments today, but can also dynamically adapt and respond to future requirements.
While DPA isn’t quite the same as ViPR, and ViPR is intended for primary storage, the underlying goal is the same: simplify complexity through automation and centralized management.
And that gives you the freedom of choice and the flexibility to select the plumbing components you need to drive your transformation.
“Architecture should speak of its time and place, but yearn for timelessness.”
– Frank Gehry
During the EMC Backup Recovery Systems’ keynote at EMC World, Guy “Haybale” Churchward shared his perspective as a British homeowner. His house was built 150+ years ago, and it will stand for another 150+ years. Therefore, while he makes it his home right now, he feels a responsibility to improve it for the next owner (check out his recent blog post). The home ties together people who will never meet. The right architecture, from St Paul’s in London to Hagia Sophia in Istanbul to Guy’s house, can both connect and inspire across generations.
In this series, I introduced the Protection Storage Architecture and explored the Protection Storage component. This time – Data Source Integration. (To start the series at the beginning, click here.)
Data Source Integration – Why Does it Matter?
Performance and visibility. When they are missing, users lose confidence in the protection team. They slow their development. They roll their own solutions. They lose data.
Performance and visibility. That’s how the protection team can drive the business. Faster backups and restores minimize data loss and downtime, reduce management complexity, and increase the likelihood of data recovery. With visibility into their data protection, application teams and end users gain confidence, accelerate innovation, and remain safe.
Performance and visibility. How can the protection team deliver? Data source integration. Each team believes its data source – the application, the hypervisor, the storage array or the server – “owns” the data (in a virtualized world, multiple teams claim data ownership, until things go wrong; then, all of a sudden, it’s the backup team’s data). The data source touches every bit of information that its users generate or access; its management interface provides administrative control. By sitting in the data path, the data source can optimize protection performance. By incorporating protection controls into its UI, the data source can provide visibility to the data owners in their preferred interfaces.
Data source integration delivers the protection performance and visibility that organizations need.
Data Source Integration – Performance
Data sources optimize protection performance compared to traditional backup clients because they sit in the data path.
A standard backup agent works very hard, but not very smart. The agent sits idle until backup time, when it wakes up and looks for new data to protect. (I’m assuming you’re running incremental-forever versioned replication – if you’re still running frequent fulls, this discussion may feel like you’re sitting in a Peugeot 306, watching the TGV thunder by.) Backup agents look at every file in the data set, checking timestamps to detect whether it has been modified. Yes, the agents look at every … single … file. Once an agent locates a new or modified file, modern agents then checksum the data to identify the new data within that file (a critical optimization for protecting large files or using a low-bandwidth network). Backup clients run the storage equivalent of a search for needles in haystacks. This approach is far better than running a full backup (maybe now you’re sitting in a Ford Aspire watching South Korea’s KTX2 zoom past), but customers continue to hit traditional backup clients’ scalability limits.
The data source, on the other hand, can track exactly what data needs to be protected. Whether it is the application, the hypervisor, the storage, or the server, it owns the data. The data source executes the users’ every data creation, modification, and deletion, so it can keep a log, a journal, or a bitmap of those changes. Therefore, at backup time, the data source already knows exactly what to protect. There is no need to look at every file, no need to checksum every chunk. Instead of searching for needles in a haystack, the data source hands the backup process a pre-ordered set of needles. Even better, when it comes time to restore, it can ask for just those needles back!
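The difference between the two approaches can be sketched in a few lines. This toy model (invented file names and a simple change list standing in for a real change journal or changed-block bitmap) just shows why the data source wins: it already knows what changed, so backup time needs no scan at all.

```python
import hashlib

# Toy data set: filename -> (mtime, contents). Names are illustrative.
dataset = {
    "a.dat": (100, b"alpha"),
    "b.dat": (200, b"bravo-v2"),   # modified since the last backup
    "c.dat": (100, b"charlie"),
}
last_backup_time = 150

def agent_scan(dataset, last_backup_time):
    """Traditional agent: walk *every* file, compare timestamps, then checksum."""
    changed = {}
    for name, (mtime, data) in dataset.items():             # touches every file
        if mtime > last_backup_time:
            changed[name] = hashlib.sha1(data).hexdigest()  # checksum the changed data
    return changed

# Data-source change tracking: the source logged the change as it happened.
change_journal = ["b.dat"]

def journal_backup(dataset, journal):
    """Change-tracked backup: protect exactly the pre-identified changes."""
    return {name: hashlib.sha1(dataset[name][1]).hexdigest() for name in journal}

# Same result, but the journal path never scanned a.dat or c.dat.
assert agent_scan(dataset, last_backup_time) == journal_backup(dataset, change_journal)
```

The scan costs grow with the total number of files; the journal costs grow only with the number of changes – which is why change tracking scales where timestamp scans hit their limits.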
Some of the leading vendors that can optimize backups via tracking changes include: VMware (Changed Block Tracking), Oracle (Block Change Tracking), EMC (RecoverPoint, TimeFinder Clones, SyncIQ, …), NetApp (SnapDiff), and Microsoft (Filter Drivers and Change Journal). In other words, the options are widespread.
Because data sources sit in the data path and can track new and modified data, integrating with them can reduce backup and recovery times from days and hours to minutes or seconds.
Data Source Integration – Visibility
Data sources optimize protection visibility by connecting to users via their preferred interfaces.
Technology developers define ‘simple’ differently from the rest of us. Take EMC’s “very simple” goal management system. Every quarter, I must approve my employees’ MBOs in this application. While it has a well-designed UI and management flow, you can guess what I’m doing 5 minutes before close-of-business on the MBO deadline. I’m screaming at my computer about the incomprehensibility of the system, the pointlessness of MBOs and the series of Palahniuk-level horrors I want to visit upon HR, IT and the application developers. When you log in to an interface once a quarter, no matter how simple it is, you re-learn it each time. If I could approve via my normal tools – email, bug tracking system or source code repository – MBOs would take under a minute. Now that’s simple!
Regardless of how simple, elegant, or fun… another interface adds complexity, especially when the customer rarely uses the interface. End-users and administrators do not want to log into a backup application interface. They want to see and manage their protection from their primary tool – vSphere, Oracle, SAP, Unisphere, NFS/CIFS share, etc. If their application does not support a protection view, the backup vendor should provide an interface with the same look and feel as their common tool. Only then will they feel comfortable with the protection environment.
Not surprisingly, the same data source vendors who are optimizing the protection data path are also enhancing the protection control path.
Because data sources are the users’ central interface, integrating with them can improve visibility into, and confidence in, the protection environment.
Data Source Integration – Proof that Data Protection Matters
Performance and visibility have driven the decade-long renaissance of protection innovation. The industry’s data source titans understand that data protection matters. Oracle, Microsoft, VMware, NetApp, and EMC have optimized protection performance and visibility. Ten years ago, these vendors would have said, “That’s a backup software problem” or “Upgrade your hardware to get better backup performance” because companies do not spend resources solving “somebody else’s problem”.
Today, they invest because protection has become their problem. Protection is the primary inhibitor to the growth of their big data applications and infrastructure. As the data sources, they have both the incentive and the unique ability to help solve it. Their investment in delivering solutions demonstrates the importance of data protection to your environment.
Therefore, as you design your protection environment for today and the future, data source integration is a critical component of your architecture. Protection has become integral to the data sources, so the data sources must be integrated into your protection architecture.