T-Minus 2 hours, and Counting – VCE Mega Launch Is Here!

Chandra Jacobs
I love creative and challenging projects in the emerging technology product space. I have a background in tech, innovation, and product development, especially as applied to web and mobile apps in the entrepreneurship arena, but have recently moved into marketing. In my role as a product marketer, I have gravitated toward digital marketing as well as analytics/data mining. It fits well with my techie geek bent as well as my cloud angle on The Backup Window. (Be sure to catch my posts on Innovation Station too!) Outside of work at EMC, I enjoy exploring Boston’s culinary and jazz scene (often in combination), and travel as much as I can (35 countries and counting).

Get your popcorn ready! Today is the day VCE holds its virtual launch, and EMC Backup and Recovery is proud to form the foundation of VCE’s Data Protection Suite.

The clock is winding down to the 11AM EST event, which you can still register to watch live here. I plan to watch it live from the web with some of the EMC CTO staff here in Hopkinton. What about you?

VCE will be announcing several exciting products. These new solutions all run on the EMC technology you know and love, and push VCE into the small and medium-sized business market, leveraging EMC’s VNXe small-business storage array.

The Data Protection Suite continues to be powered by EMC Avamar and EMC Data Domain, which are optimized for highly virtualized and converged environments. You can learn more about these solutions from our recent webcast, and also download the white paper written by expert industry senior analyst Jason Buffington from the Enterprise Strategy Group (ESG).

Congratulations again to VCE! Having grown steadily for the past 3 years, the company has recently celebrated the following milestones:

  • $1 Billion Annual Run Rate
  • 1,000th Vblock System Sold
  • #1 Market Share – over 57%, according to Gartner

I’m looking forward to more great VCE products and solutions in the future, and am excited to be part of the movement to IT convergence.

We Keep Getting Better

Heidi Biggar

Marketing and IT Consultant, Data Protection and Availability Division
I’m often asked how a political science major at Tufts wound up in the IT world, covering backup, storage, virtualization and cloud of all things. Truth is, it’s really a love for learning, a need to understand the “bigger picture” and a desire to share that view with others that’s steered my path over the past 20 years, from campaign manager to editor, analyst and marketer. After hours, you’ll find me hanging with family, running 10ks through Peachtree City’s 90 miles of cart paths, watching football or reading. I’m a New England transplant enjoying life in the South. In my previous life, I also blogged for ComputerWorld, Enterprise Strategy Group and Hitachi Data Systems, but The Backup Window is my baby. It's been great watching it evolve.

Valentine’s Day Present

Phil George

Avamar/VMware Guru, Data Protection and Availability Division
Working with customers and partners (like VMware) to develop leading backup solutions makes every day very interesting; helping them optimize their backup architectures for virtualized environments is what really energizes me. Over the past 25 years, I’ve held senior engineering, marketing and sales roles within the technical software industry. This gives me a good vantage point to recognize technical challenges, see emerging trends and propose new solutions. I hold a BSEE from Cornell University and a Masters in Computer Engineering from Boston University. I currently reside with my wife and two children in Massachusetts.

It’s that time of year when love and romance are in the air. I’m sure everyone is looking for a good Valentine’s Day present, so look no further: I have a great suggestion (well, that is, if your loved ones happen to be technically savvy and run their virtual machines in vSphere).

If they are, then they need to ensure they can always recover reliably. And what says “I love you” more than vSphere Data Protection (VDP) Advanced?

Powered by EMC Avamar
Today, VMware announced VDP Advanced, which VMware and its partners will sell. VDP Advanced is powered by EMC Avamar deduplication backup software and requires vSphere 5.1 or later. Its key capabilities include:

  1. Deduplicated backup software and a dynamic virtual appliance that scales up to 8TB
  2. Image-level backup leveraging Changed Block Tracking for both backup and restore
  3. Application agents for Microsoft SQL Server and Exchange to ensure application-consistent recoveries
  4. Self-service restore
  5. Direct integration with the vSphere Web Client
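Capability 2 above, image-level backup with Changed Block Tracking, can be sketched as a toy model. This is my own simplification for illustration, not VMware’s or Avamar’s actual implementation: the hypervisor records which blocks changed since the last backup, an incremental pass copies only those blocks, and a restore replays the full image plus the incrementals.

```python
# Toy model of Changed Block Tracking (CBT) incremental backup.
# Hypothetical simplification: real CBT is maintained by the hypervisor
# per virtual disk, not by application code like this.

def full_backup(disk):
    """Copy every block (a 'full' image-level backup)."""
    return dict(enumerate(disk))

def incremental_backup(disk, changed_blocks):
    """Copy only the blocks CBT reported as changed since the last backup."""
    return {i: disk[i] for i in changed_blocks}

def restore(full, incrementals):
    """Apply the full image, then each incremental in order."""
    image = dict(full)
    for inc in incrementals:
        image.update(inc)
    return [image[i] for i in sorted(image)]

disk = ["a", "b", "c", "d"]
base = full_backup(disk)

disk[1] = "B"                      # guest writes; CBT marks block 1 changed
inc1 = incremental_backup(disk, {1})

disk[3] = "D"                      # another write; CBT marks block 3
inc2 = incremental_backup(disk, {3})

assert restore(base, [inc1, inc2]) == ["a", "B", "c", "D"]
```

The same changed-block information helps on the restore side too: only blocks that differ from the current disk state need to be written back, which is why the capability list calls out CBT for both backup and restore.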

Avamar’s production proven quality, reliability and performance make VDP Advanced a strong solution for customers who need fast, easy and reliable backup for their small to mid-sized VMware environments.

Visit the VMware web site for more information:

http://www.vmware.com/products/datacenter-virtualization/vsphere-data-protection-advanced/overview.html

Buy VDP Advanced at:

http://www.vmware.com/products/datacenter-virtualization/vsphere-data-protection-advanced/buy.html

 

How to Stop Database Backup Fights

Gene Maxwell

Technical Marketing, Data Protection and Availability Division
I am known by many as the creator of documentation that helps others easily understand technology. This is because I discovered that I myself was a visual learner as I worked in many different IT roles over the years. Prior to my technical marketing role, I was an EMC technical consultant for six years. I also have many years of experience as a customer in IT responsible for data center management & disaster recovery, including backups. My hobbies include building PCs, collecting movies (Casablanca is my favorite), singing and playing my guitar. I have a twin brother who is three minutes older than I am.

IN THIS CORNER, WEIGHING IN AT 180 POUNDS, LEAD BACKUP ADMIN JOE BACKUP. AND IN THIS CORNER, WEIGHING IN AT 175 POUNDS, SENIOR ORACLE DATABASE ADMIN JOHN DATABASE. TODAY’S FIGHT WILL BE 10 ROUNDS. THE WINNER: CONTROL OF ORACLE DATABASE BACKUPS.

Sound familiar? 

Does a fight like this break out regularly in your IT organization over who controls Oracle database backup and recovery? Are you one of the participants, or are you someone on the sidelines who’d like to sell tickets and make a few bucks? The challenges seem endless, and everyone’s getting tired of all this in-fighting.

Here are some of the punches we’ve seen thrown:

  • (left) DBAs want daily full backups to maximize critical database recovery; (block) the backup team says full backups take too long and are hard to complete within backup windows.
  • (right) DBAs want to be able to run more than one backup on some days; (left) the backup team can barely get one done within backup windows, and backup resources are needed for other backups.
  • (left) DBAs want to keep weeks or months of full backup retention; (right) the backup team says those backups take up too many resources and wants to limit retention to a couple of days or, at most, a week.
  • (left) DBAs want some backups done on-demand, not just when scheduled; (block) the backup team is busy with its regular schedule, and bandwidth and target backup devices are limited. (right) DBAs are tired of asking; (left) the backup team is tired of interruptions.
  • (right) DBAs want on-demand reports on backup success and offsite copies; (uppercut) the backup team is busy doing the rest of its work and tired of being bothered all the time.

How can you stop the fighting?  

Actually, it’s not as difficult as you may think. In fact, it takes just two simple steps:

  1. First, switch your slow unreliable Oracle database backups over to an extremely fast deduplication storage solution. 
  2. Then give your Oracle DBAs total control of their own backups and recovery using the Oracle RMAN utility that they already know and trust. 

If you’re the backup admin, don’t panic yet; we’ve got you covered too! You can leverage data protection management software that gives backup admins a single view of all replication and backup, even if you’re not controlling it. Here’s what our RMAN-direct-to-deduplication-storage solution can do:

  • Speed up your backups, typically by 50% or more, so that daily full backups can be completed within backup windows with breathing room to spare for data growth.
  • Ensure data integrity for critical Oracle databases with the industry’s best data protection.
  • Allow Oracle DBAs to perform their own database backups and recoveries using the RMAN GUI or CLI.
  • Provide RMAN full catalog awareness for all local and DR database copies.
  • Provide longer, space-efficient database retention through variable-length inline deduplication technology.
  • Provide cost-effective and bandwidth-efficient replication controlled by RMAN.
  • Eliminate slow, unreliable physical tape and all the associated problems and risks.
  • Dramatically improve backup and recovery reliability and performance through automatic path load balancing and failover, with fewer failed backup jobs to restart.
  • Establish logical quotas that limit shared deduplication capacity to limits agreed upon with the DBAs: one threshold issues warnings, another stops new backups.
  • Eliminate expensive backup application licensing for Oracle databases, because the DBAs will use the Oracle RMAN utility to perform database backups and recoveries.
  • Stop the fighting: let the Oracle DBAs do their own backups, and let the backup administrators focus resources on the rest of their backups.
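The variable-length inline deduplication mentioned in the list relies on content-defined chunking: chunk boundaries are derived from the data itself rather than from fixed offsets, so inserted bytes shift only nearby boundaries instead of misaligning every block that follows. Here is a minimal Python sketch of the idea; it is illustrative only, and the chunking and fingerprinting details of a real deduplication appliance differ.

```python
import hashlib

def chunks(data, window=16, mask=0x0F):
    """Split bytes at content-defined boundaries: a boundary is declared
    where a hash of the trailing window matches the mask, so boundaries
    follow the content itself rather than fixed offsets."""
    out, start = [], 0
    for i in range(window, len(data)):
        if hashlib.sha1(data[i - window:i]).digest()[0] & mask == mask:
            out.append(data[start:i])
            start = i
    out.append(data[start:])
    return out

class DedupeStore:
    """Keep each unique chunk once; a stream is stored as a 'recipe' of
    chunk fingerprints pointing into the shared chunk store."""
    def __init__(self):
        self.store = {}

    def write(self, data):
        recipe = []
        for c in chunks(data):
            fp = hashlib.sha256(c).hexdigest()
            self.store.setdefault(fp, c)   # store the chunk only once
            recipe.append(fp)
        return recipe

    def read(self, recipe):
        return b"".join(self.store[fp] for fp in recipe)

s = DedupeStore()
stream = b"database block " * 64
backup1 = s.write(stream)
backup2 = s.write(stream)   # an identical second "full" backup

# Both recipes point at the same stored chunks, so the second full
# backup consumes no additional chunk storage.
assert s.read(backup1) == stream
assert s.read(backup2) == stream
assert sum(len(c) for c in s.store.values()) <= len(stream)
```

This is why repeated full backups are nearly free on deduplicating storage: the second full adds only a new recipe, not new chunk data, which in turn is what makes daily fulls and long retention affordable for the DBAs.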

If you’re ready to stop the fighting, improve your Oracle database backups, and make your DBAs happy, we’re ready to help with the industry’s leading Oracle database backup and recovery solution. It’s like having your cake and eating it too.

Tape Is Dead, Part II

Stephen Manley

CTO, Data Protection and Availability Division
Over the past 15 years at both EMC and NetApp, I have traveled the world, helping solve backup and recovery challenges - one customer at a time (clearly, I need to optimize my travel arrangements!). My professional mission is to transform data protection so that it accelerates customers’ businesses. I have a passion for helping engineers pursue a technical career path (without becoming managers), telling stories about life on the road, and NDMP (yes, that’s NDMP).

How should I back up data that doesn’t deduplicate? It’s one of the questions I’m asked often – by both our engineers and our customers. In fact, a TBW reader raised the issue in response to my recent post. Therefore, I’d like to explain how we approach such fundamental challenges and then share the approaches that I recommend to our customers. 

The Fundamental Challenge
Difficult challenges require a system-level solution approach because the problems are too complex to be solved by one component. It is this systems view that drives my push to transition from tape to disk.

Over the past twenty years, tape-centric backup systems have evolved about as far as they can. Meanwhile, disk-centric backup continues to evolve rapidly because disk storage systems alter the constraints in the system. Therefore, “backup to disk” isn’t code for “write a tar image to a Data Domain VTL” (especially since VTL still implies a tape-centric backup approach).

Usually, one of the disk backup approaches can meet our customers’ RPO/RTO and reliability needs at the right cost… or come closer to the mark than anything else available. More importantly, with both the freedom and investment to innovate, disk-centric backup architecture will more effectively address IT challenges today and in the future.

The Approach: Four Use Cases
There are four “non-dedupe” backup use cases I hear about:

  1. Low-retention, non-repeating data (e.g., database logs): Customers usually choose between two options: Option 1: Store the logs on the backup appliance, getting only local compression, but with consolidated protection storage management.  Option 2: Store the logs on non-deduplicating disk systems and coordinate the storage management (e.g., replication). Regardless, disk is usually the best option to handle the performance requirements for high value data with such an aggressive half-life.
  2. High churn environments (e.g., test data): These data sets experience 30%+ daily change. Most customers opt for short-term retention because the data is so short-lived. In that case, I recommend snapshots/clones and/or replication. While the snapshots consume a significant amount of space, they save a tremendous amount of IOPs. Too often, organizations ignore the heavy I/O load caused by backups. Not only are most of the backup reads not served from cache, but they often pollute the cache.  In high-churn environments, IOPs are even more precious, since the storage system’s disks are so heavily loaded with the application load (and the churn makes flash a non-ideal fit). Therefore, at a system level, it is often less expensive to consume extra space for snapshots than to consume the IOPs for traditional backups.

    As an additional benefit, the snapshots enable faster recovery from current versions of data. The choice to replicate becomes a cost/benefit analysis around the availability of data vs. the cost of a second storage array and network bandwidth. Tape-centric approaches compromise application performance (or require overbuying the primary storage performance), recover stale copies of the data, and recover the data so slowly that customers prefer to regenerate the data (e.g., application binaries, satellite images, oil and gas analytics, or rendered movie scenes).

  3. Environments in which you don’t run multiple full backups and have little cross-backup dedupe (e.g., images, web objects, training videos): If data is never modified and rarely deleted, customers don’t run full backups. Since a backup appliance derives much of its space savings from deduplicating redundant full backups, dedupe rates fall in the absence of multiple fulls. The best approach for protecting these data sets is replication, especially if the replicated copy can service customer accesses.

    Since the data is not modified, there is little value from retaining multiple point-in-time copies. Therefore, the most critical recovery path is that of a full recovery; nothing is faster than connecting to a live replica, nothing is scarier than depending on multiple incremental tape restores. Furthermore, these types of datasets tend to have distributed access patterns, so technologies like EMC’s VPLEX can improve both protection and performance with the same copy (another way of deduplicating copies).

  4. Environments in which the application behavior compromises dedupe (e.g., compressing data that you modify): Think of an application that either modifies compressed files in place (e.g., open file, decompress file, modify file, recompress file) or creates multiple compressed copies of data (e.g., compressed or encrypted local database dumps). This workflow tends to create 10x more data modification than the actual new data.

    In these cases, you have two options:  Option 1: Decompress the data for the backup and/or write the database dumps directly to the dedupe storage, so you can get the optimal deduplication. Option 2: Treat the data as Type 1 or Type 2 discussed above.

    However, if the customer is unwilling to decompress the data and wants long-term retention, this is the most plausible instance in which to leverage tape. I’m just not sure it’s widespread enough to justify deploying a tape environment; I would fully explore cloud options first.
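Use case 4 is easy to demonstrate for yourself: make a one-byte edit to a payload, compress both versions, and the compressed outputs diverge from near the edit onward, which is exactly what defeats deduplication. A quick illustration with Python’s zlib (the payload is made up for the example):

```python
import zlib

# Two versions of the same data, differing by a single byte.
original = b"customer record 0001;" * 1000
modified = original.replace(b"0001", b"0002", 1)   # one small edit

# Uncompressed, the versions are byte-identical except for that edit,
# so a dedupe engine would store the second copy almost for free.
diff_raw = sum(x != y for x, y in zip(original, modified))

# Compressed, the edit changes the encoder's state from that point on,
# so the two outputs share only a short common prefix.
c1, c2 = zlib.compress(original), zlib.compress(modified)
common_prefix = 0
for x, y in zip(c1, c2):
    if x != y:
        break
    common_prefix += 1

assert diff_raw == 1                             # raw copies differ in 1 byte
assert c1 != c2                                  # compressed copies diverge
assert common_prefix < max(len(c1), len(c2))
```

Decompressing before backup (Option 1 above) restores the byte-level similarity the dedupe engine needs; if the data must stay compressed, it behaves more like the Type 1 or Type 2 data discussed earlier.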

When I advocate for disk, I’m asking the industry to consider both the entire portfolio of disk solutions and the possibilities that can be developed. As we’ve been discussing on LinkedIn, as soon as you make disk your design center, it opens up a whole new set of architectural approaches. And that’s the transition that is so exciting – moving from putting disk inside a tape-centric architecture to really designing around disk.

As you can see from the examples above, the most challenging environments for data protection require a system-level approach. In fact, some of them demand approaches that look beyond just the protection infrastructure. As we’ve talked about in the past, backup teams need to connect with application, virtualization, and storage owners to provide the services that their users need. With those connections, they can deliver better integrated, more innovative solutions to their customers.