Today, everyone seems to understand the ever-growing importance of data protection, often viewing it as a superset of backup combined with snapshots and replication. Typically, a conversation about data protection includes the assumption of a “gold standard” centered on using secondary disk for rapid recovery and tertiary tape for long-term retention. Of course, “the cloud” is also always a consideration as part of the next generation of the solution.

It’s still all under the banner of “data protection” (DP), the collection of activities, methods, and media used to help recover or restore business information after a crisis or other IT disruption. But, in much the same way that data protection is the superset of the activities listed above, data management (DM) is a superset of not only DP (reactive) but also data archiving or preservation activities (proactive).

Why it matters

According to research, primary storage is growing around 40% annually, with secondary storage used for data protection growing at similar rates. Budgets aren’t growing nearly that much. Meanwhile, IT organizations are being asked to do more (i.e., inject more agility, functionality, and resiliency into their operations) while spending as little budget money as possible. In actuality, data protection budgets are growing around 4.6% annually according to ESG research, but that level of increase won’t even cover doing what you already do today at next year’s larger scale.
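As a quick back-of-the-envelope illustration of why those two rates cannot coexist for long, consider the sketch below. Only the 40% and 4.6% growth rates come from the research cited; the starting capacity and budget are hypothetical:

    # Hypothetical starting point; only the two growth rates come from the research cited.
    data_tb = 1_000.0      # protected capacity today (assumed)
    budget = 100_000.0     # annual data protection spend today (assumed)

    for year in range(1, 4):
        data_tb *= 1.40    # protected data grows ~40% per year
        budget *= 1.046    # the budget grows ~4.6% per year
        print(f"Year {year}: {data_tb:,.0f} TB on a budget of ${budget:,.0f} "
              f"(${budget / data_tb:.2f} per TB)")

    # Just to break even, the cost of protecting each TB must fall by about 25% per year:
    print(f"Required annual cost-per-TB decline: {1 - 1.046 / 1.40:.0%}")

Unless the cost of protecting each terabyte falls by roughly a quarter every year, simply scaling up the current approach is not affordable.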

Therefore, you have to do something different.

What you should do

ARCHIVE!

Today, data can offer great value in generating revenue and fulfilling mission objectives. Still, at most organizations, only a minority of data has long-term business value for operations or regulatory compliance. Therefore, chances are that you have too many backups, and you may not have any archives at all.

  • Backups are copies of containers (directories, volumes, mail stores, VM hosts, etc.) whose significant recovery value runs from roughly two weeks out to 12-24 months. Recoveries of data less than two weeks old are arguably better served from snapshots, and backup sets older than about two years consist mostly of copies with little remaining recovery value.
  • Archiving is the intentional retention of data that has been specifically identified as having long-term business value, sporadic but expected reference access, or a mandated retention period for industry or regulatory compliance. (A minimal policy sketch based on these definitions follows this list.)
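To make those windows concrete, here is a minimal sketch of how they might translate into a tiering policy. The age thresholds simply restate the definitions above; the function name and the long-term-value and compliance flags are hypothetical stand-ins for whatever identification process your organization actually uses:

    from datetime import datetime, timedelta

    # Hypothetical age thresholds, taken directly from the definitions above.
    SNAPSHOT_WINDOW = timedelta(weeks=2)       # < 2 weeks old: recover from snapshots
    BACKUP_WINDOW = timedelta(days=2 * 365)    # ~2 weeks to 12-24 months: backups

    def recommended_tier(last_modified: datetime,
                         long_term_value: bool,
                         compliance_hold: bool) -> str:
        """Suggest where a given piece of data's retention copy should live."""
        if long_term_value or compliance_hold:
            return "archive"                   # intentional, long-term preservation
        age = datetime.now() - last_modified
        if age < SNAPSHOT_WINDOW:
            return "snapshot"                  # most recent changes
        if age < BACKUP_WINDOW:
            return "backup"                    # still has meaningful recovery value
        return "review for deletion or archive"  # little remaining recovery value

    # Example: a three-year-old project folder flagged as having long-term value
    print(recommended_tier(datetime.now() - timedelta(days=3 * 365), True, False))

The point is not the code itself but the decision it encodes: anything identified as having long-term value belongs in an archive, and everything else eventually ages out of the backup rotation.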

Using those definitions, data preservation would immediately solve some “do more/spend less” challenges by reducing the CapEx, media, and management/OpEx costs associated with maintaining backups whose value is declining dramatically. If you archive that data and keep it accessible through something as simple as a NAS share, then you won’t have nearly as many backup tapes, backup storage disks, or backup cloud pools to oversee.

If you aren’t convinced, consider the amount of stagnant data on your expensive production storage. If you simply groomed away that stagnant data, then you’d have lower primary storage growth, which also means less backup storage to maintain and manage.


Check out Jason’s video: Everyone Should Archive, Period

Putting it all together

Putting it all together, we end up with a “good, better, best” model:

  • Good: Protection via a combination of backups, snapshots, and replication to a hybrid pool of disk, tape, and/or cloud as part of ensuring the agility and recoverability of IT.
  • Better: Preserving the right data for a longer time span so that you can back up far less. This is the first real step in doing more, spending less.
  • Best: Data management, including not only proper preparation to retain the right data (long-term preservation) for reactive retrieval, but also proactively grooming primary storage.

ESG observes that at most organizations, far more than 40% of the data becomes stagnant in less than a year. If that is the case, grooming that stagnant data off primary storage can counteract the 40% YoY storage growth that organizations are experiencing today. To put it another way, you could very likely archive (groom off) data faster than it grows, resulting in a flat storage consumption model … really!
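Here is a minimal back-of-the-envelope model of that claim. The two 40% figures come from the discussion above; the starting capacity and the assumption that stagnant data is groomed off to an archive tier once per year are purely illustrative:

    # Illustrative model only: the 40% growth and 40% stagnation rates come from the
    # discussion above; the starting capacity and annual grooming cadence are assumptions.
    primary_tb = 1000.0   # hypothetical primary storage footprint today
    archive_tb = 0.0

    for year in range(1, 6):
        growth = primary_tb * 0.40       # ~40% year-over-year growth in primary data
        stagnant = primary_tb * 0.40     # >40% of existing data goes stagnant within a year
        primary_tb += growth - stagnant  # stagnant data is groomed off primary storage
        archive_tb += stagnant           # ...and lands on a far cheaper archive tier
        print(f"Year {year}: primary {primary_tb:,.0f} TB, archive {archive_tb:,.0f} TB")

Primary consumption stays flat while the groomed data accumulates on a cheaper tier, and there is correspondingly less primary data left to back up.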

Where does Quantum fit into all of this?

Because I am so passionate about “data management” instead of just “data protection,” the nice folks at Quantum asked me to write a series of blogs. Their idea makes particularly good sense because Quantum has a rich set of offerings that align with the ESG guidance above. Specifically:

  • For demanding workflows in areas such as surveillance, geospatial, or video content, Quantum’s StorNext storage automates data management.
  • Quantum has now leveraged its StorNext data management software to deliver an active archive solution known as Artico. Artico is a NAS appliance with disk storage on the front end for active data and automated StorNext-based tiering to migrate data to tape, object storage, or the cloud as the data ages.
  • The core of any good data protection solution is deduplicated disk. For that, Quantum offers its DXi-Series deduplication appliances. To meet modern service-level agreements, nearly all data recoveries should come from disk, but in order to deliver that capability in a cost‑effective way, deduplication is a must.
  • As discussed above, data preservation is really about storing the right data for long periods, and it is hard to argue with the economics of tape for that purpose, particularly given the performance and durability of modern LTO cartridges. Quantum is a core innovator in that area with its LTO products.
  • And because no conversation about IT modernization is complete without the cloud, Quantum offers cloud‑based services as a complement to its disk and tape solutions. The services are particularly well-suited for data that doesn’t require tape‑based regulatory retention, or when Quantum’s customers want the agility of BC/DR by actually “failing over” to the cloud for even more IT resiliency.

But wait, there’s more. As you start to think about how hybrid architectures make sense, you may realize that not everything fits nicely into block or file storage. Quantum offers object storage and software designed for both backups and archives.

Ultimately, as you evolve beyond data protection to full-fledged data management, your architecture will undoubtedly include multiple media, methods, and approaches beyond just recovery preparation. It doesn’t mean you have to settle for disparate components from disjointed vendors, all knitted together—particularly when there are vendors out there offering a comprehensive catalog, as Quantum does.

Learn More About Archiving

Zettabytes of data, unstructured content growth, and the ever-increasing value of data are putting a strain on traditional storage infrastructure and backup. Find out how to reduce the pain of backup and lower costs on our ROI of Archive solution page.
