As my colleague Terry Grulke pointed out earlier, there is a lot of funny math used by deduplication vendors to try to convince you that their systems can go fast. With our DXi systems we don’t have to hire Cirque du Soleil to generate our performance numbers. We can keep it simple because DXi systems are just really, really fast – natively. That’s what I’m going to talk about here: “native” performance. That is, the capability of the DXi system itself vs. some manufactured “logical” number like the ones Terry wrote about. Apparently, our high performance is confusing to some of our competitors.
Why do companies continue to store that data on their most expensive, highest-performance storage? A better approach – and a way companies can completely rethink their backup and archive strategy for unstructured data – is to employ tiered storage. Quantum is a specialist in designing tiered storage solutions for unstructured data – we’ve been doing it for years in the most demanding data environments, like M&E, Government, and Oil and Gas. So we can use some of our core technologies and our core approach to design tiered storage solutions for the data center.
There are two areas in the data center where we think companies can completely rethink how they are storing, protecting and providing access to their ‘non-flash’ data – that is, data not needed for immediate work – based on a tiered storage approach. And they can do it TODAY.
How do they go so fast? We are continually battling competitors who state ingest performance using numbers that defy logic. That is, we compete against systems with four 10GbE ports that supposedly ingest at 100TB/hour. The following example is not meant to spark debate about precise mathematical accuracy, but to show you how they are “cooking the books.”
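To see why a 100TB/hour claim on four 10GbE ports defies logic, it helps to do the back-of-the-envelope math. The sketch below (illustrative only, and generous: it assumes every port runs at full line rate with zero protocol overhead) computes the physical ceiling of those ports and the deduplication ratio such a claim silently bakes in:

```python
# Back-of-the-envelope check: can four 10GbE ports really ingest 100 TB/hour?
# Assumes full line rate and no protocol overhead -- the most generous case.

PORTS = 4
GBITS_PER_PORT = 10            # 10GbE line rate, in gigabits per second
SECONDS_PER_HOUR = 3600

# Aggregate line rate in gigabytes per second (8 bits per byte).
gb_per_sec = PORTS * GBITS_PER_PORT / 8        # 5 GB/s

# Convert to terabytes per hour (decimal units: 1 TB = 1000 GB).
tb_per_hour = gb_per_sec * SECONDS_PER_HOUR / 1000

print(f"Physical ceiling: {tb_per_hour:.0f} TB/hour")           # 18 TB/hour

# A "100 TB/hour" claim therefore can only be counting pre-deduplication
# "logical" data, not bytes actually moving across the wire.
implied_ratio = 100 / tb_per_hour
print(f"Implied logical-to-physical ratio: {implied_ratio:.1f}x")
```

In other words, the wire itself tops out around 18TB/hour, so any bigger number is a “logical” figure inflated by an assumed dedupe ratio – exactly the funny math Terry described.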
IT departments today are rapidly deploying virtualization technologies – in fact, Gartner reports that server virtualization has already passed 60% penetration and projects the market will be over 80% penetrated by 2016. With this rapid rise in virtualization deployment, customers are challenged to incorporate data protection and archive methodologies for their virtualized data. ESG recently indicated that 60% of virtualization technology users plan to address the challenges of protecting their virtualized data as a top priority for 2013 – an astonishing number. There are certainly lots of options available to protect traditional data, but virtualized data is a different beast. Another Gartner report finds that over one-third of organizations will change backup vendors due to factors including cost, complexity, or capability. Based on what I’ve heard talking with customers, I would add one more factor: compatibility. In this article we will explore all four of these areas and suggest ways to overcome the challenges associated with virtualized data protection.
After publishing my blog yesterday on the need for application support of object storage to break the logjam in adoption, it occurs to me that some of you may be asking the question: “Janae, if object storage really is so cool, and the gap in object storage adoption is data mover application providers writing to this new technology, why haven’t these developers quickly moved to fill this gap?”
One of the best parts of my job is talking with customers. It really helps keep Quantum aiming its solutions at the most important problems. So I jumped at the recent chance to moderate a discussion of 25 IT execs at the Northwest CIO Executive Summit in Seattle, which is also home for me. The topic was […]
In my first two blogs, the discussions have been around using native format for amazingly fast restores and for booting VMs remotely, all without actually needing the backup application. In this final post I will explore how native format can provide “future-proof flexibility.” When I hear this term, I am always reminded of the Best Buy commercial where consumers are constantly, and humorously, reminded of the fast pace of technology. If you are like me, having something new and shiny is always nice; however, life gets in the way and we need to spend our money elsewhere. The same goes for our IT budgets.
If you had the chance to read my first blog in this series, you will remember that vmPRO is the only backup application that writes data in native format, which guarantees fast restores using the vmPRO GUI or a standard file browser with easy drag-and-drop functionality. In this post, the second of this three-part series, I will cover how the native backup format can help transform your disaster recovery (DR) strategy by booting VMs and restoring files remotely (no matter the target device – disk, tape or cloud), all without actually using the backup application.
I love talking to customers about their VM backup challenges. One of the most exciting topics for me is the native file format capability of Quantum’s vmPRO. In this three-part series, I’ll talk about three reasons you’re going to love backing up your VMs in native file format. While proprietary formats have been the norm in the storage industry, as technologies and data access requirements evolve, customers no longer need to settle for this traditional method. Think about how native format works for simple data transfer and storage. For example, when you want to protect your precious digital pictures and store them on a thumb drive or USB hard drive, you simply drag and drop the files to the device. When you want to restore them, you simply plug the device into your computer and drag and drop the files back to your system. vmPRO takes that simple concept and applies it at the business-class level to virtual data. Why over-complicate things?
When I learned that Quantum had picked up two honors at the 2012 Storage, Virtualization and Cloud Computing (SVC) Awards in London this year, my first thought was that no one carries off a tux quite like the British. Look how much fun they’re having! In contrast, most American males view the tux as a form of punishment, and we rent rather than buy them because, for many of us, our waistlines are likely to change between tux-worthy events. One of the SVC honors Quantum received went to the DXi V1000 virtual deduplication appliance, which was named “Virtualization Product of the Year.” Additional accolades went to Quantum’s StorNext data management software in the Storage Software Appliance category. Sounds like it was quite a party.
I was talking to an SMB customer this week, and he was raving about the benefits of deduplication. Deploying deduplication allowed him to completely rethink his data protection policies and reduce his management time. He’s in a smaller IT department where, as he said, “we have to wear many hats and don’t have the luxury of in-house specialists. It was so simple – no worries about backup windows, or any of those details that we used to regularly consider and tweak with the older backup system.” His statement got me thinking: the same type of change is happening with VM data protection. As noted in this previous blog from TeamQ, VM deployments are growing rapidly, but protecting VM data creates some unique challenges.
If you have questions about data protection for virtual machines, we invite you to download “VM Data Protection for Dummies, Quantum Special Edition.” This free e-book will help you identify your VM protection needs, ask the right questions of potential vendors, and choose the VM protection solution that’s right for your organization. The FAQ chapter covers 10 frequently asked questions about VM data protection, such as “How does VM backup work with deduplication appliances?” and “Do I have to choose between one of the new VM-only backup applications and my legacy backup software?”