We’re excited to introduce our newest backup appliances: the DXi9000 and DXi4800. Our enterprise backup customers continue to focus on faster backups and restores, as well as on reducing rack space, power, and cooling – in other words, reducing the “footprint” of their backup infrastructure.
At Quantum, we have been delivering both approaches for decades: our scale-out products, engineered for large and data-intensive workloads, and our scale-up approach for data protection. Why both? Because data protection and production functions have different requirements.
Top Three Reasons Not to Believe the Deduplication Appliance Datasheets: Reason #4 (There Is Always More)
To follow up on Reason 2 (marketing metrics are not your metrics), I recently found another interesting case. I saw in the news that a major deduplication appliance vendor had just released new models, so I downloaded and read their datasheets and technical presentations (as always). One thing surprised me: the very high stream counts. The stream counts were always accompanied by qualifiers such as “up to,” which means “less than or equal to” – in other words, nothing is guaranteed. Because the devil is always in the details, I searched for other keywords such as “concurrent (streams),” without success. I also found strange figures, such as outbound replication stream counts higher than inbound, which is unusual: most customers need more “fan-ins” than “fan-outs.”
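The gap between headline stream counts and what each stream actually gets is easy to quantify. As an illustrative sketch (all numbers are hypothetical, not taken from any vendor’s datasheet), divide the appliance’s aggregate ingest rate by the advertised “up to” stream count:

```python
def per_stream_throughput(aggregate_tb_per_hr: float, max_streams: int) -> float:
    """Best-case MB/s available to each stream if the advertised
    "up to" stream count were actually driven concurrently."""
    mb_per_hr = aggregate_tb_per_hr * 1_000_000  # TB/hr -> MB/hr (decimal units)
    return mb_per_hr / max_streams / 3600        # MB/hr per stream -> MB/s

# Hypothetical datasheet numbers, for illustration only:
# 40 TB/hr aggregate spread across 1,000 "up to" streams.
print(round(per_stream_throughput(40.0, 1000), 1))  # ~11.1 MB/s per stream
```

Run at its maximum claimed concurrency, each stream in this hypothetical appliance would crawl along at roughly 11 MB/s – which is why a high “up to” stream count, without a guaranteed concurrent figure, tells you very little.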
It was great to see two different pieces of industry news this week that validated our DXi technology as well as the ongoing strength of the deduplication appliance market as a whole.
Storing data is easy (well, not that easy). But turning it into meaningful business value requires technology partners that know how to integrate well and create something bigger. Customers want products that solve a real problem with a real solution. That’s the Quantum approach, and that’s why we’re so excited for this year’s VMworld. We’re “Kicking the Cartel” and helping companies break free from traditional, old-school ways of storing and protecting data. If it’s time for your organization to break free from paying $2,500 per TB for backup software, stop by the Quantum booth at VMworld. We’ll show you how to take a different approach to storage, one that centers on the highest performance and the lowest TCO. We’ll be showcasing all of our technology at VMworld too, including our QXS hybrid storage for VM primary storage and our Artico NAS appliance for archiving. Sign up to meet with the Quantum Storage Experts at the show, and you could win an Apple Watch too.
Today, everyone seems to understand the ever-growing importance of data protection, often viewing it as a superset of backup combined with snapshots and replication. Typically, a conversation about data protection includes the assumption of a “gold standard” centered on secondary disk for rapid recovery and tertiary tape for long-term retention. Of course, “the cloud” is also always a consideration as part of the next generation of the solution. It all still falls under the banner of “data protection” (DP): the collection of activities, methods, and media used to help recover or restore business information after a crisis or other IT disruption. According to research, primary storage is growing around 40% annually, with secondary storage used for data protection growing at similar rates. Budgets aren’t growing nearly that fast. Meanwhile, IT organizations are being asked to do more (i.e., inject more agility, functionality, and resiliency into their operations) while spending as little as possible. In actuality, data protection budgets are growing around 4.6% annually according to ESG research, but that level of increase won’t even let you keep doing what you have been doing at a larger scale. Therefore, you have to do something different. What you should do: ARCHIVE!
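The mismatch those growth rates describe compounds quickly. A minimal sketch, using the roughly 40% data growth and 4.6% budget growth figures cited above (and assuming, for simplicity, that per-TB protection costs stay flat), shows how far apart the two curves drift in just five years:

```python
# Compound protected capacity vs. protection budget over five years,
# using the approximate annual growth rates cited above.
capacity = budget = 1.0
for year in range(1, 6):
    capacity *= 1.40   # ~40% annual growth in data needing protection
    budget *= 1.046    # ~4.6% annual growth in protection budget
    print(f"Year {year}: capacity x{capacity:.2f}, budget x{budget:.2f}")

# After year 5: capacity has grown ~5.4x while budget has grown ~1.25x.
```

Even if real per-TB costs fall somewhat each year, a gap this wide can’t be closed by doing the same thing at larger scale – which is the argument for moving colder data to an archive tier instead of protecting it all the same way.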
By now I’ve participated in quite a few lab validations with industry analysts, many of them testing the DXi-Series. It has been interesting to see the progression of the DXi as deduplication has evolved to take a more vital role in data center workflows, extending data protection to the cloud. Recently, Storage magazine/SearchStorage.com awarded the DXi6900 the Silver medal in the 2014 Products of the Year backup hardware category, adding to the industry recognition the DXi has garnered since its introduction and highlighting the role of the StorNext 5 file system in the appliance. Industry analyst Tony Palmer with ESG Lab has conducted more DXi lab validations than anyone and truly understands the deduplication marketplace, so he recently put the DXi6900 through its paces in lab testing.
Quantum's DXi6900 proudly received the Silver Award in Storage Magazine/SearchStorage.com's 2014 "Product of the Year Backup Hardware" category. This industry recognition comes just as industry analyst ESG released the results of their recent lab testing, validating DXi6900’s performance claims. Both the award and lab validation are a big deal for us and reinforce what we’ve already been hearing from our happy customers: DXi6900 is a high performance appliance that is very well suited to the needs of mid-size and enterprise-scale companies as well as managed service providers.
As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider several key factors in selecting and migrating data: ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: site disaster recovery; data center off-site copy (for backup); compliance archive; and remote site primary, archive, and backup with ongoing management. Let’s start by looking at central site disaster recovery.
As we approach the close of the year, it’s natural to take a moment to reflect on the year’s events. This year, there’s more than ever to appreciate. Whether we consider product awards, new solution offerings, compelling new customer installations, or overall business growth, 2014 has arguably been one of the most exciting years in Quantum’s 30+ year history. I want to highlight a few of these – and take a minute to thank you – our customers and partners – for a great year! So let’s review….
Since launching the DXi6900 in July, we’ve seen remarkable customer interest. That interest is one of the key drivers of the 11% year-over-year growth in DXi revenue we reported last quarter. Now another data point showing how the DXi6900 and the entire DXi family stack up against other deduplicating backup appliances has just been published: industry analyst firm DCIG issued its annual buyer’s guide, and the DXi6900 earned a “recommended” rating, with just 0.45 points separating it from the top spot. In fact, DXi appliances took three of the top six spots on the list. So why is the DXi6900 getting so much attention? And how do DXi deduplication solutions work for real-world customers?
Gartner just published their first annual Magic Quadrant for the deduplication appliance market, and I think it’s an accurate portrayal of the market, as well as good validation of Quantum’s DXi deduplication technology. The Magic Quadrant itself is shown below (the full report can be obtained via Gartner), and Quantum was the only vendor ranked as a “Challenger.” This is a great ranking for Quantum; based on Gartner’s methodology, it reflects a strong ability to execute, capable products, and the financial resources to sustain continued growth. The strengths that Gartner highlights in their report validate our technology and some of our key differentiation.
Deduplication is now widely recognized as a proven technology in the datacenter. In fact, it seems to be cropping up everywhere – from flash arrays to backup applications, and of course disk backup appliances. There’s no end in sight for structured and unstructured data growth, and with the proliferation of technologies like deduplication, it is not unusual for complexity to increase, along with the challenge of keeping it in check. A good first step is to recognize where deduplication can best be applied. Here are a few of the considerations.
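To ground those considerations, it helps to recall what deduplication actually does: store each unique chunk of data once and replace repeats with references. The toy sketch below uses fixed-size chunks and SHA-256 hashes purely for illustration; production appliances typically use more sophisticated variable-length chunking, but the principle of why highly redundant data (like successive backups) deduplicates so well is the same:

```python
import hashlib

def dedupe(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks, storing each unique chunk once.

    Returns the chunk store (hash -> bytes) and the ordered "recipe"
    of hashes needed to reconstruct the original stream.
    """
    store = {}    # unique chunks actually consuming storage
    recipe = []   # ordered references that reconstruct the data
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # store the chunk only if new
        recipe.append(digest)
    return store, recipe

# A stream with heavy repetition: three identical chunks plus one new one.
backup = b"A" * 4096 * 3 + b"B" * 4096
store, recipe = dedupe(backup)
print(len(recipe), "chunks referenced,", len(store), "chunks stored")
# 4 chunks referenced, 2 chunks stored
```

Data with little internal or cross-backup redundancy (already-compressed media, encrypted files) yields few repeated chunks, which is exactly why knowing where deduplication applies matters.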