Quantum is ringing the closing bell at the New York Stock Exchange today in celebration of its 35th anniversary this year, so I’ve been thinking recently about the history of the storage business. It’s been a very interesting ride. Most extraordinary is the degree to which the role of data storage has changed: it has moved from the edge of the data processing universe to being firmly at the core, from a peripheral playing a supporting role to the belle of the IT ball. It’s an exciting time (again) to be in the data storage industry, particularly for Quantum, because of the central role our StorNext scale-out storage solutions have long played in enabling customers to organize, protect and manage their data so they can leverage it for strategic advantage. As Quantum commemorates its 35th anniversary, we look forward to helping more organizations maximize the value of their data.
It’s so exciting to see today’s announcement that Quantum will be taking on a larger role in selling Dot Hill storage. This is not just another storage channel deal – let me explain why. The partnership of Quantum and Dot Hill began in workflow storage. Quantum enjoys a unique position in the market in serving the needs of the most demanding workflow storage applications – from video production to surveillance, cybersecurity and research. These applications have storage needs which are quite different from traditional corporate applications. They need different solutions. Both companies have noticed that there are adjacent markets and customer needs where tighter collaboration between us—leveraging the technology of Dot Hill and the technology and channel reach of Quantum—can deliver broader value to our customers and partners. Which brings me to the announcement today: Quantum is becoming the branded face of Dot Hill storage.
As you may have heard already, there’s exciting news today in the object storage marketplace: Western Digital Corp., a leader in storage technology, announced that its HGST subsidiary is acquiring Amplidata, Quantum’s object storage technology partner. We’re happy for Amplidata and looking forward to expanded partnership opportunities with WDC and the HGST group. As a reminder, Quantum announced in 2012 that we were leveraging the performance and availability of Amplidata’s object storage technology by embedding it in our Lattus family of unique active archive solutions. Since then, many Quantum customers have increased the value of their data by extending cost-effective online access to massive volumes (petabytes) of information. With that in mind, let’s look at why this announcement is great news for three major reasons.
As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider five key factors in selecting and migrating data. These include ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to show the importance of these factors by illustrating how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: site disaster recovery; data center off-site copy (for backup); compliance archive; and remote site primary, archive and backup, along with their ongoing management. Let’s start by looking at central site disaster recovery.
Businesses of all sizes are increasingly starting deployments of cloud-based data, driven by the promises of greater agility, lower management cost and capital savings. It just makes sense. When both compute and data move together – in lockstep – to the cloud, the issues to be considered are very similar to deploying or migrating an onsite application. But when the major compute operations stay onsite and only the data moves offsite (such as for backup, disaster recovery or compliance archive), the deployment can be more complex. In this scenario, operational executives must consider five key issues to ensure a successful experience – including meeting customer service level agreements (SLAs) and staying within budget. Whichever use case you are planning, you must weigh these five issues.
As we get ready to say goodbye to 2014, our thoughts turn to what lies ahead in 2015. If it’s anything like this year, it will be another exciting – and interesting – one for storage. With that in mind, here are some of my thoughts on what’s in store for storage in the coming year.
As we approach the close of the year, it’s natural to take a moment to reflect on the year’s events. This year, there’s more than ever to appreciate. Whether we consider product awards, new solution offerings, compelling new customer installations, or overall business growth, 2014 has arguably been one of the most exciting years in Quantum’s 30+ year history. I want to highlight a few of these – and take a minute to thank you – our customers and partners – for a great year! So let’s review….
A couple of weeks ago I was lucky enough to participate in the Cloud IT Expo in Santa Clara, getting a fresh look at what’s new in the cloud. One of the reasons for my being there was to reflect on the massive change that’s occurring around the cloud gateway market. These changes are important to note whether you are a gateway customer today – or not. So for those of you who were NOT at the Expo, here are my Ten Things to Know about What’s Happening with Cloud Gateways.
Chances are, if you are having backup problems, your issue is caused by large volumes of unstructured data. Among that mass of unstructured data that is making backup difficult, you’re likely to find an increasing amount of data that never changes. Industry analysts have observed that by the year 2020, a full 50% of the data passing over the network (and stored somewhere) will be video and images. So the question becomes: Are you ready to look at the composition of your unstructured data to see what data has snuck in? And find easy-to-use archiving software you can install to migrate this data OUT of your active data pool and out of your backup process?
Today, Quantum announced the acquisition of the Symform cloud storage platform, serving over 45,000 customers. As a Quantum customer, or prospect, why should you care? For years, customers have relied on Quantum, as a specialist in scale-out storage, backup and archive, to deliver the best workflow and target storage. These target devices have been everything from cost-effective managed Scalar tape libraries, to highly scalable online Lattus object storage, to extremely efficient DXi disk deduplication appliances. In combination with our ISV partners, we’ve delivered workflow, backup and archive solutions with high quality end-to-end support. Survey after survey has shown that customers trust Quantum to deliver the most cost-effective, successful solutions for their storage use cases. But in the new world, there are increasingly cases where customers need the target device to be a cloud service.
Historically, the storage industry, simply put, sucks at agreeing on – and deploying – open standards for anything. This makes sense when you consider that the “standardized” segments of the storage business (e.g., raw disk storage) have survived for years on razor-thin margins per disk, while the software and system value-add that has floated on top of this core architecture has been priced at anywhere from rational margins to excess profits. Nobody wants to give up those margins! Startups need them to innovate, while the major system and storage houses that exert a level of market control simply love the ROI. Nobody really wants an open standard – unless by some chance it is constructed to allow customers to move off the competitor’s offerings and onto “mine.” This win-lose mentality results in a lot of talk (and meetings) about open standards and products, but very little action. SNIA is a tissue paper tiger. Enter – the cloud.
As I’ve said in prior posts, keeping data in native format for later use is increasingly a “must have” for many customers. This is the starting point. Stage two is, of course, turning raw data into useful information by adding knowledge or context. Before you can transition data into business information, you also must find the pieces of data that are interesting or useful. In the media and entertainment world, this is done predominantly through a concept called “metadata tagging.” Metadata tagging is a process by which every unique data element (for video, this would be a frame) is enriched with business information likely to identify its value.
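To make the idea concrete, here is a minimal sketch in Python of what metadata tagging can look like in practice. Everything here is illustrative and hypothetical – the `Frame`, `tag_frame`, and `find_frames` names are not part of any Quantum or media-industry API – but it shows the basic pattern: enrich each unique data element (a frame) with business metadata, then search on those tags to find the interesting pieces.

```python
# Hypothetical sketch of metadata tagging for video frames.
# All names are illustrative, not a real product API.
from dataclasses import dataclass, field

@dataclass
class Frame:
    """One unique data element: a single video frame."""
    frame_number: int
    timecode: str
    tags: dict = field(default_factory=dict)

def tag_frame(frame, **metadata):
    """Enrich a frame with business metadata (e.g., scene, talent, rights)."""
    frame.tags.update(metadata)
    return frame

def find_frames(frames, **criteria):
    """Return the frames whose tags match every given criterion."""
    return [f for f in frames
            if all(f.tags.get(k) == v for k, v in criteria.items())]

# Tag a few frames, then search by business value.
frames = [Frame(i, f"00:00:{i:02d}:00") for i in range(3)]
tag_frame(frames[0], scene="b-roll")
tag_frame(frames[1], scene="interview", talent="anchor")
print(len(find_frames(frames, scene="interview")))
```

Once frames carry tags like these, “finding the useful pieces of data” becomes a simple query over the metadata rather than a manual review of the raw video.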
Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model that has been proposed outlining the relationship between data, knowledge, information, understanding, and wisdom (D-K-I-U-W for brevity’s sake) attributed to American organizational theorist Russell Ackoff. For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom,” by Gene Bellinger, Durval Castro, and Anthony Mills.