It’s been exciting over the last year to see our revenue and our work with Managed Service Providers grow – both as partners and as customers. There’s no doubt that the transition to the cloud is accelerating: by 2020 we expect at least 40% of data to touch the cloud at some point in its workflow. Already, many enterprises are outsourcing their backup storage, DR storage, and in some cases their entire infrastructure to these MSPs. As a specialist in scale-out storage and data protection, Quantum is gaining valuable perspective and experience in the service provider business.
Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model outlining the relationship between data, information, knowledge, understanding, and wisdom (D-I-K-U-W for brevity’s sake), attributed to American organizational theorist Russell Ackoff. For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom” by Gene Bellinger, Durval Castro, and Anthony Mills.
CRN recently recognized the who’s who of channel management – leaders who are heading up Partner programs that solution providers depend on to run their businesses. This year CRN profiled Steve Burrows, VP of America Channel Sales, for navigating new business models, technology shifts and making sure Quantum partners succeed. Read on for more information about Steve, his program and his channel goals and advice.
Data protection strategies have been in a state of accelerated evolution over the last five years. I hear this confirmed regularly by customers describing their implementation stories with Quantum, as well as by the industry analysts we meet with to discuss our latest product innovations. ESG’s Jason Buffington is one of those analysts we talk with often, and it’s always interesting to see how ESG’s research squares with what we’re seeing in data centers. Jason’s latest video blog about modernizing data protection – 8 Suggestions for Every Data Protection Strategy – highlights ESG research that resonated with me in a number of respects.
In Newtonian mechanics, momentum has a direction as well as a magnitude. If Newton was correct, and I am going to go out on a limb here and assume that to be the case, then the Powered by Quantum MSP Program has momentum, with positive direction and high magnitude. Over the past couple of weeks, Quantum has successfully created partnerships with a number of MSPs that deliver their own cloud backup service powered by Quantum technology. Just this month we added two new MSPs to the roster, Elanity Network Partner and Interconnekt. These partners, scattered across the globe, have recognized the benefits that Quantum solutions can bring not only to their customers but also to their bottom line.
This article originally appeared on Wired Magazine’s Innovation Insights. With the start of the new year, it’s time once again for those of us in enterprise storage to look ahead and offer our predictions for what the industry will see in 2014. So without further ado, here are ten trends that will have a big impact in the coming year.
I know a lot of folks think the big contest this time of year is the Super Bowl playoffs. In Quantum’s Denver, Bay Area and Seattle offices we’re sporting the colors of the Broncos, 49ers and Seahawks, with just a bit of friendly trash talk to kick off conference calls. Perhaps you know someone rooting for New England – I don’t. But if you care about data storage, the other big contest is Storage Magazine’s Product of the Year Awards. The award serves as an annual reminder of what the storage community found important, promising, and profitable. This year’s award finalists include a cross-section of Quantum products spanning scale-out shared storage and the data center, highlighting the breadth of innovation from the company over the last year. For 2013, four Quantum products – more than any other vendor among the finalists – have been selected in three award categories.
With the introduction of the cloud there has been a lot of talk – and more than a few jokes – about how one can get started in the cloud. We see customers all the time trying to figure out what they can do from a cloud strategy perspective and how it will impact their current infrastructure, positively or negatively, mainly around budgets. Cloud certainly has the ability to provide some financial relief – allowing you and your team to focus on more strategic projects – so why not get started with cloud technologies, particularly when it comes to backup? Quantum recently announced a cloud-based backup program for MSPs and VARs that delivers a number of fantastic benefits. Read on to learn more.
As my colleague Terry Grulke pointed out earlier, there is a lot of funny math used by deduplication vendors to try to convince you that their system can go fast. With our DXi systems we don’t have to hire Cirque du Soleil to generate our performance numbers. We can keep it simple because DXi systems are just really, really fast – natively. That’s what I’m going to talk about here: “native” performance. That is, the capability of the DXi system itself vs. some manufactured “logical” number like the ones Terry wrote about. Apparently, our high performance is confusing to some of our competitors.
Why do companies continue to store that data on their most expensive, highest-performance storage? A better approach – and a way companies can completely rethink their backup and archive strategy for unstructured data – is to employ tiered storage. Quantum specializes in designing tiered storage solutions for unstructured data – we’ve been doing it for years in the most demanding data environments, such as M&E, government, and oil and gas. We can apply those same core technologies and that same approach to design tiered storage solutions for the data center.
There are two areas in the data center where we think companies can completely rethink how they store, protect, and provide access to their ‘non-flash’ (i.e., not immediate-work) data using a tiered storage approach. And they can do it TODAY.
How do they go so fast? We are continually battling competitors who state ingest performance using numbers that defy logic. That is, we compete against systems with four 10GbE ports that supposedly ingest at 100TB/hour. The following example is not meant to spark debate about exact mathematical accuracy but to show you how they are “cooking the books”.
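To see why that claim defies logic, here is a quick back-of-envelope calculation (a sketch only; it assumes perfect line-rate throughput and ignores protocol overhead): four 10GbE ports top out at roughly 18 TB/hour of raw wire throughput, so any 100 TB/hour claim must be baking a deduplication or “logical” inflation factor into the number.

```python
# Back-of-envelope check: can four 10GbE ports really ingest 100 TB/hour?
# Assumes perfect line rate and zero protocol overhead (best case for the vendor).
PORTS = 4
GBITS_PER_PORT = 10            # 10GbE line rate, in gigabits per second
BYTES_PER_GBIT = 1e9 / 8       # 1 gigabit = 0.125 gigabytes

# Theoretical maximum ingest rate across all four ports
gb_per_second = PORTS * GBITS_PER_PORT * BYTES_PER_GBIT / 1e9   # GB/s
tb_per_hour = gb_per_second * 3600 / 1000                       # TB/hour

print(f"Wire-speed ceiling: {tb_per_hour:.0f} TB/hour")         # 18 TB/hour

# A 100 TB/hour claim therefore implies an inflation factor baked in
claimed_tb_per_hour = 100
print(f"Implied inflation: {claimed_tb_per_hour / tb_per_hour:.1f}x")
```

In other words, the “100 TB/hour” figure is a logical number: roughly 5.6x more data than the network hardware can physically carry, presumably justified by an assumed deduplication ratio rather than actual ingest.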
IT departments today are rapidly deploying virtualization technologies – in fact, Gartner reports that server virtualization is already over 60% penetrated and projects the market will be over 80% penetrated by 2016. With this rapid rise in virtualization deployment, customers are challenged to incorporate data protection and archive methodologies for their virtualized data. ESG recently indicated that 60% of virtualization technology users are planning to address data protection challenges for their virtualized data as a top priority for 2013 – an astonishing number. There are certainly lots of options available to protect traditional data, but virtualized data is a different beast. Another Gartner report finds that over one-third of organizations will change backup vendors due to factors including cost, complexity, or capability. Based on what I’ve heard talking with customers, I would add one more factor: compatibility. In this article we will explore all four of these areas and suggest ways to overcome the challenges of virtualized data protection.