Choosing Which Secondary Data to Migrate to the Cloud – Part 1: Disaster Recovery

As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider four key factors in selecting and migrating data: ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they shape your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: site disaster recovery, data center off-site copy (for backup), compliance archive, and remote site primary, archive, and backup data with ongoing management. Let’s start by looking at central site disaster recovery.
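
As a rough illustration of how these factors might be weighed for a candidate data set, here is a short sketch. The field names and thresholds are assumptions chosen for the example, not Quantum guidance.

```python
# Illustrative sketch (not a Quantum tool) of weighing the selection factors
# named above for a candidate data set. All names and thresholds are assumed.
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    monthly_transfer_gb: float    # ongoing data transfer volume
    accesses_per_month: float     # expected frequency of ongoing data usage
    max_recall_minutes: float     # data recall performance requirement (tolerable delay)
    app_supports_cloud_api: bool  # application integration (e.g., cloud-aware target)

def good_cloud_candidate(ds: DataSet) -> bool:
    """Cold, rarely recalled data with a tolerant recall window and a
    cloud-aware application is usually the easiest to migrate first."""
    return (ds.monthly_transfer_gb < 500
            and ds.accesses_per_month < 10
            and ds.max_recall_minutes >= 60
            and ds.app_supports_cloud_api)

dr_copy = DataSet("DR replica", 200, 1, 240, True)
print(good_cloud_candidate(dr_copy))  # True: a typical disaster recovery copy fits well
```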

Backup is Busted – Live from VMworld 2014

We’re really excited for VMworld this week - once again one of the best opportunities to showcase our technologies and the ways we’re helping customers rethink how they protect VMware data, as well as other data types. At VMworld, we’ll be talking about how “Backup is Busted.” What does that mean? It means that batch backup was not designed for protecting virtual servers or other specialized, challenging data like large unstructured file content. When it comes to virtual servers and VMware data, we’ll be showcasing our vmPRO and DXi-V deduplication technology, which takes a unique approach to protecting VMs. Rather than the traditional, network- and resource-intensive approach of running batch backups in proprietary formats, vmPRO snapshots VMs in native format, first prepping them for maximum data reduction and network savings. vmPRO uses the native tools available in VMware instead of treating VMs like physical servers.
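
For readers who want to see what “using the native tools” can look like in practice, here is a minimal sketch that takes a quiesced VM snapshot through VMware’s public API using the open-source pyVmomi SDK. The vCenter address, credentials, and VM name are placeholders, and this is purely illustrative - it is not vmPRO code.

```python
# Minimal sketch: create a quiesced snapshot via VMware's native API (pyVmomi).
# Host, credentials, and VM name below are hypothetical placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Lab-only TLS context that skips certificate verification.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE

si = SmartConnect(host="vcenter.example.com",
                  user="administrator@vsphere.local",
                  pwd="password",
                  sslContext=context)
content = si.RetrieveContent()

# Walk the inventory for the VM we want to protect (name is a placeholder).
view = content.viewManager.CreateContainerView(
    content.rootFolder, [vim.VirtualMachine], True)
vm = next(v for v in view.view if v.name == "app-server-01")

# Take a quiesced snapshot with VMware's own tooling, rather than streaming
# a batch backup in a proprietary format.
task = vm.CreateSnapshot_Task(name="pre-backup",
                              description="native-format protection point",
                              memory=False,
                              quiesce=True)
print("Snapshot task started:", task.info.key)

view.DestroyView()
Disconnect(si)
```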

Data, Information and Going Native

Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model, attributed to American organizational theorist Russell Ackoff, outlining the relationship between data, information, knowledge, understanding, and wisdom (D-I-K-U-W for brevity’s sake). For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom” by Gene Bellinger, Durval Castro, and Anthony Mills.

MSP Momentum: Powered by Quantum Cloud Backup

In Newtonian mechanics, momentum has a direction as well as magnitude. If Newton was correct, and I am going to go out on a limb here and assume that to be the case, then the Powered by Quantum MSP Program has momentum, with positive direction and high magnitude. Over the past couple of weeks, Quantum has successfully created partnerships with a number of MSPs that deliver their own cloud backup service powered by Quantum technology. Just this month we added two new MSPs to the roster, Elanity Network Partner and Interconnekt. These partners, scattered across the globe, have recognized the benefits that Quantum solutions can bring not only to their customers but also to their bottom line.

Beyond the Marketing: What “reTHINK Your Backup and Archive” Really Means (PT.2)

Why do companies continue to store that data on their most expensive, highest-performance storage? A better approach – and a way companies can completely rethink backup and archive for unstructured data – is to employ tiered storage. Quantum is a specialist in designing tiered storage solutions for unstructured data – we’ve been doing it for years in the most demanding data environments like M&E, Government, and Oil and Gas. So we can use some of our core technologies and our core approach to design tiered storage solutions for the data center.
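
To make the tiered-storage idea concrete, here is a minimal sketch of a policy that demotes cold files from an expensive, high-performance tier to a lower-cost capacity tier. The mount points and the 90-day threshold are assumptions for illustration; real tiering products manage this with policies and transparent recall rather than ad hoc scripts.

```python
# Minimal sketch of an age-based tiering policy for unstructured data.
# Mount points and threshold are hypothetical; for illustration only.
import shutil
import time
from pathlib import Path

FAST_TIER = Path("/mnt/fast-tier")           # placeholder: expensive, high-performance storage
CAPACITY_TIER = Path("/mnt/capacity-tier")   # placeholder: lower-cost tier (object, tape, cloud)
AGE_THRESHOLD = 90 * 24 * 3600               # demote files not accessed in 90 days

def demote_cold_files() -> None:
    """Move files whose last access is older than the threshold to the capacity tier,
    preserving their directory layout so they remain easy to locate."""
    now = time.time()
    for path in FAST_TIER.rglob("*"):
        if path.is_file() and now - path.stat().st_atime > AGE_THRESHOLD:
            target = CAPACITY_TIER / path.relative_to(FAST_TIER)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(target))

if __name__ == "__main__":
    demote_cold_files()
```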

What Every CIO Needs to Know About Data Protection for Virtual Environments

IT departments today are rapidly deploying virtualization technologies – in fact, Gartner reports that server virtualization is already over 60% penetrated and projects the market will be over 80% penetrated by 2016. With this rapid rise in virtualization deployment, customers are challenged to incorporate data protection and archive methodologies for their virtualized data. ESG recently reported that 60% of virtualization technology users plan to address data protection challenges for their virtualized data as a top priority for 2013 – an astonishing number. There are certainly lots of options available to protect traditional data, but virtualized data is a different beast. Another Gartner report finds that over a third of organizations will change backup vendors due to factors including cost, complexity, or capability. Based on what I’ve heard talking with customers, I would add one more factor: compatibility. In this article we will explore all four of these areas and suggest ways to overcome the challenges associated with virtualized data protection.

Quantum’s DXi V1000 Wins Virtualization Product of the Year

When I learned that Quantum had picked up two honors at the 2012 Storage, Virtualization and Cloud Computing (SVC) Awards in London this year, my first thought was that no one carries off a tux quite like the British. Look how much fun they’re having! In contrast, most American males view the tux as a form of punishment, and we rent rather than buy them because, for many of us, our waistlines are likely to change between tux-worthy events. One of the SVC honors went to the DXi V1000 virtual deduplication appliance, which was named “Virtualization Product of the Year.” The other went to Quantum’s StorNext data management software in the Storage Software Appliance category. Sounds like it was quite a party.