It’s always nice to see objective, fact-based articles that help users make the right buying decisions. This article, published last week by Storage Switzerland, offers some great facts on how tape compares with other storage technologies in terms of data integrity, bandwidth, and even cost.
In the age of the exploding Galaxy S7 and Spotify cyber attacks, I’d like to take a moment to salute a workhorse in my personal technology stable: my iPod. Since 1998, this little beauty has been humming along on the same battery (!) through ski days, backpacking trips, trail runs, bike rides, and cross-country flights. It carries more than 4,000 hand-curated songs, sourced primarily from the (ahem) Boulder Public Library. It saves battery life on my iPhone and keeps me in music when I’m out of reach of Pandora or Spotify. Yes, the Amazon Echo is amazing, but the content on my old iPod still has value, and there’s still a place for it in my multi-tiered music strategy. You have to appreciate technology that just works.
Collection and analysis of large data sets is perennially hot. Remember Data Warehouses? ‘Big Data’ is just the latest buzzword for the trend. Admit it: it’s an alluring vision. Just save enough data, apply the right tools, and insight (and money) will rain from the clouds. Though the idea is frequently clothed in breathless hype, there is a kernel of truth here: you can find insight in rivers of data if you have the right tools. Organizations across a range of industries are successfully capturing and analyzing oceans of machine- and sensor-generated data with Splunk.
Many companies are getting caught up in the hype of moving to the cloud, only to discover hidden issues and costs in their initial pursuit. The public cloud offers many services of great value: software, storage, infrastructure, and more, and the development of these services has triggered a rush to the cloud. However, just because we can outsource these services doesn’t mean we always should, as a recent article noted.
When we set out to do a lab validation of the Artico active archive appliance with industry analyst ESG, it felt like we were entering somewhat uncharted territory. We’ve done plenty of lab validations with ESG before, primarily with various models of DXi, but Artico is a different animal: it occupies a different place in the data center, and it breaks with so many traditional approaches to data archiving that we had to wonder whether ESG would “get it.”
Last week the Active Archive Alliance announced the availability of a report titled “Active Archive and the State of the Industry.” The report is primarily an educational piece: it explains the data growth challenge IT organizations face today, defines archive characteristics, shows how active archives are implemented, and illustrates the resulting benefits.
On March 23rd, Storage Newsletter published an article reporting that LTO storage capacity shipped in 2015 increased by about 18% over the prior year. These figures are based on a report published by the LTO Consortium, which also indicated that more than 385,000 PB of total data capacity has been shipped since the introduction of LTO Ultrium cartridges in 2000.
OK, perhaps not as colorful as Shakespeare’s original phrase, but in today’s world of data and content proliferation the term “archive” has suffered tremendous abuse and misunderstanding. This would not be a problem for the reader if vendors and marketers of storage technology products and solutions did a better job of steering the marketplace with well-defined terms that truly mean what they sound like.
When we received word that Storage Magazine had named Artico a finalist in its 2015 Product of the Year Awards in the Backup Hardware category, it opened a bit of an existential debate. Backup Hardware? Isn’t there a more appropriate category for a NAS appliance designed to provide intelligent tiering that optimizes the performance and cost of archive storage? Apparently not, but that shouldn’t be a surprise. We knew at introduction that Artico was carving out a unique niche with a fresh approach to data retention and accessibility.
Often, when people use the term “archive” they mean many different, often erroneous, things:
- Some call any use of tape instead of disk an “archive,” as opposed to a “backup.”
- Some refer to “archiving” as the grooming or moving of data off primary storage.
- Some call any long-term retention mechanism (greater than a year) an archive, even if the copies of data were originally created by a backup application or by a simple drag-and-drop of a folder or other object.
Storing data is easy (well, not that easy). But turning it into meaningful business value requires technology partners that know how to integrate well and create something bigger. Customers want products that solve a real problem with a real solution. That’s the Quantum approach, and that’s why we’re so excited for this year’s VMworld. We’re “Kicking the Cartel” and helping companies break free from traditional, old-school ways of storing and protecting data. If it’s time for your organization to break free from paying $2,500 per TB for backup software, stop by the Quantum booth at VMworld. We’ll show you how to take a different approach to storage that centers on the highest performance and the lowest TCO. We’ll also be showcasing all of our technology at VMworld, including our QXS hybrid storage for VM primary storage and our Artico NAS appliance for archiving. Sign up to meet with the Quantum Storage Experts at the show, and you could win an Apple Watch too.
Today, everyone seems to understand the ever-growing importance of data protection, often viewing it as a superset of backup combined with snapshots and replication. Typically, a conversation about data protection includes the assumption of a “gold standard” centered on secondary disk for rapid recovery and tertiary tape for long-term retention. Of course, “the cloud” is also always a consideration as part of the next generation of the solution. It all still falls under the banner of “data protection” (DP): the collection of activities, methods, and media used to help recover or restore business information after a crisis or other IT disruption.

According to research, primary storage is growing around 40% annually, with secondary storage used for data protection growing at similar rates. Budgets aren’t growing nearly that fast. Meanwhile, IT organizations are being asked to do more (i.e., inject more agility, functionality, and resiliency into their operations) while spending as little budget money as possible. In actuality, data protection budgets are growing around 4.6% annually according to ESG research, but that level of increase won’t even let you keep doing what you have been doing at a larger scale. Therefore, you have to do something different. What you should do: ARCHIVE!
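To make that budget gap concrete, here is a minimal sketch of the compounding math, using only the two growth rates cited above (roughly 40% annual data growth vs. a 4.6% annual budget increase). The starting values of 100 TB and $100,000 are hypothetical placeholders for illustration, not figures from the research.

```python
# Illustrative projection of the gap between data growth and budget growth.
# Growth rates come from the figures cited above; the starting capacity and
# budget are hypothetical placeholders chosen for readability.

DATA_GROWTH = 0.40     # ~40% annual growth in data under protection
BUDGET_GROWTH = 0.046  # ~4.6% annual growth in data protection budgets


def budget_per_tb(years: int, data_tb: float = 100.0,
                  budget: float = 100_000.0) -> float:
    """Dollars of protection budget available per protected TB after `years`."""
    data = data_tb * (1 + DATA_GROWTH) ** years
    money = budget * (1 + BUDGET_GROWTH) ** years
    return money / data


if __name__ == "__main__":
    for year in range(6):
        print(f"Year {year}: ${budget_per_tb(year):,.2f} per TB")
```

Under these assumptions, the budget available per protected terabyte falls from $1,000 in year zero to roughly a quarter of that within five years, which is the arithmetic behind the argument that simply scaling up the existing approach cannot work.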