Your organization is collecting more and more data. Whether you are producing ultra-high-definition videos, analyzing human genomes, or using satellite imagery to study climate change, your growing data volumes are central to your work. Protecting and preserving that data must be a top priority.
I recently had the privilege of participating in this year’s annual iRODS users group meeting in Durham, NC. Aside from interacting with a great group of people, I solidified some of my views on the value companies can get from iRODS deployments, and really clarified how iRODS and Quantum solutions complement each other. I’m definitely excited that Quantum has joined the iRODS consortium and started product testing.
May and June have been an active time for Quantum’s technical workflow solutions team as we wind down from the Bio-IT World Conference in Boston and prepare for the ISC high-performance computing (HPC) show coming up later this month in Frankfurt.
Self-driving cars. Self-filling pantries and refrigerators. Been there, done that. Commercial flights to the moon. Coming soon. Cure for Zika virus. That would be awesome. Computers with personalities. Finding life on other planets. Scary. Global monetary system. Accurate weather prediction. Novel concepts. Pokémon GO. Yawn.
Many companies are getting caught up in the hype of moving to the cloud, only to discover hidden issues and costs in their initial pursuit. There are many services of great value in the public cloud: software, storage, infrastructure, and more, and the development of these services has triggered a rush to the cloud. However, just because we can outsource these services doesn’t mean that we always should, as noted in a recent article.
OK, perhaps not as colorful as Shakespeare’s original phrase, but in today’s world of data and content proliferation the term archive has suffered tremendous abuse and misunderstanding. This would not be a problem for the reader if vendors and marketers of storage technology products and solutions did a better job of steering the marketplace with well-defined terms that truly meant what they sounded like.
People and the companies they work for hoard data; it's a fact borne out in survey after survey. Hoarders are not always proud of their habit and are often curious about their options. Contrary to popular belief, in many cases it is OK to hoard data. Sometimes it is necessary, and often the data being saved can be of great value to the company. Getting clear on the purpose and requirements in your own organization will reveal best practices for maximizing the value of the content you keep with the greatest efficiency. There are four hoarder personas: Pacifist, Captive, Opportunist and Capitalist. Take a look and decide which of these best describes your situation, then get ideas on best practices and technologies to match.
I consider the attention industry analysts pay to emerging technologies to be an interesting barometer for the market. Not long ago I attended the Next Gen Storage Summit, where object storage was a key focus, and met with a long list of industry luminaries to discuss object storage and where it is headed. We had plenty of probing discussions about Lattus, along with observations about use cases across industries that stand to benefit from more cost-efficient, scalable and accessible storage. These analysts have also consistently echoed one sentiment: demand for capacity growth is real industry-wide, and a clear mix shift toward unstructured content is driving it.
Yes – you saw that correctly. The LTO Program Technology Provider Companies (of which Quantum is one of three) have published their updated road map, and it shows a stunning potential of 120 TB per single cartridge for generation 10 of LTO technology. That is 600 times the capacity released for the first generation of LTO technology. The road map announcement is timely as the IBC show takes place this week and the theme of that show is “Content Everywhere”. While IBC (International Broadcasting Convention) is a vertically oriented event (broadcast vertical), the theme is relevant across many industries. How many of you are not in the broadcast industry but are experiencing a huge swell in the amount of content under management in your own organization?
Contrary to popular belief, how you archive matters more than what or why you archive. For the broad market, the notion of non-archived data has become antiquated. Getting rid of old data means taking the time or investing in resources required to decide what data can be deleted, and most data managers do not feel comfortable making those decisions. So today virtually everything is being stored forever, generating huge repositories of data and content, and creating a great urgency to establish a data storage architecture that will thrive in this new “store everything forever” era.
Let’s face it: primary storage vendors love anyone who will keep infrequently accessed data on primary storage. These customers are like money in the bank, and the last thing a primary storage vendor wants is for those customers to wise up and break the chains that bind enterprises to their current storage investment model.
With LTFS, you can write data to tape without being tied to the application that wrote it there. Granted, backup applications provide a lot of value in terms of revision management, cataloging, etc., but many people I have spoken with are just looking for a simple way to take advantage of tape storage while keeping their data as independent as possible. LTFS is perfect for this. Now, back to the original question: using LTFS as an archive in the extra slots of a library. Not only is this a perfect use case for LTFS – using the open standard for digital asset storage and archive – it is also a great way to get the most out of a tape library by using some slots for traditional backup, and other slots for archive or active-archive storage.
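To make the independence point concrete, here is a minimal sketch. It assumes an LTFS volume has already been mounted (for example with the open-source `ltfs` utility); the mount point below is simulated with an ordinary temporary directory purely for illustration, since the key idea is that a mounted LTFS tape behaves like any other directory:

```shell
# Once an LTFS volume is mounted, the tape looks like a regular
# filesystem, so standard tools work and no backup application
# is needed to read the data back later.
# (A temp directory stands in for a real mount point such as /mnt/ltfs.)
MOUNT=$(mktemp -d)

echo "raw camera footage" > clip001.mxf   # sample asset
cp clip001.mxf "$MOUNT/"                  # archive it with plain cp
ls -l "$MOUNT"                            # browse the "tape" like a disk
cmp clip001.mxf "$MOUNT/clip001.mxf" && echo "archived copy verified"
```

Because the on-tape format is an open standard, any LTFS-capable system can mount the cartridge and read those files back, with no dependence on the software that originally wrote them.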
A couple of weeks ago I worked with our Big Data team to put together an Archiving and Tiered Storage webinar. One suggestion for the webinar title was “So Much Storage, So Little Time” because of all the technologies people need to think about when piecing together a comprehensive data storage strategy that often includes primary storage, backup storage and long-term archive storage. Indeed, there are many exciting technologies to consider, ranging from solid state, LTFS, object storage, intelligent tape vaulting, cloud-based backup and more. As technologies continue to develop, blurring the distinction between traditional use cases like backup and archive, it can be difficult to get clarity on the best strategy and the best technologies to meet your near-term and long-term data retention and archiving requirements. Rather than fueling the confusion profusion, we decided to offer some straightforward guidance by titling the webinar “4 Key Considerations for Archiving.” Here are a couple of tidbits from the webinar that are good reminders for anyone managing data growth and long-term storage.