Recently, Quantum has had some success displacing Oracle tape libraries with Quantum Scalar® tape libraries. Quantum’s focus on tape and continued investment in storage technologies—like our new Scalar i3 and Scalar i6 tape libraries with the best storage density, and the introduction of LTO-8 this quarter—have driven customers to switch and save costs.
Yes – you saw that correctly. The LTO Program Technology Provider Companies (of which Quantum is one of three TPCs) have published their updated road map, and it shows a stunning potential of 120 TB on a single cartridge for generation 10 of LTO technology. That is 600 times the capacity released for the first generation of LTO technology. The road map announcement is timely, as the IBC show takes place this week and the theme of that show is “Content Everywhere”. While IBC (International Broadcasting Convention) is a vertically oriented event (broadcast vertical), the theme is relevant across many industries. How many of you are not in the broadcast industry but are experiencing a huge swell in the amount of content under management in your own organization?
Contrary to popular belief, how you archive matters more than what or why you archive. For the broad market, the notion of non-archived data has become antiquated. Getting rid of old data means investing the time and resources required to decide which data can be deleted, and most data managers do not feel comfortable making those decisions. So today virtually everything is being stored forever, generating huge repositories of data and content, and creating a great urgency to establish a data storage architecture that will thrive in this new “store everything forever” era.
Let’s face it: primary storage vendors love anyone who will keep infrequently accessed data on primary storage. These customers are like money in the bank, and the last thing a primary storage vendor wants is for those customers to wise up and break the chains that bind enterprises to their current storage investment model.
I read a good article in SearchDataBackup recently covering an interview Sarah Wilson conducted with Jon Toigo on LTFS (Linear Tape File System). LTFS is an open standard technology that allows you to use tape like NAS – drag and drop files to and from the tape, quickly access them from a directory on your screen, easily exchange them between different operating systems and software, etc. The interview provides a good overview of LTFS and where it’s being used, and Toigo also shoots down some of the misconceptions about tape that I often hear.
The NAB Show in Las Vegas is a super exciting event. Just getting to see massive 4K HD screens and the way in which the broadcasting industry is pushing data storage technology is really awesome and fun. At the show, I was talking to a small post production company about our Scalar LTFS and Scalar library solutions for data storage in their workflow, and the editor’s comment was, “This is perfect. This will let us go tapeless.” I totally did a double-take before realizing that he was talking about analog tape. It’s kind of interesting to realize that in the M&E space, they don’t even think about LTO technology as “tape” – which in the IT world is a word that carries a lot of baggage. Instead, they view it like it should be viewed across the industry – as a really good (the best?) storage medium for long-term storage.
With LTFS, you can write data to tape without being tied to the backup application that wrote it there. Granted, backup applications provide a lot of value in terms of revisions management, cataloging, etc., but many people I have spoken with are just looking for a simple way to take advantage of tape storage and keep their data as independent as possible. LTFS is perfect for this. Now, back to the original question: using LTFS as an archive in the extra slots of a library. Not only is this a perfect use case for LTFS – using the open standard for digital asset storage and archive – it is also a great way to get the most out of a tape library by using some slots for traditional backup, and other slots for archive or active-archive storage.
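Because an LTFS-mounted cartridge presents itself as an ordinary filesystem, archiving a file really is just a copy – no backup application in the path. Here is a minimal Python sketch of that idea; it uses a temporary directory to stand in for an LTFS mount point (a path like /mnt/ltfs is hypothetical and depends on how you mount the cartridge), but on a real system the same code would run unchanged against the actual mount:

```python
import shutil
import tempfile
from pathlib import Path

# Stand-in for an LTFS mount point (e.g. a hypothetical /mnt/ltfs).
# Once a cartridge is mounted via an LTFS driver, it behaves like any
# other directory, so a temp dir is a faithful stand-in for the demo.
archive_root = Path(tempfile.mkdtemp())

def archive_file(src: Path, dest_dir: Path) -> Path:
    """Copy a file into the archive tree, preserving timestamps."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.copy2(src, dest)  # plain file copy; no backup catalog involved
    return dest

# Drop a project file into a dated archive folder on the "tape".
src = Path(tempfile.mkdtemp()) / "final_cut.mov"
src.write_bytes(b"demo payload")
copied = archive_file(src, archive_root / "2013" / "projectX")

# Reading it back is just as simple: browse the directory, open the file.
print(sorted(p.name for p in (archive_root / "2013" / "projectX").iterdir()))
```

The point of the open format is exactly this independence: any operating system with an LTFS driver can browse the directory tree and read the files back, with no proprietary software required.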
A couple of weeks ago I worked with our Big Data team to put together an Archiving and Tiered Storage webinar. One suggestion for the webinar title was “So Much Storage, So Little Time” because of all the technologies people need to think about when piecing together a comprehensive data storage strategy that often includes primary storage, backup storage and long-term archive storage. Indeed, there are many exciting technologies to consider, ranging from solid state, LTFS, object storage, intelligent tape vaulting, cloud-based backup and more. As technologies continue to develop, blurring the distinction between traditional use cases like backup and archive, it can be difficult to get clarity on the best strategy and the best technologies to meet your near-term and long-term data retention and archiving requirements. Rather than fueling the confusion, we decided to offer some straightforward guidance by titling the webinar “4 Key Considerations for Archiving.” Here are a couple of tidbits from the webinar that are good reminders for anyone managing data growth and long-term storage.