Contrary to popular belief, how you archive matters more than what or why you archive. For the broad market, the notion of non-archived data has become antiquated. Getting rid of old data means investing the time and resources required to decide what can safely be deleted, and most data managers are not comfortable making those decisions. So today virtually everything is stored forever, generating huge repositories of data and content and creating real urgency to establish a data storage architecture that can thrive in this new “store everything forever” era.
This article originally appeared on Wired Magazine’s Innovation Insights. With the start of the new year, it’s time once again for those of us in enterprise storage to look ahead and offer our predictions for what the industry will see in 2014. So without further ado, here are ten trends that will have a big impact in the coming year.
I know a lot of folks think the big contest this time of year is the Super Bowl playoffs. In Quantum’s Denver, Bay Area and Seattle offices we’re sporting the colors of the Broncos, 49ers and Seahawks, with just a bit of friendly trash talk to kick off conference calls. Perhaps you know someone rooting for New England – I don’t. But if you care about data storage, the other big contest is Storage Magazine’s Product of the Year Awards. The award serves as an annual reminder of what the storage community found important, promising, and profitable. This year’s award finalists include a cross-section of Quantum products spanning scale-out shared storage and the data center, highlighting the breadth of innovation from the company over the last year. For 2013, four Quantum products – more than any other vendor among the finalists – have been selected in three award categories.
Let’s face it: primary storage vendors love anyone who will keep infrequently accessed data on primary storage. These customers are like money in the bank, and the last thing a primary storage vendor wants is for those customers to wise up and break the chains that bind enterprises to their current storage investment model.
Video has long been a staple of sports—it’s used for scouting, game prep, recruiting and promotion, as well as broadcasting—but two factors are making it different now. One is the technology for analyzing what’s on video. One of the biggest trends in sports is motion analysis, which combines specialized analytics applications with video to help players improve their performance by breaking down and evaluating their movements. Alongside motion analysis, there are now programs that can comb through years of footage to find trends based on very sophisticated variables—specific players, coaches, and situations—and extract clips so that coaches and players can analyze tendencies and put together a game plan.
Astronomers searching for life outside of our solar system speak of The Goldilocks Zone – the region around a star where conditions are suitable for sustaining life: not too close and hot, and not too distant and cold. Initially these “just right” conditions appeared to be almost impossibly rare, but researchers over the years have found organisms that can exist in more conditions than previously imagined. It turns out that the Goldilocks Zone is wider than we thought, increasing the possibility of finding other planets capable of sustaining life. Today a similar recognition is happening in data centers. While IT has long thought of data storage as “hot” and requiring immediate access in flash memory or primary disk, or “cold” and suitable for backup and archive to tape, there weren’t many choices for a “warm” tier of data that required a more nuanced cost/latency balance. The expanding range of choices such as public and private cloud, object storage and LTFS tape has in effect created a wider Goldilocks Zone for data centers. The refreshed thinking about the capabilities of both established and emerging technologies for these different tiers of storage has been getting a lot of attention lately.
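To make the hot/warm/cold trade-off concrete, here is a minimal sketch of how one might pick the cheapest storage tier that still meets an access-latency requirement. The tier names, latency ceilings, and per-GB prices below are purely illustrative assumptions, not actual Quantum or market figures.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    max_latency_ms: float      # worst-case time to first byte
    cost_per_gb_month: float   # illustrative pricing only

# Hypothetical tiers spanning the "Goldilocks Zone" described above.
TIERS = [
    Tier("hot (flash/primary disk)", 10, 0.20),
    Tier("warm (object store/cloud)", 1_000, 0.03),
    Tier("cold (LTFS tape archive)", 60_000, 0.01),
]

def cheapest_tier(required_latency_ms: float) -> Tier:
    """Return the lowest-cost tier whose latency still satisfies the requirement."""
    candidates = [t for t in TIERS if t.max_latency_ms <= required_latency_ms]
    return min(candidates, key=lambda t: t.cost_per_gb_month)
```

A workload that can tolerate a few seconds of delay lands in the warm tier, while archival data that can wait a minute drops to tape, which is the cost/latency balancing act the passage describes.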
Few things come along that alter the world of filesystems and make them exciting, especially for folks in the Media and Entertainment industry, where distractions abound, turnaround schedules are tight, and budgets are tighter. That’s why Quantum dove deep into its treasured StorNext product to revolutionize and reinvent what can be considered the most modern and developed shared filesystem to date: StorNext 5.
With the introduction of the cloud there has been a lot of talk, and more than a few jokes, about how to get started in the cloud. We see customers all the time trying to figure out what they can do from a cloud strategy perspective and how it will affect their current infrastructure, positively or negatively, particularly when it comes to budgets. The cloud certainly can provide some financial relief, freeing you and your team to focus on more strategic projects, so why not get started with cloud technologies, especially for backup? Quantum recently announced a cloud-based backup program for MSPs and VARs that delivers a number of real benefits. Read on to learn more.
I recently read a good article in SearchDataBackup covering an interview Sarah Wilson conducted with Jon Toigo on LTFS (Linear Tape File System). LTFS is an open-standard technology that lets you use tape much like NAS: drag and drop files to and from the tape, quickly access them from a directory on your screen, easily exchange them between different operating systems and software, and so on. The interview provides a good overview of LTFS and where it’s being used, and Toigo also shoots down some of the misconceptions about tape that I often hear.
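Because LTFS presents a mounted tape as an ordinary directory, everyday file operations are all that’s needed to write to it. The sketch below assumes a tape has already been mounted (for example, with the `ltfs` utility) at some path; the function name and paths are illustrative, not part of any LTFS API.

```python
import shutil
from pathlib import Path

def copy_to_tape(source: Path, ltfs_mount: Path) -> Path:
    """Archive a file onto an LTFS-mounted tape volume.

    LTFS exposes the tape as a regular filesystem, so a plain copy
    suffices; the LTFS driver handles the tape I/O underneath.
    """
    dest = ltfs_mount / source.name
    shutil.copy2(source, dest)  # copies data and timestamps, like drag-and-drop
    return dest
```

The same code works whether `ltfs_mount` points at tape, disk, or NAS, which is exactly the interchangeability the article highlights.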
As the volume of data has increased, the way companies use and access that data has shifted, and it’s time to change the way you think about data protection, retention, and accessibility. Organizations of all sizes recognize that data can help them gain competitive advantages and even support new revenue streams, but doing so places new demands on IT to store data and preserve access to it. Companies need new solutions and technologies that support unpredictable, on-demand access and incorporate new approaches to backup and archiving. It’s time to reTHINK Backup & Archive.
Since we announced our next generation StorNext 5 Appliances three weeks ago, we’ve been getting requests for more background about how we’ve achieved such significant increases in performance, scalability and flexibility. To dive deeper into how we built StorNext 5 from the ground up, I’ve tapped Skip Levens, director of technical marketing, to detail some of its core design features.
The creation and acquisition of massive amounts of content has become easier than ever. With the introduction of new digital acquisition technologies (from video cameras to sensors) and increasingly sophisticated data analysis tools, the way we handle and save our data is changing. The true value of information evolves over time: combining real-time and historical data can reveal unexpected results, and old video footage can be digitized and compiled from archives to recapture a moment that once seemed insignificant. For businesses that rely on data to identify trends or repurpose content for monetization, there is a need to keep all of it forever.
A federal jury in Seattle recently ruled for Microsoft in a patent dispute with Google’s Motorola Mobility division, closing off a summer in which patents have been a hot topic. The continuing Apple-Samsung battle has attracted a lot of attention, and President Obama’s proposals for cracking down on patent trolls are being followed closely in the technology, legal and VC communities. It’s the issue of patent trolls that I want to focus on here. These are companies that exist solely for the purpose of buying patents and then suing others for infringing on “their” technology. A few months ago, Quantum had a resounding legal victory against a patent troll, and it’s a good example of how absurd these lawsuits can be.