Quantum is ringing the closing bell at the New York Stock Exchange today in celebration of its 35th anniversary this year, so I’ve been thinking recently about the history of the storage business. It’s been a very interesting ride.
Most extraordinary is the degree to which the role of data storage has changed. Data and data storage have moved from the edge of the data processing universe to being firmly at the core.
The introduction of the IBM 305 RAMAC in 1956 started the digital storage business. This refrigerator-sized box, holding 5 million characters (roughly 5 MB) of storage, initiated the first era of data storage. In that first era, processing was king, and disk served as temporary paging space for memory, giving the processor enough working storage to complete a task at an affordable price. Persistent data was physical: data entered the system on punch cards and left it on hardcopy printouts. Digital data on storage was a necessary, transient evil.
This storage model started to change in the early 1980s with volume shipments of Winchester disk systems. At more than $100,000 per GB, and still built on giant, phonograph-record-sized platters, disk storage remained a scarce and expensive resource, but it was now affordable enough to hold the most valuable corporate data persistently, ushering digital storage into its second era. In this second phase, disk and tape held customers’ data long enough for it to be re-read and re-used by multiple core applications.
Fifteen years later, the second era of data storage had birthed a multi-billion-dollar hardware and software industry. With continuous density improvements driving the price of hard drives down to $1/GB (and tape costs even lower), persistent storage was affordable for everyone. Data availability became more critical, and with engineers across the industry projecting continued price-performance improvements of 60% per year, the storage industry innovated furiously to make data more accessible and available through networking – connecting users, applications, files, backups and replicas in shared storage networks. Data storage grew into a major line item in the IT budget and a topic of conversation in the CIO’s office. But digital storage was still described as a “peripheral” device in the universe of network-centric IT.
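That trajectory is easy to sanity-check. Here is a back-of-the-envelope sketch, assuming “60% per year” means the cost per GB falls by 60% annually, and using the $100,000/GB and $1/GB figures from the text:

```python
# Back-of-the-envelope check: how many years does a 60%-per-year price
# decline take to go from $100,000/GB (early-1980s Winchester systems)
# to $1/GB? The dollar figures come from the article; reading "60% per
# year" as cost per GB falling 60% annually is an assumption.

def years_to_reach(start_cost, target_cost, annual_decline=0.60):
    """Count whole years until cost per GB drops below target_cost."""
    cost, years = start_cost, 0
    while cost > target_cost:
        cost *= (1.0 - annual_decline)  # each year, price keeps 40% of its value
        years += 1
    return years

print(years_to_reach(100_000, 1))
```

Under these assumptions the function returns 13 – a 100,000x price drop compounds away in about thirteen years, roughly matching the article’s fifteen-year span from early-1980s prices to $1/GB.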
Nevertheless, hints of the future were already sneaking into the conversation. In the ongoing debate between IT and users over who owned the future of business processing, the battle was never about application or even network control: it was always a disagreement over who got to own the data. And there was more data to argue about. With continued gains in density and affordability, storage enabled applications no one had even considered, expanding from financials and productivity software to personal files and content stored across data centers, personal computers, smart devices and the web.

Then, in the 21st century, something really interesting happened. Business owners discovered that, somewhat paradoxically, the more data they kept, the more valuable it could be. Data began to drive the business. This realization has led us to the third era of data storage – the era of data centricity. Data is no longer about supporting the business; data now is the business, in industries from financial services to retail to media and entertainment.
In this third era, data from customer and supplier interactions provides insight that leads to strategic advantage. A late-2013 study by Bain found that top-performing companies were almost twice as likely as other firms to have made “data-driven” decisions. As a result, smart enterprises are storing more data – and storing it longer – to analyze information in the context of prior events.
Data collected and stored from machines is being used to change the world we live in. Companies like Nest crowdsource data on customer behavior, store it, analyze it and use it to proactively adjust the temperature in individual homes for optimum comfort.
The ability to keep previously created data from experiments in genomics and seismic processing allows researchers and engineers to discover new realities about the world we live in – by comparing data that may have been collected years ago with the new insights of today.
And finally, data itself is being sold (“monetized”). This is not a new concept in industries like media and entertainment, where content (video and images) has been sold for years, but the sale of content has broadened. If you are a media creator, you no longer need a specialist (broadcaster) to take your content to market. Increasingly, content creators and owners like UFC, vice.com and Lynda.com are taking their content to market themselves via the internet. This has spurred a flowering of innovation and growth in new video-based communication, entertainment and education. And monetization has moved beyond the realm of media and entertainment. New companies – small and large – are collecting data about customers, behaviors, markets and even nature, which they then resell to other companies. In the latest example of this trend, IBM and the parent company of The Weather Channel have teamed up to sell real-time and forecast weather information to other companies so they can avoid productivity losses due to bad weather.
In this third era, data is at the center of the universe. Data storage has gone from being a peripheral playing a supporting role to being the belle of the IT ball. It’s an exciting time (again) to be in the data storage industry, particularly for Quantum because of the central role our StorNext scale-out storage solutions have long played in enabling customers to organize, protect and manage their data so they can leverage it for strategic advantage. As Quantum commemorates its 35th anniversary, we look forward to helping more organizations maximize the value of their data.