Almost daily there is a story about the demand for body-worn video devices to be used by law enforcement agencies across the globe. There is also a lot of discussion about the Federal funding available to US police departments for this new camera technology - in May this year the Justice Department announced $20 million in grants, toward the $75 million the Obama administration requested over three years. What this funding doesn't cover - and the biggest challenge agencies face in implementation - is how to build out a storage infrastructure to manage and protect the vast amount of data these devices produce. This challenge is compounded by new devices that support higher resolutions and are used by increasing numbers of officers per department. There is a solution that can help departments across the globe speed the adoption of this valuable technology. Quantum's Wayne Arvidson, Vice President of Surveillance Solutions, recently sat down with Tom Temin on his “Federal Drive” program for Federal News Radio to discuss how to solve the challenges of body-worn devices.
If you were building a police department from the ground up, where would you begin? Where do you go to stock up on holsters, handcuffs, badges, flashlights, guns, dispatch centers, in-car computers, police cars and the myriad other gear required by modern law enforcement? A good place to start is the Police Security Expo, held this year in Atlantic City. It's like a superstore for police. And Quantum was there, because police departments increasingly need to include storage on their shopping list. This was my chance to check out the latest on-body cameras that have been in the news so much lately. Think of something about the size of a GoPro, but even more heavy-duty, and sophisticated enough to begin recording 10 seconds before you press Record. The vendors selling these cameras typically had a good crowd of officers learning what it's like to live with them on a daily basis, and the question of storage always came up. That's no surprise, because law enforcement agencies are routinely generating over 1PB of data a year.
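That "record the 10 seconds before you press Record" trick is typically done with a pre-event ring buffer: the camera continuously captures frames into a fixed-size rolling window, and pressing Record promotes that window into the saved clip. Here is a minimal sketch of the idea in Python; the class name, frame rate, and buffer length are illustrative assumptions, not details of any vendor's firmware.

```python
from collections import deque

FPS = 30          # assumed frame rate
PRE_SECONDS = 10  # seconds of footage retained before Record is pressed

class PreRecordBuffer:
    """Hypothetical sketch of pre-event buffering: keep a rolling window
    of recent frames so that pressing Record also captures the 10 seconds
    leading up to the event."""

    def __init__(self, fps=FPS, pre_seconds=PRE_SECONDS):
        # deque with maxlen silently discards the oldest frame when full
        self.buffer = deque(maxlen=fps * pre_seconds)
        self.recording = False
        self.clip = []

    def on_frame(self, frame):
        if self.recording:
            self.clip.append(frame)          # saving live footage
        else:
            self.buffer.append(frame)        # oldest frames fall off the back

    def press_record(self):
        # Promote the buffered pre-event window into the clip, then go live.
        self.clip = list(self.buffer)
        self.recording = True

# Simulate 400 frames of idle footage, then press Record for 50 more frames.
cam = PreRecordBuffer()
for i in range(400):
    cam.on_frame(i)
cam.press_record()
for i in range(400, 450):
    cam.on_frame(i)
print(len(cam.clip))  # 300 buffered pre-event frames + 50 live frames = 350
```

At 30 fps and 1080p, that rolling window alone is hundreds of megabytes of raw frames, which hints at why storage dominates the total cost of these programs.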
If you've worked in storage for decades as I have, you've heard all the debates about which storage works best for each step in media workflows. But one thing that's clear is that not every step has the same storage requirements, and that some kind of tiered storage strategy is needed. With ever-expanding digital asset libraries, storing everything on high-performance disk isn't practical or cost-effective. Traditional tiered storage is straightforward: store the most active, most recently used data on the fastest, most expensive disk storage, and store the less active, older data on slower, less expensive storage, generally tape or lower-cost disk arrays. Hierarchical storage management (HSM) software was built to automate the data migration between tiers and make file system access transparent regardless of where the data is stored. When the primary storage filled to a capacity watermark, for example 90% of capacity, the HSM system would find the files that were least recently used and move them to the secondary tape tier until the disk storage had sufficient available capacity. This model of tiered storage was built for business data, where the focus was containing costs. Disk storage was expensive, tape was cheap, and older business data was rarely relevant except for an occasional audit. The criterion was simply performance versus cost. But media workflows don't manage business data. Here are the 3 biggest considerations for developing a new approach to workflow storage.
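The watermark-driven migration described above is simple enough to sketch. The following Python snippet is an illustrative toy, not any HSM product's actual logic: when the primary tier crosses a high watermark, it moves least-recently-used files to the secondary tier until usage falls below a low watermark. The watermark values and the `migrate_lru` helper are assumptions for the sake of the example.

```python
import os
import shutil

HIGH_WATERMARK = 0.90  # start migrating at 90% full, as in the example above
LOW_WATERMARK = 0.75   # assumed stop point once enough capacity is freed

def migrate_lru(primary, secondary, capacity_bytes):
    """Toy HSM-style migration sketch: when the primary tier exceeds its
    high watermark, move least-recently-used files to the secondary tier
    until usage drops below the low watermark. Returns the moved names."""
    files = [os.path.join(primary, name) for name in os.listdir(primary)]
    files = [f for f in files if os.path.isfile(f)]
    used = sum(os.path.getsize(f) for f in files)

    if used / capacity_bytes < HIGH_WATERMARK:
        return []  # below the watermark; nothing to do

    # Oldest access time first = least recently used.
    files.sort(key=os.path.getatime)

    moved = []
    for f in files:
        if used / capacity_bytes < LOW_WATERMARK:
            break
        used -= os.path.getsize(f)
        shutil.move(f, os.path.join(secondary, os.path.basename(f)))
        moved.append(os.path.basename(f))
    return moved
```

A real HSM additionally leaves a stub or symlink behind so the file system namespace stays intact and reads from the secondary tier are transparent to applications; this sketch only shows the eviction policy.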
Contrary to popular belief, how you archive matters more than what or why you archive. For the broad market, the notion of non-archived data has become antiquated. Getting rid of old data means taking the time and investing the resources required to decide what data can be deleted, and most data managers do not feel comfortable making those decisions. So today virtually everything is being stored forever, generating huge repositories of data and content, and creating great urgency to establish a data storage architecture that will thrive in this new “store everything forever” era.
A couple of weeks ago I worked with our Big Data team to put together an Archiving and Tiered Storage webinar. One suggestion for the webinar title was “So Much Storage, So Little Time” because of all the technologies people need to think about when piecing together a comprehensive data storage strategy that often includes primary storage, backup storage and long-term archive storage. Indeed, there are many exciting technologies to consider, ranging from solid state, LTFS, Object Storage, intelligent tape vaulting, cloud-based backup and more. As technologies continue to develop, blurring the distinction between traditional use cases like backup and archive, it can be difficult to get clarity on the best strategy and the best technologies to meet your near-term and long-term data retention and archiving requirements. Rather than adding to the confusion, we decided to offer some straightforward guidance by titling the webinar “4 Key Considerations for Archiving.” Here are a couple of tidbits from the webinar that are good reminders for anyone managing data growth and long-term storage.