Potentially career-limiting self-confession – I’m not a Patriots fan. Even after living in Boston for a number of years, I have yet to be in a situation where I have wanted the Patriots to win a single NFL football game. Perhaps it is my natural inclination to support those things in life categorized as […]
More than ever, our culture today seems to be a land of extreme viewpoints. You’re either a Liberal or a Conservative, either a Leaver or a Remainer, either a climate change believer or someone who thinks the whole thing is one vast conspiracy, and so on. Perhaps it is the age we live in, with various algorithms designed to highlight […]
Data is the Most Valuable Asset of the Digital Economy — Better Protect it with Xcellis Scale-out NAS
The quote “Data is to this century what oil was to the last” is one that profoundly resonates with me. In the early 20th century, it was oil that powered the new machines that reshaped both the world economy and the political landscape. Fast-forward to the second decade of the 21st century, and the parallels are readily apparent – data is changing how we work, how we live, and just as importantly, the kinds of products and services organizations are now bringing to market.
I recently had the privilege of participating in this year’s annual iRODS users group meeting in Durham, NC. Aside from interacting with a great group of people, I solidified some of my views on the value companies can get from iRODS deployments, and really clarified how iRODS and Quantum solutions complement each other. I’m definitely excited that Quantum has joined the iRODS consortium and started product testing.
Rapid advances in laboratory instruments are redefining IT requirements for life science organizations. To analyze the tremendous volumes of data produced by next-generation sequencing (NGS) and cryo-electron microscopy (cryo-EM) instruments more quickly and completely, organizations are increasingly investing in additional supercomputing resources to crunch the larger data sets these instruments generate on a daily basis.
Transitioning to an All-IP Workflow but Concerned about 4K Streaming Performance? New Test Results May Surprise You.
Like most video production and post-production studios, your organization is probably implementing new tools to meet rising demand for finished content in multiple, high-resolution formats. Perhaps you are upgrading software or buying powerful new editing platforms that can manipulate and master such large files. However, to ensure that your workflow doesn’t have any bottlenecks in streaming 4K content, especially in an all-IP storage environment, understanding what kind of performance your storage infrastructure can deliver should be a high priority.
It’s now been two months since a very rewarding NAB conference, and as the relentless pace of digital transformation in the M&E industry continues to accelerate, Quantum hasn’t taken its proverbial foot off the gas either.
Six Requirements to Ensure Your Storage Environment is Ready to Handle the Rising Demand for Corporate Video
Requests for new video content are multiplying fast. Your company’s marketing group needs you to create product videos and customer success stories to support the next product launch. The events team wants you to post executive keynote addresses from an upcoming conference—preferably within a day after each talk is given. And as your company expands its salesforce, you need to produce a new series of training videos to bring team members up to speed.
I recently found myself in need of a new vehicle. It wasn’t necessarily because my old one was breaking down or in bad shape. My situation had simply changed over time such that the car I had no longer met my needs. In my case, I needed more room for a growing family and wanted better gas mileage. Unfortunately, the car wasn’t designed to make those improvements easily or inexpensively. It also started to cost me a lot more to maintain. This got me thinking about the similarities with legacy scale-out NAS solutions.
You may have heard of “High Value Workloads,” but wondered what that actually means. Simply put, they are environments where the data is either being used for strategic decision making for the company on a consistent basis, or, as is often the case, data IS the product itself.
Today, everyone seems to understand the ever-growing importance of data protection, often viewing it as a superset of backup combined with snapshots and replication. Typically, a conversation about data protection includes the assumption of a “gold standard” centered on using secondary disk for rapid recovery and tertiary tape for long-term retention. Of course, “the cloud” is also always a consideration as part of the next generation of the solution. It’s still all under the banner of “data protection” (DP), the collection of activities, methods, and media used to help recover or restore business information after a crisis or other IT disruption.

According to research, primary storage is growing around 40% annually, with secondary storage used for data protection growing at similar rates. Budgets aren’t growing nearly that much. Meanwhile, IT organizations are being asked to do more (i.e., inject more agility, functionality, and resiliency into their operations) while spending as little budget money as possible. In actuality, data protection budgets are growing around 4.6% annually according to ESG research, but that level of increase won’t even let you keep doing what you have been doing at a larger scale.

Therefore, you have to do something different. What you should do: ARCHIVE!
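To see why a 4.6% budget increase can’t keep pace, it helps to compound the two growth rates quoted above side by side. This is a minimal illustrative sketch (the 40% and 4.6% figures come from the text; the five-year horizon is an assumption for illustration):

```python
# Compound a 40% annual data-growth rate against a 4.6% annual
# data-protection budget increase (figures quoted in the text above).
years = 5
data = 1.0    # normalized data volume at year 0
budget = 1.0  # normalized DP budget at year 0

for year in range(1, years + 1):
    data *= 1.40      # data grows ~40% per year
    budget *= 1.046   # budget grows ~4.6% per year
    print(f"Year {year}: data x{data:.2f}, budget x{budget:.2f}, "
          f"gap x{data / budget:.2f}")
```

After five years, data has grown more than fivefold while the budget is up only about 25%, leaving a gap of roughly 4x – which is the arithmetic behind the “do something different” conclusion.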
Video editing has always placed higher demands on storage than any other file-based application, and with today’s higher resolution formats, streaming video content demands even more performance from storage systems, with 4K raw requiring 1210 MB/sec per stream—7.3 times more throughput than raw HD. In the early days of non-linear editing, this level of performance could only be achieved with direct attached storage (DAS). As technology progressed, we were able to add shared collaboration even with many HD streams. Unfortunately, with the extreme demands of 4K and beyond, many workflows are resorting to DAS again, despite its drawbacks. With DAS, sharing large media files between editors and moving the content through the workflow means copying the files across the network or on reusable media such as individual USB and Thunderbolt-attached hard drives. That’s not only expensive because it duplicates the storage capacity required; it also diminishes user productivity and can break version control protocols. In this blog, we'll look at the key differences between major storage technologies as well as general usage recommendations.
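The per-stream figures above make it easy to estimate aggregate storage throughput for a team of editors. This is a rough sketch, not a sizing tool: the 1210 MB/sec and 7.3x numbers come from the text, the per-HD-stream figure is derived from that ratio, and the four-editor scenario is an assumption for illustration:

```python
# Per-stream throughput figures quoted in the text above.
RAW_4K_MBPS = 1210                 # MB/sec per raw 4K stream
RAW_HD_MBPS = RAW_4K_MBPS / 7.3    # ~166 MB/sec per raw HD stream (derived)

def aggregate_throughput(streams: int, per_stream_mbps: float) -> float:
    """Total sustained storage throughput needed for N concurrent streams."""
    return streams * per_stream_mbps

# Hypothetical shop: four editors, each pulling one raw 4K stream.
print(aggregate_throughput(4, RAW_4K_MBPS), "MB/sec")  # 4840 MB/sec
```

A real deployment would also need headroom for writes, rendering, and ingest, but even this back-of-the-envelope number shows why sustained 4K streaming stresses shared storage.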