As the old adage goes, if you fail to plan, you’re planning to fail – and when you’re making strides in uncharted territory, it can be difficult to develop a winning strategy off the bat. Lyft recently learned that lesson in a very expensive way. Documents released for its IPO filing show that […]
This is the story of Rook, an open source project built by many hands, fostered by Quantum Corporation developers, and accepted this week into the Cloud Native Computing Foundation, where it can grow and deliver solid storage management capabilities to a much broader community.
On January 12th, we announced our new solution for offsite DR storage using Amazon’s cloud infrastructure – Q-Cloud Protect for Amazon Web Services. There’s a great article about it in eWeek here. Offering a product that runs in Amazon’s cloud infrastructure is a brand new type of offering for us, and I’ve learned a lot over the past year – and even in the past few weeks – speaking with customers, partners and analysts about this new product. I’ve spent time with firms of all sizes across many industries: private and public universities, small financial services firms, multi-national distributors, government agencies and contractors, and more. Here are some takeaways...
Quantum is ringing the closing bell at the New York Stock Exchange today in celebration of its 35th anniversary this year, so I’ve been thinking recently about the history of the storage business. It’s been a very interesting ride. Most extraordinary is the degree to which the role of data storage has changed. Data storage has moved from the edge of the data processing universe to being firmly at the core – from a peripheral playing a supporting role to the belle of the IT ball. It’s an exciting time (again) to be in the data storage industry, particularly for Quantum because of the central role our StorNext scale-out storage solutions have long played in enabling customers to organize, protect and manage their data so they can leverage it for strategic advantage. As Quantum commemorates its 35th anniversary, we look forward to helping more organizations maximize the value of their data.
As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider five key factors in selecting and migrating data, including ongoing data transfer volume, expected frequency of data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are site disaster recovery, data center off-site copy (for backup), compliance archive, and remote site primary, archive and backup, along with ongoing management. Let’s start by looking at central site disaster recovery.
I recently had the pleasure of participating in the launch of Quantum’s new cloud-related services and products – Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. I participated by joining some panel discussions with Quantum’s Geoff Stedman, Senior Vice President, StorNext Solutions, and Bassam Tabbara, Executive Director, Cloud Services. One of the overriding themes in our panel discussion was the idea of putting data where it makes the most sense from a cost and performance perspective. If this sounds a lot like the good (well, not really) old days of hierarchical storage management (HSM), that’s because the basic concept is the same. One of the differences today is that we now have the cloud as another tier in the storage hierarchy.
As ubiquitous as the cloud is today, it has taken time to get there. Even in enterprise IT, you wouldn’t characterize the adoption of cloud services as universal and all-encompassing. Functions that now seem obvious and mundane, like cloud-based IT storage backup and cloud-based virtualized software development, went through periods of careful ROI analysis. So it is with cloud for media and entertainment. If you feel like you’re behind the curve in leveraging cloud in your M&E operations, you’re not alone, and it comes down to two factors: ecosystem and workflow.
Businesses of all sizes are increasingly deploying data to the cloud, driven by the promise of greater agility, lower management costs and capital savings. It just makes sense. When both compute and data move together – in lockstep – to the cloud, the issues to be considered are very similar to deploying or migrating an onsite application. But when the major compute operations stay onsite and only the data moves offsite (such as for backup, disaster recovery or compliance archive), the deployment can be more complex. In this scenario, operational executives must consider five key issues to ensure a successful experience – including meeting customer service level agreements (SLAs) and staying within budget. Based on the use case you are planning, here is how to weigh these five issues.
The enterprise has been at the center of IT innovation for many decades. With an emphasis on reliability, business continuity, security and ROI, the enterprise has challenged software and hardware vendors to continuously innovate to meet such high demands. Today, we are announcing three more cloud offerings: Q-Cloud Archive, Q-Cloud Vault and Q-Cloud Protect for AWS. We believe that the industry is witnessing a transformational shift in enterprise IT, and we are excited to enable customers to easily and seamlessly add cloud storage to their existing storage and collaboration workflows.
As we approach the close of the year, it’s natural to take a moment to reflect on the year’s events. This year, there’s more than ever to appreciate. Whether we consider product awards, new solution offerings, compelling new customer installations, or overall business growth, 2014 has arguably been one of the most exciting years in Quantum’s 30+ year history. I want to highlight a few of these – and take a minute to thank you, our customers and partners, for a great year! So let’s review…
A couple of weeks ago I was lucky enough to participate in the Cloud IT Expo in Santa Clara, getting a fresh look at what’s new in the cloud. One of the reasons for my being there was to reflect on the massive change that’s occurring around the cloud gateway market. These changes are important to note whether you are a gateway customer today – or not. So for those of you who were NOT at the Expo, here are my Ten Things to Know about What’s Happening with Cloud Gateways.
Recent very public incidents involving residents and police have sparked a conversation about the value of equipping police with on-body video surveillance – not for security monitoring, but to provide law enforcement and citizens with a single source of truth. Cambridge University recently completed a study of the police department in Rialto, California – a city of about 100,000 – which found an 89% reduction in the number of complaints against officers during a year-long trial of body cameras. Without accurate video evidence taken at the point of an incident, it becomes almost impossible to know what really happened. And in the absence of visual proof, assumptions run wild and events can spiral out of control.
Since they were first introduced over a decade ago, shared storage solutions have been the industry standard for sharing content in media workflows, enabling teams to collaborate more effectively. With shared storage, multiple users can directly access content across a SAN connection at the same speed as direct-attached storage. Shared storage has the power to stream high-resolution content to team members at rates high enough to never drop a single frame. But as these environments grow to include more storage, clients and applications, keeping everything installed, tuned and healthy becomes a real management challenge. That’s where a fully featured management platform like StorNext Connect comes in. StorNext Connect allows you to install, manage and monitor all your StorNext shared storage resources – storage, CPU, memory, network – in an intuitive graphical view.