A couple of weeks ago I was lucky enough to participate in the Cloud Computing Expo in Santa Clara, getting a fresh look at what’s new in the cloud. One of the reasons for my being there was to reflect on the massive change that’s occurring around the cloud gateway market. These changes are important to note whether you are a gateway customer today – or not. So for those of you who were NOT at the Expo, here are my “Ten Things to Know about What’s Happening with Cloud Gateways.” And because a number of the ten items are relatively meaty, my intention is to follow up with more details throughout the coming weeks.

1. Based on the money being spent in funding and acquisitions around the cloud gateway segment of the market, it’s clear that something really important is happening.

2. Two major waves of gateway innovation have already occurred in a very short time: the wave of cloud enablement and the wave of use-case customization.

3. While the first wave was about eliminating customer concerns in putting data in the cloud, in the second wave each gateway vendor specialized in particular segments and customer needs, delivering features optimized for particular use cases. This has created a diverse set of solutions that customers need to understand in order to choose the right one.

4. Gateways also needed to integrate more tightly into enterprise customer environments – this has resulted in gateways adopting characteristics from other areas of the data storage stack – converging both up and out. Gateways became ‘controllers’.

5. To move data in bulk, or from many endpoints, Gateway-Controller vendors added features to increase network and storage efficiency, but this didn’t really address the growing challenge of data management.

6. Data management is the challenge, and it’s getting harder, due to data growth and the customer need to control data – regardless of where it is stored, or who is managing its storage.

7. Data growth isn’t just about capacity – it’s about increasing endpoints, exploding numbers of objects, and the issues introduced by the rapidly expanding use of video as a communication tool.

8. The data management challenge can only be solved by a new level of data management automation – this automation (which Quantum calls policy-based tiering) must flexibly, reliably and automatically copy, migrate and maintain data where and when it is needed, without human analysis or intervention.

9. To achieve real management efficiency, policy-based tiering must be customized by individual business (not storage) policy and integrated into the business process. This is not about vendors optimizing boxes for a broadly deployed use case; it’s about custom enablement of the individual enterprise’s needs.

10. The resulting data controller will be an integrated feature in a policy-based data framework managing data that is onsite and offsite, privately managed or in a public cloud service. At this point, it won’t be just a gateway – or even a controller – anymore.
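To make items 8 and 9 a bit more concrete, here is a minimal sketch of what a policy-based tiering rule might look like in code. Everything here – the class names, the placement targets, and the thresholds – is invented for illustration and does not reflect Quantum’s (or any vendor’s) actual implementation; the point is simply that a business policy can be expressed declaratively and applied to data automatically, with no human in the loop.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class DataObject:
    name: str
    last_accessed: datetime
    size_bytes: int

@dataclass
class TieringPolicy:
    hot_window: timedelta       # recently accessed data stays on primary storage
    archive_threshold: int      # large, cold objects move to cloud archive

    def place(self, obj: DataObject, now: datetime) -> str:
        """Decide where an object should live, based only on policy."""
        age = now - obj.last_accessed
        if age <= self.hot_window:
            return "onsite-primary"
        if obj.size_bytes >= self.archive_threshold:
            return "public-cloud-archive"
        return "private-cloud-tier"

# Hypothetical policy: keep anything touched in the last 30 days onsite;
# push cold objects of 10 GiB or more out to a public cloud archive.
policy = TieringPolicy(hot_window=timedelta(days=30),
                       archive_threshold=10 * 2**30)

now = datetime(2013, 11, 20)
video = DataObject("scan-004.mov", datetime(2013, 6, 1), 42 * 2**30)
print(policy.place(video, now))  # → public-cloud-archive
```

A real data controller would, of course, evaluate policies like this continuously across millions of objects and many endpoints, and would handle the copy and migration mechanics as well as the placement decision – but the decision logic itself can stay this simple and auditable.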

I look forward to exploring several of these topics with you – and getting your feedback – over the coming weeks.

Learn More

Customer Story: See how redIT created private, customizable clouds with Quantum.
