Janae Lee
Janae Stow Lee is Senior Vice President of Strategy at Quantum. She was previously Senior Vice President of File System & Archive Products and also served as the company's Vice President of Marketing. She has over 30 years of experience in software marketing, sales, and business development, and previously served as President and Chief Executive Officer of TimeSpring Software Corporation.

The Third Era of Data (and Storage) – At the Center of the Universe

Quantum is ringing the closing bell at the New York Stock Exchange today in celebration of its 35th anniversary this year, so I’ve been thinking recently about the history of the storage business. It’s been a very interesting ride. Most extraordinary is the degree to which the role of data storage has changed: it has moved from the edge of the data processing universe to being firmly at the core, from a peripheral playing a supporting role to the belle of the IT ball. It’s an exciting time (again) to be in the data storage industry, particularly for Quantum, because of the central role our StorNext scale-out storage solutions have long played in enabling customers to organize, protect and manage their data so they can leverage it for strategic advantage. As Quantum commemorates its 35th anniversary, we look forward to helping more organizations maximize the value of their data.

A Partnership Built on Innovation – Not Just Bits on a Disk

It’s so exciting to see today’s announcement that Quantum will be taking on a larger role in selling Dot Hill storage. This is not just another storage channel deal – let me explain why. The partnership of Quantum and Dot Hill began in workflow storage. Quantum enjoys a unique position in the market in serving the needs of the most demanding workflow storage applications – from video production to surveillance, cybersecurity and research. These applications have storage needs which are quite different from traditional corporate applications. They need different solutions. Both companies have noticed that there are adjacent markets and customer needs where tighter collaboration between us—leveraging the technology of Dot Hill and the technology and channel reach of Quantum—can deliver broader value to our customers and partners. Which brings me to the announcement today: Quantum is becoming the branded face of Dot Hill storage.

Western Digital/HGST Acquires Amplidata: Object Storage is the Place to Be

As you may have heard already, there’s exciting news today in the object storage marketplace: Western Digital Corp., a leader in storage technology, announced that its HGST subsidiary is acquiring Amplidata, Quantum’s object storage technology partner. We’re happy for Amplidata and looking forward to expanded partnership opportunities with WDC and the HGST group. As a reminder, Quantum announced in 2012 that we were leveraging the performance and availability of Amplidata’s object storage technology by embedding it in our Lattus family of unique active archive solutions. Since that time, many Quantum customers have been able to increase the value of their data by extending cost-effective online access to massive volumes (PBs) of information. Let’s look at why this announcement is great news, for three major reasons.

Choosing Which Secondary Data to Migrate to the Cloud – Part 1: Disaster Recovery

As I noted in an earlier blog, customers planning to move data applications (e.g., backup and archive) to the cloud must consider five key factors in selecting and migrating data. These include ongoing data transfer volume, expected frequency of ongoing data usage, data recall performance requirements, and application integration. In the next several blogs, I’d like to illustrate the importance of these factors by showing how they impact your design and planning as you migrate a few common data use cases to the cloud. The four use cases we’ll consider are: site disaster recovery; data center off-site copy (for backup); compliance archive; and remote site primary, archive and backup, along with their ongoing management. Let’s start by looking at central site disaster recovery.
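To make the planning factors above concrete, here is a minimal sketch of how they might be weighed for a given use case. The function name, profile keys, and thresholds are all illustrative assumptions, not figures from the post or any Quantum product:

```python
# Hypothetical sketch: weighing the cloud-migration factors named above for a
# given secondary-data use case. Profile keys and thresholds are invented for
# illustration only.

def cloud_fit(profile):
    """Flag planning concerns for moving a secondary-data use case to the cloud.

    `profile` keys (illustrative): monthly_transfer_gb, recalls_per_month,
    recall_sla_hours, needs_app_integration.
    """
    concerns = []
    if profile.get("monthly_transfer_gb", 0) > 10_000:
        concerns.append("ongoing transfer volume may exceed WAN capacity or budget")
    if profile.get("recalls_per_month", 0) > 100:
        concerns.append("frequent data recalls can drive up egress costs")
    if profile.get("recall_sla_hours", 24) < 1:
        concerns.append("sub-hour recall SLA may require a local cache or gateway")
    if profile.get("needs_app_integration", False):
        concerns.append("verify the backup/archive application supports the cloud target")
    return concerns
```

A disaster-recovery profile with heavy ongoing transfer, for example, would surface the WAN-capacity concern immediately, which is exactly the kind of design question these posts walk through.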

5 Issues You Must Consider in Planning to Move Data to the Cloud

Businesses of all sizes are increasingly starting cloud-based data deployments, driven by the promises of greater agility, lower management cost and capital savings. It just makes sense. When both compute and data move together – in lockstep – to the cloud, the issues to be considered are very similar to deploying or migrating an onsite application. But when the major compute operations are staying onsite and only the data is moving offsite (such as for backup, disaster recovery or compliance archive), the deployment can be more complex. In this scenario, operational executives must consider five key issues to ensure a successful experience – including meeting customer service level agreements (SLAs) and staying within budget. Based on the use case you are planning, you must consider these 5 issues.

Closing an Exciting Year with a Big Thank You to our Customers and Partners

As we approach the close of the year, it’s natural to take a moment to reflect on the year’s events. This year, there’s more than ever to appreciate. Whether we consider product awards, new solution offerings, compelling new customer installations, or overall business growth, 2014 has arguably been one of the most exciting years in Quantum’s 30+ year history. I want to highlight a few of these – and take a minute to thank you – our customers and partners – for a great year! So let’s review….

10 Things to Know About Cloud Gateways

A couple of weeks ago I was lucky enough to participate in the Cloud IT Expo in Santa Clara, getting a fresh look at what’s new in the cloud. One of the reasons for my being there was to reflect on the massive change that’s occurring around the cloud gateway market. These changes are important to note whether you are a gateway customer today – or not. So for those of you who were NOT at the Expo, here are my Ten Things to Know about What’s Happening with Cloud Gateways.

Meeting Increasingly Difficult Backup Windows

Chances are, if you are having backup problems, your issue is caused by large unstructured data. Among that mass of unstructured data that is making backup difficult, you're likely to find an increasing amount of data that never changes. Industry analysts have observed that by the year 2020, a full 50% of the data passing over the network (and stored somewhere) is going to be video and images. So the question becomes: Are you ready to look at the composition of your unstructured data to see what data has snuck in – and to find easy-to-install archiving software that can migrate this data OUT of your active data pool and out of your backup process?
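As a first step toward that composition analysis, you can identify candidate cold data yourself before choosing an archiving product. The sketch below, a hypothetical example rather than any vendor's tool, uses one common starting criterion – modification age – to find files that have not changed in a long time:

```python
# Minimal sketch: find files unmodified for more than `days` days.
# Real archiving software applies richer policies (file type, owner, access
# time), but last-modified age is a common first cut at spotting cold data.
import os
import time

def find_cold_files(root, days=365):
    """Return paths under `root` whose last-modified time is older than `days`."""
    cutoff = time.time() - days * 86400
    cold = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    cold.append(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
    return cold
```

Running this against a project share often reveals that a large fraction of the data backed up every night has not changed in over a year – exactly the data worth migrating out of the backup process.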

Quantum Acquires the Symform Cloud Storage Platform

Today, Quantum announced the acquisition of the Symform cloud storage platform, serving over 45,000 customers. As a Quantum customer, or prospect, why should you care? For years, customers have relied on Quantum, as a specialist in scale-out storage, backup and archive, to deliver the best workflow and target storage. These target devices have been everything from cost-effective managed Scalar tape libraries, to highly scalable online Lattus object storage, to extremely efficient DXi disk deduplication appliances. In combination with our ISV partners, we’ve delivered workflow, backup and archive solutions with high quality end-to-end support. Survey after survey has shown that customers trust Quantum to deliver the most cost-effective, successful solutions for their storage use cases. But in the new world, there are increasingly cases where customers need the target device to be a cloud service.

Will Legal Ruling on APIs Put Gas in the OpenStack Storage Engine?

Historically, the storage industry, simply put, sucks at agreeing on – and deploying – open standards for anything. This makes sense when you consider that the “standardized” segments of the storage business (e.g., raw disk storage) have survived for years on razor-thin margins per disk, while the software and system value-add that has floated on top of this core architecture has been priced at anywhere from rational margins to excess profits. Nobody wants to give up those margins! Startups need those margins to innovate, while the major system and storage houses who exert a level of market control simply love the ROI. Nobody really wants an open standard – unless by some chance it is constructed to allow customers to move off the competitor’s offerings and onto “mine.” This win-lose mentality results in a lot of talk (and meetings) about open standards and products, but very little action. SNIA is a tissue paper tiger. Enter – the cloud.

How is Business Metadata like a Lego Inventory?

As I’ve said in prior posts, keeping data in native format for later use is increasingly a “must have” for many customers. This is the starting point. Stage two is, of course, turning raw data into useful information by adding knowledge or context. Before you can transition data into business information, you also must find the pieces of data that are interesting or useful. In the media and entertainment world, this is done predominantly through a concept called “metadata tagging.” Metadata tagging is a process by which every unique data element (for video, this would be a frame) is enriched with business information likely to identify its value.
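The idea of metadata tagging can be sketched very simply: a catalog mapping each asset (or frame) to descriptive business tags, which then makes raw data searchable. The catalog structure and function names below are illustrative assumptions, not the API of any real media asset manager:

```python
# Hypothetical, minimal sketch of metadata tagging for media assets.
# `catalog`, `tag_asset`, and `find_assets` are invented names for illustration.

catalog = {}  # asset id -> set of descriptive business tags

def tag_asset(asset_id, *tags):
    """Enrich a raw asset with business-context tags (who, what, where, when)."""
    catalog.setdefault(asset_id, set()).update(tags)

def find_assets(tag):
    """Return asset ids carrying a tag -- raw data becomes findable information."""
    return [asset for asset, tags in catalog.items() if tag in tags]

# Tagging turns anonymous clips into searchable business information:
tag_asset("clip_001.mov", "interview", "ceo", "2015-keynote")
tag_asset("clip_002.mov", "b-roll", "factory")
```

Once assets carry tags like these, finding "all CEO interview footage" becomes a simple lookup instead of a manual review of raw video – which is the transition from data to information the post describes.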

Data, Information and Going Native

Lately, I’ve been spending a lot of time exploring the differences between data (as in “Big Data”) and information. There’s a very interesting conceptual model that has been proposed outlining the relationship between data, information, knowledge, understanding, and wisdom (D-I-K-U-W for brevity’s sake), attributed to American organizational theorist Russell Ackoff. For a nice introduction to this model, you can read the article “Data, Information, Knowledge, and Wisdom,” by Gene Bellinger, Durval Castro, and Anthony Mills.
