If you’re one of the many who have taken a “wait and see” attitude to virtual reality (VR) content creation, now may be the time to jump on board. Advocates who say VR won’t go the way of 3D TV were given more reasons to cheer last week: NBC announced it will provide 85 hours of 360° video from the 2016 Olympics in Rio next month.

In a VR experience there is no fixed screen for the viewer, so the viewing experience is more visceral and engaging. Viewers can look around them, just as a person or a character in the content can. Imagine how VR could enhance the thrill of Olympic sports or video gaming. Consumer interest in VR is on the rise, with content for travel, education, and live events ranking higher than gaming.

Creating an immersive, viewer-driven VR or 360° experience may be one of the most profound changes in the 100-plus years of moving-image storytelling. Editors and artists no longer have the luxury of transporting viewers between scenes with a cut or dissolve, or of using wide shots and close-ups to tell a story. Instead, editors work with a 360° sphere (or its flattened equirectangular rectangle). Transitions are initiated by viewers, which means interactivity mechanisms must be authored into the VR streams. And massive amounts of blindingly fast storage are required.

Required: Concurrent Access to Massive Files

If your only experience with VR has been 360° video from cardboard iPhone or Android head-mounted displays (HMDs), you probably found the experience engaging, but may have been disappointed with the resolution. Mobile phone-based 360° video is delivered at 2K or 4K, but that figure is the total resolution of the equirectangular image before it is morphed into a VR sphere, so it is not as high quality as it sounds. Because mobile phones have limited power to refresh pixels in a VR environment, the field of view (FOV) actually visible at any moment is limited to roughly 640×320 (from a 2K source) or about 1.3K (from a 4K source). The screen is also much closer to the eye, so pixels are more visible.
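To make the arithmetic concrete, here is a rough sketch of how little of a 2K or 4K equirectangular frame the headset actually displays at once. The 120° × 60° FOV angles are illustrative assumptions chosen to reproduce the figures above, not the specification of any particular headset.

```python
# Estimate how much of an equirectangular frame falls inside the headset's
# field of view at any moment. The FOV angles are illustrative assumptions.

def visible_resolution(equirect_w, equirect_h, fov_h_deg=120, fov_v_deg=60):
    """The equirectangular image spans 360 x 180 degrees; the visible slice
    is simply the fraction of that span covered by the FOV."""
    return (round(equirect_w * fov_h_deg / 360),
            round(equirect_h * fov_v_deg / 180))

for label, (w, h) in {"2K": (1920, 960), "4K": (3840, 1920)}.items():
    print(label, visible_resolution(w, h))
# 2K -> (640, 320)   4K -> (1280, 640), i.e. the 640x320 and ~1.3K
# figures quoted above.
```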

More powerful, PC-driven headsets like the Oculus Rift can display 4K FOV 360° video; that’s 4K in every direction the HMD looks. This is where VR is going, and where the storage requirements begin to climb. Most 2K and 4K VR 360° video work requires at least six cameras, and 4K FOV and stereoscopic 3D rigs can require 10 to 14. On top of that, 60 frames per second (fps) is the minimum requirement for video, and 90 fps or higher is recommended for gaming. Lower frame rates can cause HMDs to drop frames or, even worse, cause viewers to experience motion sickness.

A 4K FOV equirectangular image is a whopping 12,288 × 6,144 pixels. Even with very conservative ProRes 4:2:2 compression, the required data rate is still upwards of 3GB/s for each workstation.
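As a back-of-the-envelope check, a sketch of that data rate is below. The 10-bit 4:2:2 pixel size is standard, but the 5:1 compression ratio is an assumption for illustration, not a ProRes specification.

```python
# Back-of-the-envelope data rate for a 4K-FOV equirectangular stream.
# The compression ratio is an illustrative assumption.

WIDTH, HEIGHT = 12_288, 6_144   # 4K-FOV equirectangular frame
FPS = 60                        # minimum frame rate for VR video
BITS_PER_PIXEL = 20             # 10-bit 4:2:2 (10 for Y, 10 shared by Cb/Cr)
COMPRESSION_RATIO = 5           # assumed conservative intra-frame compression

uncompressed_Bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 8
compressed_Bps = uncompressed_Bps / COMPRESSION_RATIO

print(f"Uncompressed: {uncompressed_Bps / 1e9:.1f} GB/s")   # ~11.3 GB/s
print(f"Compressed:   {compressed_Bps / 1e9:.1f} GB/s")     # ~2.3 GB/s per stream
# Stereoscopic delivery or a second stream per workstation quickly pushes
# the requirement past 3GB/s.
```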

But wait, there’s more…

No Latency Please

In traditional post, a shared storage environment is designed for the possibility that the same file will need to be accessed by multiple workstations at the same time. But in a VR workflow, there is a much higher degree of simultaneous file access — and for much longer periods. Multiple artists and editors collaborate to stitch together footage from multiple cameras. Seams must be hidden, unwanted crew or equipment removed, and shots must be stabilized and color graded. That theoretical 3GB/s must be available to multiple workstations.

This is the kind of collaborative, high-bandwidth, low-latency production that a Fibre Channel storage area network (SAN) was built for, particularly one powered by StorNext® (see the “Metadata Management” sidebar). In a network attached storage (NAS) file server environment, video frames must first be encapsulated in IP. Even using 40GbE, traffic at the switch would make latency, and the disrupted workflows that come with it, very likely.

A SAN is a purpose-built shared file system; it is not a file server like a NAS. A SAN volume appears to a workstation as an extension of its own file system, so the workstation can write quickly at the block level of the SAN storage, forgoing the latency of the IP-based file-level operations a NAS requires. Today, Xcellis™ workflow storage powered by StorNext is delivering 3GB/s or higher to multiple workstations with ease.

VR at SIGGRAPH

Felix & Paul Studios in Montreal has created some of the most stunning VR content to date. Their Quantum Xcellis workflow storage provides unfettered Fibre Channel access to massive VR files for multiple workstations. Quantum is featuring Felix & Paul Studios’ VR programming at our booth at SIGGRAPH 2016, July 24–28 in Anaheim, California. Experience this phenomenal VR content and learn how it was produced.
