Spring Cleaning: Multiscreen and Server Space
Understanding why multiscreen is so server-intensive requires a quick overview of how video moves among your iPad, smartphone, laptop, TV set, etc.
1. First, each video asset (movie, TV show, whatever) has to be chopped into little pieces, usually 2 to 10 seconds long, to allow for buffering on receiving devices.
2. Then each piece has to be replicated into different profiles that account for two things: the optimal resolution for each device and the varying levels of network throughput.
For example, an iPhone screen has different resolution requirements than a home theater, and public WiFi at a park provides different throughput than a residential cable connection.
3. Each of those pieces has to be replicated yet again for each different adaptive bitrate streaming format. The three main ones today are Apple HLS, Microsoft Smooth Streaming and Adobe HDS. And there'll soon be a fourth, MPEG-DASH, once it's ratified. More about DASH in a minute.
4. Everything must be encrypted.
5. Finally, all these multitudinous little pieces have to be stored on servers so they can be sent out instantly.
The numbers for a 5-minute YouTube video aren't too daunting, but when you're dealing with tens of thousands of TV episodes and movies, the servers fill up fast.
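The multiplication in the steps above can be sketched as back-of-the-envelope arithmetic. The segment length, profile count, and format count below are illustrative assumptions, not figures from any vendor:

```python
# Rough segment count for adaptive bitrate storage.
# Parameters are illustrative assumptions: 6-second chunks,
# 6 resolution/bitrate profiles, 4 ABR formats (HLS, Smooth, HDS, DASH).

def segment_count(duration_s, segment_s=6, profiles=6, formats=4):
    """Pieces stored for one asset: chunking x profiles x formats."""
    chunks = -(-duration_s // segment_s)  # ceiling division
    return chunks * profiles * formats

clip = segment_count(5 * 60)     # 5-minute clip: 50 chunks -> 1,200 pieces
movie = segment_count(2 * 3600)  # 2-hour movie: 1,200 chunks -> 28,800 pieces
print(clip, movie)
```

A single short clip stays manageable, but multiply the movie figure by a library of tens of thousands of titles and the storage pressure is obvious.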
MPEG-DASH is, among other things, an effort to reduce the total number of individual pieces by dropping from four streaming formats to just one. Some vendors, such as Envivio and RGB Networks, have already brought out DASH-compliant transcoding gear. The big trick is getting all receiving devices to comply with DASH. They don't now -- and there's no guarantee that they will in the future. But DASH is still a move in the right direction.
Another way to get the storage requirements down is to do the transcoding, etc. on an as-needed basis rather than storing multiple versions on a server. RGB just brought out an upgrade to its TransAct Packager to do the adaptive bitrate formatting (RGB calls it "packaging") on the fly when a particular asset is requested. It cuts the storage requirement to a quarter of what's needed now, and also gets around the limited adoption of MPEG-DASH.
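RGB's quarter figure lines up with simple arithmetic: store one set of transcoded profiles and wrap it into each of the four ABR formats only when a viewer asks for it. The counts below are illustrative assumptions, not RGB's own numbers:

```python
# Storage comparison: pre-packaged formats vs. on-the-fly packaging.
# Counts are illustrative assumptions (6-second chunks, 6 profiles, 4 formats).

def stored_segments(chunks, profiles, formats_stored):
    """Total pieces kept on the server for one asset."""
    return chunks * profiles * formats_stored

chunks, profiles = 1200, 6  # a 2-hour movie in 6-second chunks

pre_packaged = stored_segments(chunks, profiles, 4)  # HLS + Smooth + HDS + DASH all stored
on_the_fly = stored_segments(chunks, profiles, 1)    # one stored set, packaged per request

print(on_the_fly / pre_packaged)  # 0.25 -> a quarter of the storage
```

The transcoded profiles still have to live on disk, which is why packaging on the fly trims storage rather than eliminating it; collapsing the profiles too is the fully on-demand step discussed next.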
If packaging can be done on the fly, why not do all the steps on the fly? That should get the storage requirement down to just one version of each asset. The technology exists now, said Ramin Farassat, RGB's VP of product marketing and business development, in response to e-mailed questions.
"We do have a device that is capable of packaging/chunking/encryption and transcoding together," he said. "However, at this point, it is not cost-effective to transcode each requested stream on a per-request or per-subscriber basis."
The cost can be expected to come down, though.
"One would imagine that with processing power continuing to increase by Moore's Law," Farassat said, "that we would get to a point where we would be capable of transcoding so many concurrent streams in a device that the cost would be reduced and this capability would be cost-effective."
But while processing power is increasing, so are video quality requirements on the receiving end.
"A few years ago, we were happy to see stamp-size video streams on our PCs, but now we're receiving 720p HD video streams and soon will be moving to full 1080p HD streams," Farassat said. "The new iPad is now capable of supporting resolutions that are much higher than HD, and it is foreseeable that we would be moving to above HD resolutions in the future to provide even higher quality video to these type of devices."
Whether processing power can keep up with increased video demands remains to be seen, but it'll be interesting to watch. In the meantime, cable's video servers will continue to look more like hoarders' closets than something at Martha Stewart's house.
Ron Hendrickson is BTR's managing editor. Reach him at email@example.com.