Video Transport: delivering multiviews in cloud-based productions for OPPS, USA

A man in a studio checking video on a few screens

Recently, OPPS hasn’t been doing many off-premises multi-camera productions, the subject of our earlier story. Instead, they’ve been using Video Transport in cloud-only productions, mostly for corporate meetings, where its main job is delivering multiviews to the production teams.

It's really great for delivering multiviews to the production teams with low latency. It is probably the lowest-latency distribution method that we’ve found so far.

What exactly are you using Video Transport for?

We mainly use it to deliver our multiviewers and we really like it. We have explored other options, but we find that they are more complex to configure and use than Video Transport and have no functional advantages. We believe that simpler is better and prefer Video Transport for that reason.

Who are you delivering the feeds to?

Those working within the studio do not need it, because they already see everything in their systems; however, clients, producers, show callers, and remote teleprompter operators require it. We typically have at least four people connected to the multiviewer for any given production. We also use Video Transport (Publisher and Receiver) to get multiview feeds from AWS back to our office. This allows us to monitor what's going on during the day by simply selecting an NDI source on the network. For instance, if there are four productions happening simultaneously, I will use Video Transport to get the feeds back to our office network, or even onto my NDI network at home, just to keep an eye on things.

And your product has low enough latency for that to work. We have people on Unity Intercom while they're watching the Medialooks feed, and that's really all they need to participate in the production remotely.
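
As a side note, "selecting an NDI source on the network" is what standard NDI discovery provides. The sketch below is a minimal, hypothetical example against the public NDI SDK (Processing.NDI.Lib.h), not part of OPPS's or Medialooks' code; it simply lists the NDI sources visible on the local network, which is how a Video Transport Receiver output would appear for monitoring.

```cpp
// Minimal sketch: enumerate the NDI sources visible on the local network
// using the public NDI SDK. Illustrative only.
#include <cstdint>
#include <cstdio>
#include <Processing.NDI.Lib.h>

int main() {
    if (!NDIlib_initialize()) return 1;              // initialize the NDI runtime

    NDIlib_find_instance_t finder = NDIlib_find_create_v2(nullptr);
    if (!finder) return 1;

    // Give mDNS discovery a few seconds to find sources on the network.
    NDIlib_find_wait_for_sources(finder, 5000);

    uint32_t count = 0;
    const NDIlib_source_t* sources = NDIlib_find_get_current_sources(finder, &count);
    for (uint32_t i = 0; i < count; ++i)
        std::printf("Found NDI source: %s\n", sources[i].p_ndi_name);

    NDIlib_find_destroy(finder);
    NDIlib_destroy();
    return 0;
}
```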

How often are you using the product?

Every cloud-based production uses it. I would say we use it probably three days a week. And it is critical, too. Everything else could work, but if we didn't have this, we'd have to send our production feeds through Zoom or something with a worse user experience.

We have had an instance when a director became ill during an onsite production, and we used Video Transport to send them a web link to the multiview, and then connected our Dante intercom to Unity to enable them to call a real show from their laptop while sick in bed.

A man in a studio with multiple monitors and mixers

What is a typical kind of show produced this way?

A typical show would be like the one we do for a global medical company. They work with us multiple times a month, and they have a number of presenters who join from all over the world via Microsoft Teams. We extract these people from Teams, play their videos, put their PowerPoints together, polish them up, and then publish them out to an RTMP destination for their employees to watch. This type of show typically has a show caller, a technical director who runs a vMix system, a talent manager who also runs a vMix system to control the feeds to and from talent, and a gatekeeper who runs a vMix system that plays the pre-show rolls, takes the technical director's feed, and plays the post-show rolls. We also have a graphics operator who runs the PowerPoint systems, and a producer who watches the multiview feed and calls the show. Everyone is on Unity, everyone is at home, and all the systems are cloud-hosted.

So, it’s a video conference, right?

Yes, correct. It's usually about six people that are presenting corporate content that gets pushed out to a streaming platform with maybe 5,000 viewers.

What have you achieved with Video Transport?

It made cloud production realistic to pull off without it feeling like a bag of workarounds. The product is now a critical part of how we work. It's now less of a network engineering challenge and more of a production challenge, which is great when you have production people. So that's really what it has enabled, and it's also a really useful tool to have in your pocket for situations like the one I mentioned where the show director got sick.

The other thing we've used it for on location is when we've had a big general session in a large ballroom and the production office was on the second floor of the hotel. We've used Video Transport to get camera views and other content over the Internet, but basically over the LAN, to the production office. This was great because it was not possible to send an NDI feed up there due to the network topology.

This was more about the logistics of the production process, right?

Yes, this eases the administration of events, since most of the signals passed around a physical event site are related to back-end processes, such as multiviewers and speaker timers, which do not require a 10 or 100 Gb network trunk. I sometimes have conversations about SMPTE 2110 with people who are obsessed with uncompressed video. It has its place; however, by sheer count, there are far more signals for which NDI is totally sufficient.

What are your thoughts on 2110?

It has its place. I think that if you can maintain uncompressed video from start to finish, it makes sense, especially if you're doing keying. If you're going to produce the Super Bowl, I think it makes sense. If you're doing a corporate meeting that's going to be streamed in 720p and not even be watched full screen, I don't think we need to be using 2110.
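
As a rough, illustrative comparison (approximate figures, not from the interview): an uncompressed ST 2110-20 feed at 1080p60 with 10-bit 4:2:2 sampling works out to about 1920 × 1080 × 60 fps × 20 bits/pixel ≈ 2.5 Gb/s per feed before protocol overhead, while a full-bandwidth NDI stream of the same raster typically runs on the order of 125–150 Mb/s, which is why monitoring-class signals such as multiviewers and speaker timers rarely need a 10 or 100 Gb trunk.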

What do you like about Video Transport?

We like that it's easy to deploy. It's basically a desktop application that's easy to install. We appreciate the low latency. The auto-negotiate feature is extremely handy because we don't need to manage port forwarding or public IPs. With an SRT feed, for example, there are no TURN or STUN servers, so you would need port forwarding at one end or the other. This is a huge thing, especially when you have two remote sites that are not under your administration.

The ability to use WebRTC easily, without any third-party licensing, to deliver low-latency feeds between sites is really good. And the web preview link is what we love the most. We also really like that you added support for ASIO audio drivers at some point – embedding audio in NDI feeds is something that doesn’t often have a reasonable solution. And putting a GUI on SRT feeds, in general, is a good idea.
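
For context on the port-forwarding point above, here is a minimal, hypothetical sketch using the open-source libsrt C API (not OPPS's setup, and not how Video Transport works internally): a bare SRT link needs one side to listen on a port the other side can actually reach, which is the administrative burden that built-in connection negotiation removes.

```cpp
// Minimal sketch: a plain SRT "listener" must bind a UDP port that is
// reachable from the caller -- in practice, port forwarding or a public IP --
// because SRT itself has no TURN/STUN-style NAT traversal. Illustrative only.
#include <netinet/in.h>
#include <srt/srt.h>

int main() {
    srt_startup();

    SRTSOCKET listener = srt_create_socket();

    sockaddr_in sa{};
    sa.sin_family = AF_INET;
    sa.sin_port = htons(9000);           // this UDP port must be open/forwarded
    sa.sin_addr.s_addr = INADDR_ANY;

    srt_bind(listener, reinterpret_cast<sockaddr*>(&sa), sizeof sa);
    srt_listen(listener, 1);

    // The remote caller connects to <public-ip>:9000; if the port is not
    // reachable (NAT, firewall), the handshake simply never completes.
    sockaddr_storage peer{};
    int peer_len = sizeof peer;
    SRTSOCKET conn = srt_accept(listener, reinterpret_cast<sockaddr*>(&peer), &peer_len);

    // ... srt_recvmsg(conn, buffer, len) would read the incoming packets here ...

    srt_close(conn);
    srt_close(listener);
    srt_cleanup();
    return 0;
}
```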
