Assessing the AV content boom

Virtual production and corporate streaming events were among the topics at the recent AVIXA UpStream event in London, which explored multiple areas in which AV and broadcast technologies are now converging. Installation reports

On June 26, as the latest political crisis unfolded a few hundred metres away in Westminster, a rather calmer – albeit no less eventful – brand of conversation was taking place at the QEII Centre. Reflecting the greatly expanded content production and streaming skills now expected of AV professionals, AVIXA’s UpStream conference aimed to guide visitors through some of the key challenges of capture, storage, streaming and distribution.

Introduced and hosted by futurist and author Amelia Kallman, known in this magazine mainly for her contributions at ISE, as seen in the Installation-produced ISE Daily, the programme offered a diverse mix of case studies, technology deep-dives and panel sessions – one of the most striking of the last-named being ‘Virtual and Beyond Virtual Production’. Moderated by Deep Blue Sapphire Technology co-founder Ciarán Doran, the session featured contributions from Chaos Inc founder Stacia Pfeiffer, Jacobs Massey lead AV technician – production division Keenan Bailey, and Imaginary Pictures technical director Kevin Zemrowsky.

As Installation readers will be aware, virtual production has undergone a huge expansion during the past 3-4 years (see pages 28-35 for more). But according to Zemrowsky, it’s now moving into a distinct new phase: “It’s worth noting that there has been a massive bubble because of Covid and the investment that took place before it. [In pro AV] there was a space where people were doing things because they had money assigned to it, they had received government loans, or they couldn’t do things they would normally do as a business [due to pandemic conditions]. There were a lot of new services and ideas about content creation, but in some cases those things didn’t necessarily find a market. That moment [of massive expansion] has now passed.”

Hence we are now witnessing the emergence of a more mature market in which each project could be shaped around the creation of a single key asset that is adaptable to multiple scenarios. Zemrowsky suggested we could see a focus on the creation of “core pieces of intellectual property that are regarded as the start of a process that goes through the whole value chain and into all these different spaces including VR, AR, etc.”

From Bailey’s perspective, the convergence of AV and IT requires more attention in VP’s next stage of development. “There is a need to work with the various departments and sometimes they are not all integrated,” he says, indicating that the level of progress varies across the sector. “A lot of money has been spent on building some of the studios, but they are not necessarily being used. [Meanwhile] third-party production companies are often being brought in to facilitate events.”

Meanwhile, Stacia Pfeiffer encouraged the audience to see VP as part of a continuum that clearly has a very long way to run yet. “Once we talk about getting into immersive storytelling, you can think about it as travelling from 2D to 5D,” she said. “So with 2D you have film and a very traditional format; with 3D you have animation; and from there on you get into virtual production or production with a green screen that allows you to have digital content overlaid, sometimes in real-time, which means live in the broadcast sense. Now to add a further layer, you could be looking at the adoption of haptics, which is beginning to be used” – haptics being a catch-all for a growing group of technologies that stimulate the senses of touch and motion in remote operation or computer simulation.

Responding to Doran’s suggestion that “going beyond VP will involve not just the environment in which the content is created, but the environment in which it is received,” Pfeiffer was strongly affirmative. “Absolutely. We are seeing big changes in the way content is consumed; you could see the move from black & white to colour television as a parallel.” And with Generations Y and Z, in particular, it’s the way that “people will begin to expect to consume content in the future.”

VP and related developments are undoubtedly shaping a potentially highly complex future AV landscape in which companies will have to be more forward-looking and fleet-footed than ever before. So it was perhaps something of a relief to consider a slightly more straightforward narrative in a panel entitled ‘Understanding NDI’, where IABM head of knowledge Lorenzo Zanni spoke to NDI head of strategy Miguel Coutinho and Digibox managing director Marc Risby.

The session provided a wealth of insight into the remarkable ascendancy of the NDI video connectivity technology since it was first revealed by NewTek at the IBC Show in 2015. Now a staple of content infrastructures in broadcast and AV, NDI was originally conceived as a means of connecting NewTek’s TriCaster video production systems to PTZ cameras. “But then at some point it was noted how great it would be if anything connected with NDI – not only our products, but those from third parties as well,” said Coutinho.

The result has been the rapid development of a product ecosystem that now includes hundreds of hardware manufacturers and software applications. Invited to consider the primary reasons for its success, Coutinho stated: “It is really easy to use, so [for example] you can have a group of cameras on the same network using native NDI, open up NDI-enabled software on the computer, and the cameras will pop up; there is nothing else you need to do. And that’s a very key reason why it’s gone beyond broadcast into AV.”

The ability to employ standard network switches has been another major factor in its favour, according to Risby: “The fact that you can use standard off-the-shelf 1Gb switches with NDI and have it work fine has helped to bring the costs down and make it much more adaptable to [different applications].”

Having one primary originator of NDI also seems to have accelerated its evolution. Risby draws a comparison with SMPTE ST 2110, which “took six years to come out as you had about 25 different companies making the standard. Whereas NDI came out in 2016 and is now in Version 5. It was evolved by one group with one primary objective, and offered in free and paid versions.”

With Coutinho noting that Version 6 is due to debut in Q4 this year, NDI also continues to reach new – and sometimes rather surprising – end-users. “Corporate and offices are the next ‘big one’ after more traditional pro-AV and broadcasting, and security as well,” he says, before adding: “We are also seeing the start of adoption in the consumer segment. If someone had said in 2015 that NDI would be used for baby monitors in 2023, then I don’t think anyone would have believed them!”

The day also featured a panel entitled ‘Streaming Sessions: Platforms, Practicalities & Pitfalls’ that was moderated by regular Installation contributor David Davies, and featured Florical Systems SVP/GM Shawn Maynard, AVTEAMUK MD Matthew Thompson and AVIXA VP of content delivery Sam Minish.

There was complete consensus around the notion that the pandemic had raised expectations of streaming quality in core AV applications such as education and corporate communications. More than ever, noted Thompson, “there is an expectation of broadcast quality, and that can be seen in the increasing use of systems in AV” that are prevalent in more traditional broadcast applications.

Many of these developments are encapsulated in the recent trajectory of AVIXA’s own streaming service, which started from scratch during the pandemic “in response to a very specific need for information among the user-base,” recalled Minish. From an initial set-up based around PTZ cameras, the operation has since scaled up to encompass manual cameras and a more refined sense of what constitutes resonant content. “People don’t necessarily have the time to watch long streams, so we have developed [more condensed formats] focused on specific subjects.”

In terms of streaming specifics, Thompson noted that it tends to be “latency and budget” that determine which of the many available platforms are employed for a given project – although the appeal of some providers, such as YouTube and Vimeo, now cuts across project types and applications. Detailed pre-event briefings and schematics are hugely beneficial to the successful delivery of an event, while AI-enabled captioning and the incorporation of more virtual elements are predicted future trends.

There was also a reminder from Maynard that audio quality is now more prized than ever before. “If you ask people about it, there is definitely more tolerance of a [below-par] video experience than an audio one,” he said, noting that inadequate audio can really determine “whether or not an event is successful”.

With other sessions including a detailed look at making streaming more sustainable, a discussion of multi-location production workflows, and a case study from Ross Video about ways to drive engagement and collaboration across the corporate environment, AVIXA UpStream provided a well-rounded insight into the ever-growing number of areas and applications in which different professional technologies are converging.