Stadium networking – video and WiFi considerations
4 November 2015
The first part of this feature looked at some of the benefits of converged AV and IT networks within stadiums. Here, David Davies looks at the infrastructure demands generated by these venues’ video and WiFi requirements.
By general agreement, it is the video requirements of a stadium that are currently presenting the most significant challenges to networking infrastructure. Exterity is one of the leading names in this area and has implemented IP video solutions in a variety of major venues, including Arsenal’s Emirates Stadium and Olympique de Marseille’s Stade Vélodrome. Among its client base, CEO Colin Farquhar confirms the overwhelming trend towards convergence, noting: “We are very used to seeing our customers almost using a single network infrastructure to deliver all their content – and therefore not making a distinction between IP video systems, telephony systems, etc.”
With such converged networks, it may be that “virtually segmenting the network infrastructure is less of a requirement” – although packet prioritisation will be needed for priority communications and messaging. “It really comes down to good practice on the network side and simple quality of service measures,” says Farquhar. “Certain data types can be prioritised and forwarded through the network so they get there first. For instance, two-way real-time telephony is clearly something that needs to be prioritised ahead of some other data types.”
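In practice, the prioritisation Farquhar describes is usually implemented by marking traffic with DSCP values that the network switches then map to forwarding queues. The sketch below shows one way an application could mark its own traffic; the specific DSCP assignments follow common convention (EF for real-time voice, AF41 for video) but any real deployment would use site-specific values agreed with the network team.

```python
# Sketch: marking traffic classes with DSCP so QoS-aware switches can
# prioritise them. DSCP values here follow common convention and are
# illustrative, not taken from any specific stadium deployment.
import socket

# DSCP occupies the top six bits of the IP TOS byte, hence the shift.
DSCP_EF = 46 << 2      # Expedited Forwarding: two-way real-time telephony
DSCP_AF41 = 34 << 2    # Assured Forwarding 4.1: IP video streams
DSCP_BE = 0            # Best effort: bulk data, everything else

def make_marked_socket(tos: int) -> socket.socket:
    """Create a UDP socket whose outgoing packets carry the given TOS/DSCP marking."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, tos)
    return sock

voice_sock = make_marked_socket(DSCP_EF)    # forwarded ahead of video
video_sock = make_marked_socket(DSCP_AF41)  # forwarded ahead of best effort
```

The marking on its own does nothing: the switches along the path must be configured to honour these values, which is the “good practice on the network side” Farquhar refers to.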
Despite Exterity’s burgeoning interest in stadium video, Farquhar acknowledges that video content itself is not necessarily among the chief network priorities. “Important though it is, the reality is that for the general operation of a stadium a bit of delay or a flicker in the corner of the screen is not as relevant as the telephone call to say that something is happening somewhere in the stadium.”
ZeeVee is another company that is well placed to chart the evolution of stadium video. The US-based provider has lately been making industry waves with its ZyPer4K system, which allows the distribution of uncompressed control, audio and video data, with resolutions up to and including 4K supported. Its reliance on 10Gb networks restricts it to either fibre or Cat6a infrastructure.
“Most, but not all, existing venues are heavily wired with either coax or Cat5 twisted pair solutions – neither of which can be used for 10Gb. So we are finding that only those venues recently constructed or renovated have made the move to either fibre or Cat6a infrastructure,” says Danny Barr, vice president of new product business development at ZeeVee.
The longer distance requirements of stadiums and reduced material costs (“on a par with shielded twisted pair,” notes Barr) mean that fibre is particularly prevalent in new or upgraded venues. “New buildings usually put fibre everywhere, since it’s so much more convenient and cheaper than copper,” says Marc Brunke, founder and general manager of optical networking technology specialist Optocore. “Refurbishments are likely to get even more fibre, since it is much easier to put fibre into an existing wall than copper.”
In the case of ZeeVee, this shift towards fibre translates to a rapidly growing market – particularly in venues used for sports, where there is a greater requirement for low-latency video distribution both to large-scale displays and to screens located in hospitality boxes and suites.
Barr explains: “Our ZyPer4K distribution system lives on a 10Gb network and can quite easily co-exist on the same network switches and cabling as other core data devices. By utilising the total available bandwidth of any link (10Gb full duplex) and the extremely high bandwidth available on 10Gb network switch backplanes (a 64-port 10Gb switch supports 1.28Tbps on the backplane with ultra-low latency), we are able to provide 1Gb Ethernet connections at every location where there is a ZyPer4K encoder or decoder. Customers are using this ‘free’ 1Gb connection for things like core IT functions, VoIP telephony implementations, security cameras, video streaming direct to displays, WiFi hot spots, and even separate audio over IP/PA deployments.”
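Barr’s backplane figure is straightforward to check: a 64-port 10Gb switch at full duplex carries 64 × 10Gb in each direction simultaneously.

```python
# Verifying the switch-backplane arithmetic quoted above: full duplex
# counts both the transmit and receive direction of every port.
ports = 64
link_gbps = 10
backplane_tbps = ports * link_gbps * 2 / 1000  # ×2 for full duplex
print(backplane_tbps)  # 1.28 Tbps, matching the figure Barr quotes
```

It is this headroom that lets the encoders and decoders share the switch fabric with the “free” 1Gb Ethernet connections Barr describes, rather than demanding a dedicated video network.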
The WiFi consideration cannot be overstated – Barr cites an “explosion of user needs” around wireless access, and he’s certainly not alone. Alongside substantial expectations of public display video within the venue, many fans also wish to simultaneously enjoy video content via their smartphones. This is particularly relevant to sports, where there is a desire to access material such as instant replays and detailed statistical information as fans watch the action unfold on the pitch or track in front of them.
Farquhar confirms: “The big discussion taking place just now on the video content side is around the capability of the wireless [systems]. Most venues have a reasonable wired infrastructure – and certainly that is the case with regard to the new-builds – but now they want to be able to provide visitors with a range of services and relevant content over the wireless infrastructure.”
Investment in high-density wireless networks that may involve hundreds of access points in a large stadium is therefore rising up the priority list: “If you think that 10,000 people might want to watch a replay of a match then you could be looking at 10,000 x 10Gb of content being delivered from the server infrastructure to all those devices. That is a significant burden on the video distribution infrastructure and the wireless system.”
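The scale of that burden can be sized roughly. The sketch below uses illustrative assumptions (a 5 Mbps HD replay stream per fan, and roughly 200 Mbps of usable throughput per high-density access point); none of these figures come from the article, but they show why a large stadium ends up with hundreds of access points.

```python
# Rough sizing of the wireless burden described above. All input
# figures are assumptions for illustration, not measured values.

def wifi_sizing(viewers: int, stream_mbps: int, ap_usable_mbps: int):
    """Return (aggregate demand in Gbps, access points needed)."""
    aggregate_gbps = viewers * stream_mbps / 1000
    access_points = -(-viewers * stream_mbps // ap_usable_mbps)  # ceiling division
    return aggregate_gbps, int(access_points)

# 10,000 fans each pulling a 5 Mbps replay stream, served by APs with
# ~200 Mbps of real-world usable throughput apiece (assumed values).
agg, aps = wifi_sizing(10_000, 5, 200)
print(f"{agg:.0f} Gbps aggregate across {aps} access points")
# 50 Gbps over 250 APs – hundreds of access points, as the article notes
```

The server and distribution side must sustain the same aggregate, which is why the wired core and the wireless edge have to be planned together rather than as separate projects.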