
World’s most complete AV experience? Inside Glasto’s Dragonfly

The lead feature from Installation's latest edition examines Arcadia's spectacular Dragonfly at Glastonbury 2025, which fused light, sound, projection, lasers and pyrotechnics into a single 'living, breathing' structure – arguably the ultimate live AV experience, and the winner of Festival Production of the Year at the UK Festival Awards 2025

At this year’s Glastonbury Festival, Arcadia’s Dragonfly stage, the focal point of the Arcadia area, set the standard for immersive AV experiences, fusing cutting-edge lighting, sound, projection, lasers and special effects. The Dragonfly debuted in 2024 as the centrepiece of Arcadia’s ever-evolving vision, following the previous Spider installation, which captivated festivalgoers from 2010 in an early iteration, and then from 2013-2018 and 2022-2023 in its full form.

In 2025, Dragonfly’s technical capabilities were significantly upgraded, reinforcing its position as one of the most comprehensive AV installations in the world. This living, breathing structure – reimagined from a Sea King military helicopter and reborn as a 26m-long insect of light and sound – boasts a projection-mapped skin, eyes formed of transparent LED, wings of laser light, and a heartbeat supplied by a 360-degree L-Acoustics system pulsing through the crowd. Dragonfly is set to remain at Glastonbury through 2027 and beyond, continually evolving as a symbol of innovation in live performance technology.

Pic: Matt Eachus (The Manc Photographer)

IMMERSIVE AUDIO
Inside the 80m-diameter, 360-degree Arcadia soundfield, the production team’s biggest audio challenge was coherence. “With a ring this big, you can lose focus in the middle,” explains Bryan McLean of Dirt Sounds, the system designer. “So we designed two soundfield systems: an inner and an outer ring that dovetail perfectly. The idea is that wherever you stand, it feels centred.”

The audio design centred around L-Acoustics L2 and L2D modules on ten towers, supported by KS28 and KS21 subwoofers beneath the structure. McLean continues: “It’s one of the few 360-degree rigs where you can literally walk around and the audio mix never collapses.”
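McLean’s dovetailing of inner and outer rings relies on a standard technique: electronically delaying the nearer ring so both wavefronts arrive together at the listener. A minimal sketch of the arithmetic follows; the ring radii and the 343 m/s speed of sound are illustrative assumptions, not Dirt Sounds’ actual figures.

```python
# Sketch: time-aligning an inner and an outer speaker ring so their
# wavefronts arrive together at the centre of the field.
# Radii and the 343 m/s speed of sound (air, ~20 degrees C) are
# illustrative assumptions, not the production's real numbers.

SPEED_OF_SOUND_MS = 343.0  # metres per second


def travel_time_ms(distance_m: float) -> float:
    """Acoustic propagation delay in milliseconds over a distance."""
    return distance_m / SPEED_OF_SOUND_MS * 1000.0


def inner_ring_delay_ms(outer_radius_m: float, inner_radius_m: float) -> float:
    """Electronic delay applied to the inner ring so its sound reaches
    the centre at the same moment as the outer ring's."""
    return travel_time_ms(outer_radius_m) - travel_time_ms(inner_radius_m)


if __name__ == "__main__":
    # Assumed 80m-diameter field: outer ring at 40m, inner ring at 20m.
    print(f"Inner ring delay: {inner_ring_delay_ms(40.0, 20.0):.1f} ms")
```

In practice a designer would align at many listener positions, not just the centre, but the principle – delay the nearer source by the difference in travel times – is the same.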

A major leap forward came with the integration of Sennheiser’s Spectera digital wireless system, which was provided by AF Live. “The setup was seamless,” explains FOH engineer Rob Cook. “We had dropout-free coverage across the field — it just worked,” he continues, emphasising the benefits of Spectera in an environment that demands reliability: “The system held up even with 60,000 people in the audience. We had no RF interference and perfect clarity.”

Sennheiser engineers later noted that the Dragonfly deployment had been a “perfect stress-test” for Spectera’s wideband RF design, especially in one of the most crowded RF environments on Earth – Glastonbury Festival.

WINGS OF LIGHT
If 2024’s debut was about flight, 2025 was about wings. “Last year we used 30-watt lasers,” explains Abby Shum, laser programmer with Aardvark FX. “This year we doubled down – 4 × 100- and 4 × 60-watt IP-rated lasers formed the Dragonfly’s wings, plus 6 × 45-watt lasers created a helicopter blade ring effect above the arena. The first night we powered them, you could see the beams for 10 miles!”

The lasers, controlled via Liberation software, opened each evening’s time-coded Awakening show before giving way to hours of live ‘busked’ performance. “We wanted to create an intense visual effect that wrapped around the audience,” adds Shum, who worked closely with the rest of the team to ensure the lasers remained an integral part of the overall AV experience.

Pic: Matt Eachus (The Manc Photographer)

FIRE AND AIR
Thirty Galaxis G-Flames, twelve flaming bullrush sculptures, and eighteen tower-mounted flame heads turned heat into architecture. “We wanted the flames and haze to feel like part of the lighting,” says Katie Davies, Arcadia’s technical production manager. “Everything feeds into the illusion that the Dragonfly is alive and breathing.”

The fire elements, alongside more than 30 smoke and haze machines, filled the space with atmosphere. “The flames don’t just shoot up — they’re part of the action, integrated into the full sensory experience,” Davies explains. The fire effect, especially when paired with the intense laser beams, created a sense of danger and excitement that matched the energy of the crowd.

“We’ve had to balance brightness and visibility,” Davies continues. “Sometimes, we pipe haze in to enhance the laser beams, but we have to be careful not to obscure the crowd. It’s a constant process of fine-tuning.”

TECHNICAL FEAT
Dragonfly’s projection-mapped surface was a technical feat in itself. Using four double-stacked Barco 32K laser projectors on the body and an additional four 26K units on the tail, the Dragonfly sculpture was fully mapped for the first time in 2025. The projection design, led by Joe Crossley of Astral Projekt, used real-time imagery triggered from the live music. “The skin comes alive,” Crossley enthuses. “The content responds directly to the music, animating the Dragonfly in real time. It’s a combination of art and technology.” The content was generated using Unity and TouchDesigner, adding a level of interaction and immediacy that matched the live performance atmosphere.
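Music-reactive content of the kind Crossley describes typically starts with an envelope follower: measure the audio level of each block of samples, smooth it so it pulses rather than flickers, and map the result onto a visual parameter. This is a generic stdlib sketch of that idea, not Astral Projekt’s actual Unity/TouchDesigner pipeline; the ‘wing brightness’ parameter is hypothetical.

```python
import math


def rms_level(samples):
    """Root-mean-square level of one block of audio samples (floats in -1..1)."""
    if not samples:
        return 0.0
    return math.sqrt(sum(s * s for s in samples) / len(samples))


def smooth(previous, current, attack=0.5, release=0.05):
    """One-pole envelope follower: rise quickly on transients, fall
    slowly afterwards, so visuals pulse with the beat."""
    coeff = attack if current > previous else release
    return previous + coeff * (current - previous)


def wing_brightness(level, floor=0.1):
    """Map a smoothed level onto a hypothetical 0..1 brightness,
    never dropping below a dim floor so the structure stays visible."""
    return floor + (1.0 - floor) * min(level * 4.0, 1.0)
```

In a real show this runs per audio block (e.g. every 10-20 ms), with the output streamed to the render nodes as a control value.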

The Dragonfly’s eyes – more than 200 transparent LED panels – were collaboratively custom-designed by Ben Rushton-Vaughan of Cucumber Productions and Dave Whiteoak of Video Illusions. “We built our own geometry,” Rushton-Vaughan explains. “Every polygon has a slightly different pixel pitch so it stays transparent but still forms a seamless image. It’s a sculpture that also happens to be a screen.”

Hive Media Servers, mounted close to each projection cluster and LED node, form the brain of the system. “Rather than one huge rack, we network small units that each render a part of the picture,” adds Rushton-Vaughan. “It’s lighter, faster to rig, and if something fails, you swap one box instead of the whole system.”

According to Hive, Dragonfly marked a milestone in scalable real-time playback, using six Beeblade Nexus units in three Nucleus enclosures running the company’s Beehive software to deliver frame-accurate synchronisation across every projection, LED, and mapped surface. Each server handled its own local render workload but remained in perfect sync through Hive’s distributed-render architecture.
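The core property behind this kind of distributed rendering can be sketched generically: if each node’s output is a pure function of the frame number and its assigned region of the canvas, nodes stay in sync so long as they agree on the clock, and stitching their slices reproduces a single-machine render exactly. The tile layout, canvas size and placeholder pattern below are illustrative, not Hive’s implementation.

```python
# Sketch: each render node owns a horizontal slice of the canvas and
# renders it deterministically from the frame number alone. Sizes and
# the placeholder pixel pattern are illustrative assumptions.

def node_region(node_id: int, n_nodes: int, width: int):
    """Column range [start, end) this node is responsible for."""
    start = node_id * width // n_nodes
    end = (node_id + 1) * width // n_nodes
    return start, end


def render_slice(frame: int, node_id: int, n_nodes: int, width: int, height: int):
    """Deterministic placeholder render: each pixel depends only on the
    frame number and its absolute position, never on which node ran it."""
    start, end = node_region(node_id, n_nodes, width)
    return [[(frame + x + y) % 256 for x in range(start, end)]
            for y in range(height)]


def render_full(frame: int, width: int, height: int):
    """Reference single-node render, for comparison."""
    return [[(frame + x + y) % 256 for x in range(width)]
            for y in range(height)]
```

Because the slices tile the reference frame exactly, any node can be swapped out and its replacement will produce identical pixels from the next frame onwards – the property that makes the “swap one box” maintenance model workable.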

“Dragonfly’s environment pushes every aspect of media playback,” noted Hive co-founder Nigel Slater in the company’s own Glastonbury case study handout. “Extreme weather, dust, vibration – and tens of thousands of people moving through the field – all while maintaining sub-frame sync between projection, LED, and lighting data streams.”

Pic: Charlie Raven

REAL-TIME PLAYBACK
The system was monitored live through Hive’s dashboard interface, allowing technicians to visualise playback health and network performance in real time. The modular approach also enabled Arcadia’s team to integrate Hive directly with TouchDesigner and Unity, using OSC triggers (commands sent using Open Sound Control to trigger events in other software, allowing them to be synchronised and controlled over a network) for music-reactive content. The result, says Slater, is “a genuinely distributed AV brain that lets the structure think and breathe in unison”.
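To make the OSC triggering concrete, here is a minimal stdlib encoder for an OSC message as it appears on the wire – a null-padded address, a type-tag string, then big-endian arguments. The `/dragonfly/awaken` address and the cue values are hypothetical examples, not Arcadia’s actual namespace; a production rig would normally use an OSC library rather than hand-rolled packets.

```python
import struct


def _osc_pad(data: bytes) -> bytes:
    """OSC strings are null-terminated, then padded to a 4-byte boundary."""
    return data + b"\x00" * (4 - len(data) % 4)


def osc_message(address: str, *args) -> bytes:
    """Encode a minimal OSC message supporting int32 ('i') and float32 ('f')."""
    tags, payload = ",", b""
    for a in args:
        if isinstance(a, bool):
            raise TypeError("booleans not supported in this sketch")
        if isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        elif isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return _osc_pad(address.encode()) + _osc_pad(tags.encode()) + payload


# A hypothetical cue: trigger scene 3 at 0.8 intensity.
packet = osc_message("/dragonfly/awaken", 3, 0.8)
# The packet would then go out over UDP, e.g.
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(packet, (host, 9000))
```

Because OSC is just a small binary payload over UDP, a media server, lighting desk and content engine can all listen on the same network and react to the same cue within a frame of each other.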

Lighting in the arena radiated from ten towers, ensuring true 360-degree coverage. The setup combined Aquabeam MK2s, Robe iEsprites, Chauvet Color Strike M strobes, and Martin MAC Aura XIPs, offering dynamic effects that matched the energy of the crowd. “The idea is to make the lighting an integral part of the show,” says Dave Cohen, lighting designer from Mirrad. “We can’t just light the Dragonfly – we have to light the entire arena.”

Also used were 120m of Martin VDO Sceptron LED battens for architectural lighting on the body, pixel-mapped so they could run both as a video surface and as conventional lighting fixtures, along with a custom LED array designed to light the face, also running as a video surface.
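Pixel-mapping a linear batten to a video surface amounts to resampling one line of the video feed onto the batten’s LEDs. A minimal nearest-neighbour sketch follows; the batten length and pixel pitch are illustrative, not Sceptron specifications.

```python
def map_batten_to_video(batten_len_m: float, pixels_per_m: int, video_row):
    """Nearest-neighbour sample of one video row onto a 1-D LED batten.
    Batten length and pixel pitch are illustrative assumptions."""
    n_leds = int(batten_len_m * pixels_per_m)
    w = len(video_row)
    # Map LED index i onto the source row proportionally.
    return [video_row[min(i * w // n_leds, w - 1)] for i in range(n_leds)]
```

A media server does the same job per frame for every mapped fixture, which is what lets battens, projection and LED all display slices of a single coherent image.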

Cohen’s team designed the show using twin Avolites D9 consoles and a Prism media server that takes an NDI feed from the video system. “The Awakening sequence is time-coded,” Cohen notes, “but the rest of the night is completely live. We’re visual-DJing for hours, responding to the crowd in real time.”

The Dragonfly’s DJ booth is an integral part of the AV spectacle. Suspended within the Dragonfly’s body, the booth is framed by custom transparent LED panels, allowing the crowd to see the artist while maintaining the ‘living’ aspect of the structure. The booth is also articulated on hydraulics, allowing it to move from side to side while the DJ plays and giving the crowd a better view.

“It’s a unique setup,” says Ben Rushton-Vaughan. “We specifically designed the LED screens to allow for full transparency, so the DJ remains visible while still being enveloped by the visual effects of the Dragonfly.”

This transparency is key to the visual impact, giving the impression that the DJ is part of the Dragonfly itself, with content continuously shifting across the booth’s LED panels. “We wanted it to feel like the DJ was part of the creature, not just performing on top of it,” adds Davies.

The video content on the DJ booth’s LED panels is synced with the rest of the projection mapping, creating a seamless experience that keeps the visual and audio elements of the show perfectly in sync. Content from Joe Crossley’s team is used to animate the DJ booth, immersing both the performer and the audience in the overall experience.

Pic: Chris Cooper/ShotAway

CRITICAL COMMS
“If the bolts hold the Dragonfly together, comms hold the people,” says Harry Brown, Arcadia comms manager. His hybrid Clear-Com HelixNet + FreeSpeak II topology links 40 wireless and wired packs and ten tower-mounted transceivers. “We’ve built redundancy into everything – twin base stations, mirrored wiring, UPS for two hours of runtime,” Brown explains. “The idea is that even if something fails, you can still trigger an emergency stop or talk to safety instantly. In this show, comms isn’t just convenience – it’s life support.”

A network of specialists worked in concert: Bryan McLean (Dirt Sounds), Paul Rose (AF Live), Ben Rushton-Vaughan and Dave Green (video), Joe Crossley (content), Alan Trott, George Coward-Davies and Abby Shum (Aardvark FX, SFX and lasers), James Bunning and Jake Cawkwell (lighting, Arcadia), Dave Cohen and Sam Werret (lighting design, Mirrad), Harry Brown (comms), Dave Whiteoak (Video Illusions), and Nigel Slater (Hive).

Davies, who oversaw the integration of all technical systems, explains: “Every discipline had to work as part of the same nervous system. Audio, video, lasers, flame, projection – they all had to pulse together. That’s what makes it feel alive.”

From its rotating DJ booth to its laser-laced wings, Dragonfly has become the symbol of Arcadia’s philosophy: industrial and military salvage turned into spectacle; technology turned into emotion. At midnight in the Arcadia field, surrounded by up to 50,000 people, the machine breathes – perhaps the most complete AV installation anywhere on Earth.

Dragonfly won Festival Production of the Year at the UK Festival Awards 2025, announced on December 3rd.