A 44ft by 16ft ROE LED wall is part of the pioneering Immersive and Reactive Lab and XR Stage, one of the new facilities installed during Miami University's recent expansion. Alongside the wall, the integration of advanced technologies, including Stage Precision software, has helped make the new lab a hub for exploring the boundaries of virtual production (VP) and extended reality (XR) experiences.
The Immersive and Reactive Lab and XR Stage is located in the McVey Data Science Building, which opened in March 2024 and houses the Department of Emerging Technology in Business & Design (ETBD). As well as the ROE LED wall, the lab is home to a 50ft by 70ft stage, Disguise media servers, nDisplay workflows, and media pipelines for two cameras. The main camera is a RED Komodo with six Zeiss prime lenses and a Canon zoom lens.
Benjamin Nicholson, assistant teaching professor and immersive and reactive lab and XR stage director at Miami University, said: “We have around 70 students enrolled on VP and XR courses here. The groups are learning everything from motion design to creating visuals for live music. They utilise immersive and reactive tools such as Notch, TouchDesigner and Unreal Engine to build virtual production stages; this is also where they learn how to use SP from Stage Precision.”
Nicholson added: “In the industry, people are talking about Stage Precision and the things that can be achieved through the unified workflow it provides. In the context of the lab, SP allows us to take control and management out of several individual pieces of native software and hardware and bring them all into a single interface that can be used for calibration and control.”
When it comes to training the next generation of creative production professionals, the facilities at Miami University are pioneering a new way of working, Nicholson believes. “The learners who already understand the significance of a tool like Stage Precision are enthusiastic about using it in different ways. It’s an advanced programme, but training in it from the start will give students a high level of knowledge and understanding that they can use in the real world.”
He continued: “The most pivotal part of SP is its lens calibration features. We built lens profiles in SP which take the data input from the RedSpy optical tracking system. Camera two offers another great example of how easily different systems can be integrated into SP.”
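For readers unfamiliar with lens profiles, the sketch below shows, in generic terms, the kind of mapping such a profile encodes: a Brown-Conrady distortion model that maps ideal (undistorted) image points to where the physical lens actually places them. This is an illustrative assumption about what a lens profile captures, not Stage Precision's internal format; all coefficient names and values here are hypothetical.

```python
# Illustrative sketch of a generic Brown-Conrady lens distortion model,
# the kind of mapping a lens profile encodes. Coefficients are hypothetical.
import numpy as np

def apply_lens_profile(points, k1, k2, p1, p2):
    """Map undistorted normalized image points to distorted positions."""
    x, y = points[:, 0], points[:, 1]
    r2 = x**2 + y**2                       # squared radius from image centre
    radial = 1 + k1 * r2 + k2 * r2**2      # radial distortion term
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x**2)
    y_d = y * radial + p1 * (r2 + 2 * y**2) + 2 * p2 * x * y
    return np.stack([x_d, y_d], axis=1)

# Example: distort a few sample points with made-up coefficients.
grid = np.array([[0.0, 0.0], [0.25, 0.25], [-0.5, 0.5]])
print(apply_lens_profile(grid, k1=-0.12, k2=0.03, p1=0.001, p2=-0.0005))
```

In a VP pipeline, a calibrated model like this lets the renderer distort the virtual background to match the physical lens at every tracked zoom and focus position.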
He concluded: “With SP, we can run several parallel systems at any one time. We can run a Disguise virtual production, set up a TouchDesigner system or anything else, run them simultaneously and switch between them. Multiple users can change the SP interface from up to 15 different computers. What Stage Precision provides is a single point of tracking distribution to all the different media servers at once.”
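To make the “single point of tracking distribution” idea concrete, here is a minimal sketch of the pattern Nicholson describes: one process receives camera-tracking datagrams (for example, FreeD-style packets from a tracker such as RedSpy) and relays each one unchanged to every media server. This is a generic illustration, not Stage Precision's implementation; the addresses and port numbers are assumptions.

```python
# Minimal sketch of single-point tracking distribution: receive tracking
# datagrams on one port and fan each one out to every media server.
# All IP addresses and ports below are hypothetical.
import socket

LISTEN_PORT = 40000    # assumed port the camera tracker sends to
MEDIA_SERVERS = [      # assumed endpoints, e.g. Disguise, nDisplay, TouchDesigner
    ("10.0.0.11", 40000),
    ("10.0.0.12", 40000),
    ("10.0.0.13", 40000),
]

rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("0.0.0.0", LISTEN_PORT))
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    packet, _ = rx.recvfrom(2048)       # one tracking sample per datagram
    for addr in MEDIA_SERVERS:
        tx.sendto(packet, addr)         # same data to every media server
```

Centralising the fan-out this way means each downstream system (Disguise, nDisplay, TouchDesigner) consumes identical tracking data, so operators can switch between pipelines without re-patching the tracker.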