Nominated for a 2025 Primetime Creative Arts Emmy in the Emerging Media category, Shawn Mendes: Red Rocks Live in VR is a groundbreaking venture in the world of immersive music concerts.
Created by Meta, Light Sail VR, Dorsey Pictures and 7 Cinematics, the project delivers the ultimate front-row seat to the concert, recorded last October, at which Mendes performed his latest album, Shawn, in its entirety.
It is also the latest in a series of high-quality VR music experiences designed for Meta Quest headsets; performances by Louis The Child, Tyler Childers, Santa Fe Klan and DJ Alison Wonderland feature in the Emmy-nominated first season of ‘Red Rocks Live in VR.’
“We built our workflow on RED right from the very beginning,” says Vincent Adam Paul, CEO, 7 Cinematics. “Our original RED was an Epic Mysterium-X, serial number 000302, and we’ve continued to build our ecosystem around RED through all iterations of the cameras, first in the 2D world and now into immersive 3D.”
Red Rocks is a stunning outdoor amphitheatre carved out of red sandstone in Colorado, with a seating capacity of 9,500. Nighttime shows are spectacular and demand a camera that can capture its beauty as well as all the lighting and pyrotechnics of a live stage event.
“Two of the biggest issues when filming any concert performance are confetti and laser lights, but with RED the dynamic range (rated 17 stops with up to 20+ in Extended Highlight mode) is incredible,” says Robert Watts, managing partner and executive producer at the creative studio Light Sail VR, a specialist in immersive storytelling. “The dynamic range of a nighttime shoot at Red Rocks really comes through when you're shooting RED. It always looks like you're actually there.”
For the 83-minute Shawn Mendes concert, the team arrayed a variety of camera systems at Red Rocks, including the RED V-RAPTOR with a Canon RF 5.2mm F/2.8L Dual Fisheye lens in key positions: front of house, on a drone and on a jib.
“A touchstone for us is intimacy,” Watts explains. “For me, VR is about presence – the idea of being in a particular place. We’re trying to replicate the feeling of being there. We see everything from our eyes and from our POV and we want to make it feel very authentic and natural.”
Conveying this sense of presence requires an understanding of how the inter-pupillary distance (IPD) - the gap between the centers of the dual lenses - translates into the optimum distance from camera to subject.
Since the Canon Dual Fisheye has an IPD of 60mm, which is close to most people’s own IPD, Light Sail VR operates in a sweet spot of 5 ft to 15 ft from the subject. One key difference in the storytelling for VR is that camera movement is slower and more considered.
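To get a feel for why that range works, here is a rough back-of-the-envelope sketch (not part of the production pipeline) that computes the vergence angle a 60mm stereo baseline produces at different subject distances, assuming a simple symmetric-vergence model:

```python
import math

IPD_MM = 60.0  # stereo baseline of the Canon Dual Fisheye lens

def vergence_angle_deg(subject_distance_m: float, ipd_mm: float = IPD_MM) -> float:
    """Angle by which the two optical axes converge to fixate a subject
    at the given distance, under a simple symmetric-vergence model."""
    half_ipd_m = (ipd_mm / 1000.0) / 2.0
    return math.degrees(2.0 * math.atan(half_ipd_m / subject_distance_m))

FEET_TO_M = 0.3048
for feet in (5, 10, 15, 30):
    print(f"{feet:>2} ft -> vergence ~{vergence_angle_deg(feet * FEET_TO_M):.2f} deg")
```

At 5 ft the vergence is roughly 2.3 degrees and at 15 ft roughly 0.75 degrees; much beyond that the stereo separation largely flattens out, which is one way to see why the 5 ft to 15 ft band reads as intimate without feeling forced.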
Storytelling cadence in VR180
“Whereas a 2D multi-camera plan has grown into a big symphony of shots to include all manner of camera moves on jibs, cable-cams, drones and Steadicams, shooting VR is more about camera placement because the experience is so personal,” advises Paul. “The lenses are fixed focal length, and the cameras are all locked off to avoid lateral motion which can make a viewer feel uncomfortable when they're not expecting it in the headset.
“We still do slow pushes in and pushes out. We can crane up and crane down because people are starting to get their ‘VR sea legs’ if you like, and getting used to the motion and appreciating the dynamism of the motion.
“Eventually, I think the 2D and the 3D experience will merge but right now we're trying to ride the razor edge of technology to get to that place.”
The Shawn Mendes VR experience was produced, directed and cut in a similar way to conventional concert films destined for theatres. “Every shot has tempo and flow,” says Paul. “It's cut like a movie but optimized for viewing in a headset.”
Watts confirms, “There is a cadence to VR storytelling that is a little slower, but you can actually do frequent cuts. We're cutting every seven or eight seconds. It's not like we’re using long establishing shots. We have enough coverage between all the camera systems to cut between them and create a seamless experience. You can do a lot of really interesting things once you're working with a post team that understands the geometry of how best to capture and edit for a headset.”
Light Sail VR built a preview system which can output a live feed to multiple headsets for select crew and representatives at each show. Watts says, “We can basically live switch between each of the camera positions. We'll see the flat Fisheye feed from every single camera position on a monitor and then we'll have the wrapped VR180 viewable in the headset so the band’s management or Meta executives or the artist themselves can come up and check it out.”
The VR180 format, rather than full VR360, is considered preferable by immersive content producers and hardware developers, including Meta, Google and Apple, for subject-based content, while VR360 is better suited to location-based content.
“We have a phrase we use here called ‘pixels per degree’,” Watts explains. “By producing in VR180 versus VR360 you can push all the pixels that would be basically wasted behind you into the front screen and make the resolution much higher and more dynamic.”
Adds Paul, “If you're going to the Pyramids and you want to look all around, I'd shoot that in 360 but if you're shooting U2 at the Pyramids we're going to do it in 180 because you're going to be looking at U2.”
In turn, that entails working closely with the band and their management to make sure the cameras are in the right place to produce a premium VR experience without blocking the audience's sight lines.
“You can’t even buy a ticket to some of these locations because you are on stage from a reverse angle at the audience, or on a jib, or a drone. VR180 delivers a really rich visceral experience.”
Optimized V-RAPTOR
Light Sail VR used V-RAPTOR cameras owned by Meta and custom-modified by RED to remove the Optical Low-Pass Filters (OLPFs). The team then equipped each body with the Dual Fisheye lens to turn it into a 180-degree immersive imager.
“We remove the OLPF to increase the sharpness of the image when paired with the Canon Dual Fisheye lens,” explains Matt Celia, creative director, Light Sail VR. “We did tests and found that with the filter in place, the image was less sharp than Canon's R5C. Removing it dramatically increases the sharpness, as well as giving us all the benefits of V-RAPTOR, with its huge dynamic range, professional connections, and robust construction.”
Capture is at 8K 59.94fps, with the final stream delivered to Meta as an 8192x4096 file, but the resolution audiences see is determined by their internet speeds. For best results, Light Sail advises users to ‘cache’ the high-quality playback in Meta Quest TV, which renders the full-resolution video.
“Recording at high resolution is critical with Fisheye lenses because the number of pixels per degree of the lens is vital to the perceived sharpness,” Celia says. “On RED V-RAPTOR we're able to get around 22 pixels per degree.”
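For a sense of how that figure arises, a back-of-the-envelope calculation follows; the side-by-side stereo packing of the 8192x4096 frame is an assumption here, as the exact layout is not spelled out:

```python
# Rough pixels-per-degree arithmetic for an 8192x4096 delivery frame,
# assuming side-by-side stereo VR180 (each eye gets a 4096-pixel-wide image
# spanning 180 degrees). The layout is an assumption, not a stated spec.

EYE_W = 8192 // 2        # horizontal pixels available to one eye
VR180_FOV = 180.0        # degrees those pixels cover in VR180
VR360_FOV = 360.0        # degrees the same pixels would cover in VR360

ppd_vr180 = EYE_W / VR180_FOV   # ~22.8 px/deg, in line with the ~22 quoted
ppd_vr360 = EYE_W / VR360_FOV   # ~11.4 px/deg if spread around a full circle

print(f"VR180: ~{ppd_vr180:.1f} px/deg per eye")
print(f"VR360: ~{ppd_vr360:.1f} px/deg per eye")
```

Spreading the same pixel budget around a full 360-degree circle halves the horizontal density, which is the ‘wasted pixels behind you’ trade-off Watts describes.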
Post, audio and data management
Recording is done directly in-camera. In post at Light Sail VR, the fisheye feeds are brought into Resolve and flattened into a single equirectangular video for editing, before finishing with VFX and noise reduction. An editor cuts as normal on a conventional monitor and reviews cuts in a headset. The final cut is then re-wrapped into a sphere for streaming to a Meta headset.
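As an illustration of what that flattening step does geometrically, here is a simplified, single-eye sketch assuming an ideal equidistant fisheye; the real Canon lens has its own calibration, and the team's Resolve pipeline is more involved than this:

```python
import numpy as np

def fisheye_to_equirect(fisheye: np.ndarray, out_w: int = 2048, out_h: int = 2048,
                        fisheye_fov_deg: float = 180.0) -> np.ndarray:
    """Resample one circular fisheye image (H x W x 3) into a 180x180-degree
    equirectangular image, assuming an ideal equidistant fisheye (r = f*theta).
    Illustrative only; not the production Resolve workflow."""
    src_h, src_w = fisheye.shape[:2]
    cx, cy = src_w / 2.0, src_h / 2.0
    max_r = min(cx, cy)                     # image-circle radius in pixels
    max_theta = np.radians(fisheye_fov_deg) / 2.0

    # Longitude/latitude grid for the output: the front hemisphere only.
    lon = np.linspace(-np.pi / 2, np.pi / 2, out_w)
    lat = np.linspace(np.pi / 2, -np.pi / 2, out_h)
    lon, lat = np.meshgrid(lon, lat)

    # Unit ray for each output pixel; +z is the lens's optical axis.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    # Equidistant projection: radius in the fisheye grows linearly with the
    # angle between the ray and the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    phi = np.arctan2(y, x)
    r = (theta / max_theta) * max_r

    # Map back to source pixel coordinates and sample (nearest neighbour).
    src_x = np.clip(cx + r * np.cos(phi), 0, src_w - 1).astype(int)
    src_y = np.clip(cy - r * np.sin(phi), 0, src_h - 1).astype(int)
    return fisheye[src_y, src_x]
```

In any real conversion, bilinear sampling and lens-specific calibration data would replace the nearest-neighbour lookup used here for brevity.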
As you can imagine, the data throughput from camera to post is extraordinary, with each VR concert project running anywhere from 50 to 100 terabytes.
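Those totals follow directly from multiplying per-camera data rates by record time and camera count; the figures below are placeholder assumptions purely to show the arithmetic, since only the 50 to 100 terabyte total is stated:

```python
# Illustrative arithmetic only: camera count, record time and per-camera
# data rate are placeholder assumptions, not production figures.

def shoot_size_tb(num_cameras: int, hours: float, gbits_per_sec: float) -> float:
    """Total footage in terabytes for cameras recording continuously."""
    total_bytes = num_cameras * hours * 3600 * (gbits_per_sec * 1e9 / 8)
    return total_bytes / 1e12

# e.g. ten 8K camera positions rolling for about four hours at ~3 Gb/s each
print(f"~{shoot_size_tb(10, 4, 3):.0f} TB")   # lands in the quoted range
```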
Ambisonic mics are placed at every camera position to capture spatial audio. The artist provides their final mix as well as the mixed stems with effects. This is handed over to sound design team Q Department, based in New York, to spatialize the mix for Meta headsets.
“They blend in the audience reaction from each camera position so every time you switch camera angle it doesn’t feel like you're in a different position,” adds Paul. “It's a nice balance of being at the concert and feeling like there's people around you. So, when you hear a guitar solo you want to move your head to watch the guitarist. We lead you through that by cutting to the guitarist in VR180 so now you’re immersed with the guitarist.
“As a producer going through thousands and thousands of hours of footage, I rarely look [behind the camera] because I want to focus on the band on stage in front of me. When you watch VR180 you shouldn’t really be aware that there's empty space back there.”
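Head-tracked spatial audio of this kind generally means rotating the ambisonic bed against the listener's head orientation before binaural rendering. The sketch below shows the yaw case for a first-order signal; the channel ordering and sign conventions are assumptions, since neither Q Department's tools nor Meta's renderer are described here:

```python
import numpy as np

def rotate_foa_yaw(w, y, z, x, yaw_rad: float):
    """Rotate a first-order ambisonic signal (AmbiX order: W, Y, Z, X)
    about the vertical axis. W and Z are unchanged under yaw; the X/Y
    dipole pair mixes through a plain 2D rotation. Assumes a right-handed
    frame with x forward, y left, z up; real renderers differ in convention,
    so treat this purely as an illustration."""
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    x_rot = c * x - s * y
    y_rot = s * x + c * y
    return w, y_rot, z, x_rot

# e.g. counter-rotate the bed by the headset's reported yaw on each audio
# block so sources stay fixed in the world as the listener turns their head.
```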
Broadcasting live VR is possible using RED Connect to stream 8K RAW directly from the V-RAPTOR over IP to a CCU in real time, but the market for live VR needs to mature.
“In an ideal world, using RED Connect would absolutely be a very advantageous workflow because we could monitor each camera from a video village where we pipe in all the live preview tech,” says Celia. “We could even press a button and go live with an 8K stream which would be very cool! Maybe for next season!”
“There can be no mistakes”
As it stands, the production of the concert shows for immersive 3D actually feels like a live shoot every time. “There can be no mistakes,” Paul stresses. “When you're filming a sold-out show live for one night only, we have very little time to prep. One of our biggest jobs at 7 Cinematics is acting as the liaison between the artist and our team so we can put cameras in place during the sound check. That's about the only opportunity we have.
“There is no rehearsal. We come in, we place our cameras, and we walk through the stage management with the production manager, tour manager, and overall management. They'll tell us, ‘Yes’ or ‘no’ or ‘maybe.’ We push everything a little further by showing them what’s possible with VR in the headset. Then we’ve got to do it live.
“It's literally like a train passing the station. If you're not onboard, it's leaving without you.”
Two more Red Rocks Live in VR shows, produced by Light Sail VR, Dorsey Pictures and 7 Cinematics, have since landed on Meta, featuring performances from Grammy-nominated Omar Apollo and Norwegian singer-songwriter Girl in Red.