If you watched The Mandalorian, you know the best part about it was the special effects. Everything looked real. It was a level of authenticity people were not used to seeing on television. Scenes popped off the screen and transported you into new worlds of adventure.
But what was showcased on The Mandalorian was a long time in the making.
It was actually a technique that started on Rogue One and has evolved with every Star Wars film since then.
They call it "StageCraft" and it has revolutionized the way we set up and shoot. It's the biggest thing since the invention of blue and green screen and incredibly important to the future of film and television.
So what is it?
To understand this complex leap in technology, let's jump into the past.
While shooting Rogue One, director of photography Greig Fraser, ACS, ASC, ran into some problems. The cockpit scenes were not turning out the way he wanted, so he came up with the idea of shooting cockpit scenes using an LED screen that displayed the exterior space environments.
Why was this such a great idea?
Well, the process allowed cinematographers to capture all the special effects in-camera, which is what makes things look real. It would also allow for interactive light and reflection effects happening in real time. That meant effects wouldn't need to be added in post by Industrial Light & Magic (ILM).
So, it was full of wins for every team involved in the movie.
This was an impressive breakthrough, and word soon made its way up to ILM VFX supervisor Richard Bluff, along with associate Kim Libreri and ILM creative director Rob Bredow.
As you know, ILM prides itself on being at the cutting edge of this kind of tech. Fraser's conversations impressed them, and from this one simple thought about shooting cockpits better, they asked one big "what if."
What if you used LED lights and effects in a bigger environment, say, to stage an entire landscape or environment?
Credit: Francois Duhamel, SMPSP
So, Greig Fraser sat with ILM and they formulated a plan. They talked about trying something on this scale on their new TV show, The Mandalorian, mostly because they knew its creator would be game.
As Fraser explains, “Jon [Favreau] was adamant that due to the scope and scale of what would be expected from a live-action Star Wars TV show, we needed a game-changing approach to the existing TV production mold. It was the same conclusion George [Lucas] had arrived at more than 10 years ago during his TV explorations, however, at that time the technology wasn’t around to spur any visionary approaches."
Well, now they had the vision, visionary, and technology.
So how did they bring it all together?
Credit: Melinda Sue Gordon, SMPSP
Video games are constantly pushing the envelope when it comes to real-time graphics. While the characters they put forward are animated, they're becoming more and more "real" every day.
So when they pitched using Epic Games' popular Unreal Engine to render the backdrops in real time, blending digital environments with practical effects, Favreau was excited and on board for the adventure.
The tool they built allowed content to be displayed in real time on LED screen walls.
The prototype wall had a 35-foot-wide capture volume. It was covered with 2.8-millimeter LED panels for the utmost precision.
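For a sense of scale, here's a quick back-of-the-envelope calculation (my own arithmetic, not a figure from the production) showing what that pixel pitch buys you across a 35-foot span:

```python
# Back-of-the-envelope: how many pixels span a 35-foot wall built from
# 2.8mm-pitch LED panels? (Illustrative only; actual panel counts on set
# may have differed.)
wall_width_ft = 35
pixel_pitch_mm = 2.8

wall_width_mm = wall_width_ft * 304.8   # 1 foot = 304.8 mm
pixels_across = wall_width_mm / pixel_pitch_mm

print(f"{pixels_across:.0f} pixels across")  # roughly 3810 -- about 4K of horizontal resolution
```

In other words, even that early prototype wall packed roughly a 4K image across its width, which helps explain why in-camera backgrounds could hold up under scrutiny.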
Fraser told ICG magazine, “The results on screen have a lot less moiré, which is the trickiest part of working out the shooting of LED screens. If the screen is in very sharp focus, the moiré can come through. That factored into my decision to shoot as large a format as possible, to ensure the lowest possible depth of field.”
To capture this background, they shot with the ARRI ALEXA LF, paired with Panavision's Ultra Vista lenses, whose fast focus fall-off allowed them to duck the moiré issue while shooting anamorphic.
The 75mm and 100mm were Panavision prototypes, part of a full set of T2.5 lenses (50mm, 65mm, 75mm, 100mm, 135mm, 150mm, and 180mm) covering every focal length they needed.
Fraser said, "Combining the LF sensor with the 1.65 squeeze on the Ultra Vistas, you get a native 2.37 [aspect ratio]. Those lenses have a handmade feel in addition to being large format, and a great sense of character. We didn’t use any diffusion filtration at all.”
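Fraser's "native 2.37" figure checks out with simple anamorphic math. Here's a sketch of the arithmetic, assuming ARRI's published open-gate sensor dimensions for the ALEXA LF (36.70mm x 25.54mm):

```python
# Anamorphic aspect-ratio math: the sensor's native aspect ratio multiplied
# by the lens squeeze factor gives the de-squeezed aspect ratio.
# Sensor dimensions are ARRI's published ALEXA LF open-gate specs;
# the 1.65x squeeze is the Ultra Vista figure Fraser cites.
sensor_width_mm = 36.70
sensor_height_mm = 25.54
squeeze = 1.65

native_aspect = sensor_width_mm / sensor_height_mm    # about 1.44:1
desqueezed_aspect = native_aspect * squeeze           # about 2.37:1

print(f"{desqueezed_aspect:.2f}:1")  # prints "2.37:1"
```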
But even with all this testing, they still needed StageCraft to be greenlit.
Credit: Francois Duhamel, SMPSP
There was a lot of concern that the project would be expensive and look cheap. We know now that wasn't the case, but in a universe determined to mix the practical and the digital, it felt like everything was at stake.
Lucky for them, and us, ILM greenlit the project after seeing it in action.
But that was not the hardest battle.
That came in integrating tech from a ton of different outside vendors into one internal pipeline.
ILM creative director Matt Madden, who served as The Mandalorian’s virtual production supervisor, says, “UE machines handled all content rendering on the LED walls, including real-time lighting and effects, and projection mapping to orient the rendered content on each LED panel, deforming content to match Alexa’s perspective. For each take, the StageCraft operator was responsible for recording the slate and associated metadata that would be leveraged by ILM in their postproduction pipeline. ILM developed a low-res scanning tool for live set integration within StageCraft, taking advantage of Profile’s capture system to calculate the 3D location of a special iPhone rig, recording the live positions of a specific point on the rig to generate 3D points on live set pieces. The Razor Crest set, a partial set build, used this approach, with the rest of the ship a virtual extension into the LED wall. Controls on an iPad triggered background animation, giving the impression the physical Razor Crest set was moving. Once imagery had been rendered for each section of LED wall, the content was streamed to the Lux Machina team, which QC’d both the live video stream and the re-mapping of content onto walls, [which] Fuse Technical Group operated and maintained.”
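To unpack one piece of that: "deforming content to match Alexa's perspective" boils down to re-projecting virtual points through a camera model every time the tracked camera moves, so the background shifts with correct parallax. Here's a heavily simplified pinhole-camera sketch of the idea (illustrative only; the function and numbers are mine, and ILM's actual pipeline is far more sophisticated):

```python
# Minimal sketch of camera-tracked re-projection: a virtual 3D point is
# projected onto the image plane of a pinhole camera, so moving the camera
# shifts where the point lands on screen. Not ILM's StageCraft code --
# just the core geometric idea behind it.

def project_point(point, cam_pos, focal_length):
    """Project a 3D world point (x, y, z) onto the image plane of a pinhole
    camera at cam_pos looking down the +z axis. Returns (u, v) image coords."""
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal_length * x / z, focal_length * y / z)

# A virtual mountain peak 100 units away, seen from two camera positions:
peak = (10.0, 5.0, 100.0)
print(project_point(peak, (0, 0, 0), 50))   # camera at origin -> (5.0, 2.5)
print(project_point(peak, (2, 0, 0), 50))   # camera moves right -> (4.0, 2.5)
```

The second print shows the key effect: when the physical camera dollies, the rendered background slides the way a real distant landscape would, which is what sells the illusion in-camera.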
You want to know something nuts?
They were able to control all of this remotely, using just an iPad.
The team that controlled all of this was called the "Brain Bar."
To pull off the rest of the shoot, they came up with a new way to do business on set.
Credit: Francois Duhamel, SMPSP
Series Director of Photography Barry “Baz” Idoine was used to challenging shoots, but nothing could prepare him for The Mandalorian. Mostly because everything they were doing was new.
He told ICG, “Seeing this 3D rear-projection of a dynamic real-time photoreal background through the viewfinder is tremendously empowering. It’s phenomenal because it gives so much power back to the cinematographer on set, as opposed to shooting in a green screen environment where things can get changed drastically in post.”
They mess up, make changes, and do reshoots, but they're all in this together. And the team embraces that.
So many teams have to come together to get this kind of tech to work on screen, but everyone involved understands that they're doing something new, cool, and important.
They're proud.
Fraser resoundingly supports them working in uncharted territory. “Technology serves us; we don’t serve it. So, it doesn’t make sense, to me, to embrace something just because it is new. But if it can help us do things as well as if we were doing it for real, but more economically, that makes good sense. Every day [on The Mandalorian] we were making decisions on how to go forward with this process, so it was like history evolving as we worked. Each of the various departments works independently, but at the end of the day, they had to stand together.”
Credit: Disney+
As Unreal Engine gets better and better, there's no telling where they can take the tech. As far as The Mandalorian goes, the plan is to try it on more character work. Right now, the engine isn't great at rendering large crowds of people. They'd like that to change.
The tech can also be hard to adapt when the story changes. Right now, visuals get approved months in advance, so if the story or ideas change, that can be a big setback.
Still, every day the tech gets better and more powerful.
Season one challenges won't be the same in season two or beyond.
Count me in as one of the most excited to see how this is used in Star Wars and across all of Disney's franchises.
It's an exciting time to be a movie and TV lover.
Learn how to turn your score into a character.
I was thrilled when the opportunity to compose for Mikko Mäkelä’s sophomore film Sebastian came along. It's an arthouse drama about modern sex work that avoids judgment and generic queer tragedy, armed with delicate yet powerful storytelling and striking visuals.
The film tells the story of Max, a Londoner writing his debut novel about sex work in the digital age. He draws inspiration from "interviews," which actually means he sells sex to older men under the name Sebastian. As he struggles with writing and keeping his nightlife a secret, things begin to collide, and the stakes get higher.
Sebastian is a co-production between the UK, Finland, and Belgium, and the Finnish producer Severi Koivusalo was the one who first introduced me to the director, Mikko Mäkelä. I noticed early on that we shared a common vision for the score. We discussed the characters and how the music could enhance the film, especially in scenes without dialogue. The creative process was pretty straightforward, and in the blink of an eye, we found ourselves at the world premiere at the Sundance Film Festival 2024, where Sebastian was selected for the World Cinema Dramatic Competition.
I wanted to create a strong and unique musical identity for Sebastian—something that truly draws you in and leaves a memorable sonic imprint on viewers. Reflecting Max's private nightlife, the score needed to be delicate and minimalistic. An electronic approach felt appropriate due to the urban pulse of London. My strong background in electronic music production helped me get started, while my roots in the DIY punk rock scene made it easy for me to be creative and playful with the music. Although the result is quite ambient and mellow, my methods and the urgent desire to create something unique and original stem from my teenage punk years.
Ilari Heinilä. Credit: Altti Heinilä
With the score, the director Mikko and I wanted to focus on Max’s alias, Sebastian. It was important to find the right mood and sound palette for those private moments Max experienced—a world that is intriguing, tempting, and uniquely his. The music needed to deepen those feelings and drive the story forward. We aimed to capture the sense of urban loneliness Max felt, allowing it to resonate. Delicate nuances became more impactful when the score was minimalistic and mystical. I’ve always loved films with a minimalistic yet strong musical identity, such as Drive, The Lighthouse, and Midsommar. The score creates a unique universe where you can immerse yourself and feel the vibrations of the emotions within the story.
Overall, I describe this music as organic electronica. It feels organic, with many acoustic sound sources, but I approached the production from an electronic perspective, utilizing electronic music production techniques. I recorded plenty of contrabass and then sampled and manipulated it—my most trusted tools for transforming the sounds were the old-school hardware effects Dynacord VRS-23 and Echocord Mini. There are moments when you can definitely hear something "acoustic," but you may not recognize what it is. Most of the time, it’s the contrabass played by the talented Helka Seppälä or a violin brought to life by the equally great musician Milena Törmi. When mixed with synths and ambient sounds, these acoustic elements evolve into something very interesting and original.
I believe that silence plays a very important role in my music—it's a powerful tool that provides space and air to the tracks. When there are pauses, the musical parts resonate more strongly. This concept was always on my mind; it’s like holding your breath as the tension builds.
Another challenge was limiting my tools. Creating something strong and memorable is often easier with fewer options. This forces you to be creative and really consider the meaning of the music. When you have too many choices, you might lose your focus, and the outcome may not align with the original idea. My main tools were the contrabass, violin, piano, Korg MS-20, Prophet ’08, and the previously mentioned hardware effects. While I did use plugins for editing and mixing, these five instruments and two vintage machines were my primary creative tools. The final mixing and mastering of my tracks were completed by my Cannes award-winning colleague, Petja Virikko, at El Camino Helsinki.
I am a very visual person, and I perceive music and sound design as one vast, deep blue ocean. In Sebastian, this ocean is warm and beautiful, yet simultaneously sad and tempting—the next film could introduce a darker, more thrilling soundscape.