From the moment MTV went on the air in the summer of 1981, music videos have been known for their boundary-pushing ideas and cutting-edge production techniques. For many artists, the form both reflects and informs their signature style.
will.i.am is certainly one of them. The multifaceted superstar and entrepreneur is a passionate technology explorer whose pursuits have led him to design smartwatches and iPhone apps, and even to make interplanetary music history as the first person to stream a song from the surface of Mars.
No surprise, then, that “Let’s Go,” his most recent music video collaboration with J Balvin, finds them traversing fantastical environments in an ultra-slick race car. The collaboration between the teams at Huffman Creative, JAMM VFX, and The Storyteller’s Desk, harnessing technologies including Unreal Engine, Houdini, Maya, Adobe Creative Cloud, and Frame.io, resulted in a mind-bending XR experience that once again pushes the artistry of music videos to new heights.
A high-concept video
VMA-winning director Andrew Donoho, known for his work with a who’s-who of artists ranging from Beck and Paul McCartney to Run the Jewels and Skrillex, had previously worked with J Balvin. His video for “Toretto” features Balvin showing off his drifting skills for a duly impressed Vin Diesel.
So when the concept for this video came up, Andrew was a natural choice, and Balvin is once again behind the wheel as he races will.i.am through five impossible environments.
The concept was to create an XR (extended reality) experience that hewed to a more realistic aesthetic, as opposed to looking like part of a video game. The viewer should feel as if the car is in an organic environment, with natural lighting and reflections that sell the illusion.
“Will and I connected directly and spent a couple of months building the creative from the ground up and bouncing ideas back and forth,” Andrew says. “In his music videos he exists in a kind of near-future environment and he really likes science fiction and new tech, so we wanted to do something that had a lot of different backdrops and locations. It needed to be shot all in one day, and we were dealing with a car that had a lot of chrome, which meant there would need to be a lot of reflections.”
“The idea was that we should use an XR stage with a volume and Unreal Engine so we could build out an elevated version of his sci-fi world and could also incorporate a lot of variety in the lighting that would accommodate this reflective world and elements that really wouldn’t be able to exist on green screen,” he adds.
If you’ve ever worked on music videos, you know that they’re extremely challenging to pull off. Especially at this end of the spectrum, they’re something akin to a Super Bowl commercial, demanding ultra-high production value on an often compressed schedule or a reduced budget. So when you’re talking about embarking on a project like this, you need plenty of ingenuity and some very capable hands.
When searching for the right design team, Huffman Creative’s Head of Production, Katie Sarrels, reached out to JAMM’s Executive Producer, Julie Weitzel. From previous experience on other high-level projects, Katie knew that JAMM would be the ideal team to help execute this concept flawlessly. With Huffman handling creative concepting and the production, and JAMM handling the VFX, they brought together the right mix of talent and brains to pull it off.
Planning is essential
The team at Huffman Creative are no strangers to complex productions. As a full-service creative studio that handles everything from initial concept and bidding to location and technology research, writing, talent contracts, and all phases of production (practical and virtual), as well as the full spectrum of post-production needs, their work spans commercials, music videos, photography, live events, and more. And with talent relationships with artists like Bad Bunny and Ariana Grande, sports figures like Mike Trout and Damian Lillard, and numerous other influencers including TikTok star Bella Poarch, they’ve produced countless projects with viral results, amassing prestigious awards along the way.
As a frequent collaborator with Andrew, Executive Producer and Company Founder Ryan Huffman knew that this particular video would come with its own unique set of challenges, but he was confident that once again they would assemble the right team and solutions to tackle it.
Although Ryan tends to function as EP on a day-to-day basis, he also lent his expertise as a producer and conceptual developer for this project. “Because of how complex XR is, you have to go through a lot of the stages of pre-production in order to lock the budget,” he says. “On a traditional shoot you might be able to say, ‘We’ll just get a mansion and figure it out from there,’ but for a project like this you have to go deep on storyboarding and other types of concepting before you really know how much it’s going to take to execute.”
“On a traditional shoot you might be able to say, ‘We’ll just get a mansion and figure it out from there.’”
It meant that a lot of key players had to be involved early in the process, including production designer John Richoux, set designer John Doni, and art director Nick DeCell. “From a creative standpoint you have to go very deep to make sure that we’re presenting something that’s feasible and looks good,” Ryan says.
John Doni used Photoshop and SketchUp in the previsualization stage, pulling imagery and textures from Adobe Stock to create the visual palette that would be used not only for the virtual backgrounds but also for the actual physical elements built on the set. Between the creative team being scattered across Los Angeles and will.i.am touring, the many collaborators relied on Frame.io to stay in sync creatively.
“I actually didn’t meet with Will in person until the day before the shoot,” Andrew says. “We would cut the storyboards in sequence to the music so we could share them with him and he could drop notes directly on the frame. Same with the mockups of the environments and renders from JAMM.”
will.i.am had done a previous video in an LED volume but wanted to push the approach even further with this one. Andrew estimates that they spent about two and a half months prior to the shoot getting everything ready. “We built visual decks and treatments and used AI for some of our concepting, and then took that over to JAMM to start building them into a physical reality in 3D.”
Starting the (Unreal) engine
Working with new technology requires a steady hand on the wheel, and VR producer Tom Thudiyanplackal, an experienced Unreal Engine filmmaker, was brought in to steer the XR component with his company, The Storyteller’s Desk. As a member of the USC Entertainment Technology Center (ETC), his work on the Cannes award-winning student film Fathead was documented in the ETC’s subsequent white paper, in which Tom details the elaborate process of pushing the technology to new places as a test case for future mainstream productions.
Like this one. After an extensive process of interviewing every VP stage in Los Angeles, the team chose XR Studios, which has two stages, one with an LED floor and another with a practical floor, allowing them to capture everything in a single day. It also enabled them to use practical set pieces and materials that would help sell the realistic look Andrew was after.
Andrew explains: “Because you’re trying to capture everything in camera—the lighting and shadows, the reflections, the textures—what XR offers is that the art department was actually able to build out what the talent were standing on, what they were touching, what was around them. The screen then provides the backdrop and they can merge that floor plane and that ground plane with the background. We color matched the seams so that the practical sand flows into the sand of the background world.”
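That seam-matching step is, at its core, a color-balancing problem. As a loose illustration only (not the production’s actual pipeline), here’s a minimal Python sketch of one common approach: sample a patch of the practical sand and a patch of the on-screen sand, then compute per-channel gains to bring them into line. The file names and patch coordinates are hypothetical.

```python
import numpy as np
import cv2  # OpenCV; pip install opencv-python

# Hypothetical stills of the practical set and the LED wall content.
practical = cv2.imread("practical_sand_still.png").astype(np.float64)
wall = cv2.imread("wall_sand_still.png").astype(np.float64)

def mean_color(img, x, y, w, h):
    """Average BGR color of a rectangular patch."""
    return img[y:y + h, x:x + w].mean(axis=(0, 1))

# Sample sand on either side of the seam (coordinates are invented).
set_color = mean_color(practical, 800, 900, 64, 64)
screen_color = mean_color(wall, 200, 950, 64, 64)

# Per-channel gains that pull the wall content toward the practical sand.
gains = set_color / np.maximum(screen_color, 1e-6)
print("suggested BGR gains:", np.round(gains, 3))

# Apply the gains for a quick preview of the corrected wall content.
corrected = np.clip(wall * gains, 0, 255).astype(np.uint8)
cv2.imwrite("wall_sand_matched.png", corrected)
```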
But, again, this requires extensive and meticulous preparation. Tom worked closely with VFX supervisor Troy Moore and his team at JAMM as they created the models and environments in Houdini that would play in the volume, figuring out the details of what it would take to properly project the imagery on the LED walls.
“Houdini is a great tool for procedural generation and it’s wonderful for a director or creative person to work with an artist and be able to see their world come to life, but it’s not such a great tool for real time,” Tom says. “The main challenge is that the file sizes are pretty large and the file types may not be compliant with the pipeline you need for real time. So you first have to bring that content into Unreal Engine and basically shave it down so it still holds all of the beauty that was visible within Houdini.”
“The artists would export the meshes and textures from Houdini and reassemble pretty much everything within Unreal Engine to optimize it so that we get a lag-free performance on the wall. The general math is that if you’re trying to hit about 30 fps on the wall, you try to hit about 90 fps on your computer, so that when the sequences pass through the end display pipeline—even when you have some loss of processing and frame speed—it still delivers 30 fps on the wall.”
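To make that Houdini-to-Unreal handoff concrete, here’s a minimal sketch of how exported meshes might be batch-imported with Unreal Engine’s built-in Python API, together with the rough headroom math Tom describes. The file paths, asset names, and the helper function are illustrative assumptions, not the production setup.

```python
import unreal

def editor_target_fps(wall_fps: float, headroom: float = 3.0) -> float:
    """Rule of thumb from the team: render roughly 3x the wall rate
    in-editor so the display pipeline's losses still leave 30 fps."""
    return wall_fps * headroom

print(editor_target_fps(30.0))  # -> 90.0

# Batch-import meshes exported from Houdini (paths are hypothetical).
tasks = []
for fbx in ("canyon_rocks.fbx", "tunnel_walls.fbx"):
    task = unreal.AssetImportTask()
    task.filename = f"D:/houdini_exports/{fbx}"
    task.destination_path = "/Game/LetsGo/Environments"
    task.automated = True        # suppress interactive import dialogs
    task.replace_existing = True
    task.save = True
    tasks.append(task)

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks(tasks)
```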
Working in concert
This also required JAMM and the Unreal team to work ahead of time with Andrew and cinematographer Idan Menin so they could accurately create the backgrounds to match what the camera would capture. “We like to be more disciplined within virtual production—to have an idea of how much the camera’s going to move, what kind of lenses are going to be switched to during a shot so that when you build the environment you know where the virtual edges will fall off,” Tom adds.
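Knowing where those virtual edges fall is largely lens geometry. As a simplified illustration (the sensor, focal length, and distance below are assumptions, and the anamorphic squeeze the production used is ignored for clarity), the horizontal field of view follows from sensor width and focal length, which in turn tells you how much wall a given camera position can see:

```python
import math

def horizontal_fov_deg(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal angle of view for a simple rectilinear lens model."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

def wall_width_seen_m(fov_deg: float, camera_to_wall_m: float) -> float:
    """Width of LED wall covered by the frame at a given distance."""
    return 2 * camera_to_wall_m * math.tan(math.radians(fov_deg / 2))

# Illustrative numbers only: a ~36mm-wide large-format sensor on a 50mm
# lens, with the camera six meters from the wall.
fov = horizontal_fov_deg(36.0, 50.0)
print(f"FOV: {fov:.1f} degrees")                          # ~39.6 degrees
print(f"wall seen: {wall_width_seen_m(fov, 6.0):.1f} m")  # ~4.3 m
```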
Andrew and Idan, who had collaborated on two previous projects, chose to shoot on the new Sony Venice 2 (recording at 8.6K X-OCN ST), which Andrew describes as “a real treat. Its dual ISO also allowed us to brighten everything in camera without losing anything, so that way we really could use the screens to illuminate the subject. One of the big advantages of the LED volume is that when you catch the sunset behind the talent, it will actually illuminate them the same way that a real sunset would.”
The team also used a Sony FX3, recording at 4K ProRes RAW. And then there were the lenses. “We used the same 70mm anamorphic lenses that were used in The Hateful Eight and The Mandalorian,” Andrew says, with genuine excitement. “It’s a really fun approach because you take this very old, gorgeous glass that has a huge scope and interesting personality, and then you put it onto this world and it really helps ground it in reality.”
“You have all this new technology, these new cameras, everything is super crisp and sharp and beautiful. But then when you put these gauzy vintage lenses on there, it chips away at the digital edge, it makes it feel even more organic, which again is one of the things you can’t do on green screen, because you want everything to have crisp edges for keying. But with XR you’re able to get those lens flares, you’re able to get the softness, you’re able to let the lenses bend a little bit because what you’re capturing in camera is your actual shot with the visual effects included.”
“You’re able to let the lenses bend a little bit because what you’re capturing in camera is your actual shot with the visual effects included.”
Tom confirms that the choice of vintage lenses also helps to mask the less performant aspects of the volume technology. “Anytime something’s not 100 percent there, filmmakers have a wonderful way of masking or working with it. With any form of lens, cinematographers will never call the imperfections of the optics of the lens imperfections,” he states. “They call them characteristics.”
“In this case they only added to the whole process because the walls are not necessarily at a place where they’re 100 percent ideal for film work. We don’t want to look at those things with too keen an eye because it won’t be perfect. So in the case of using these anamorphic lenses, they added great character to the image and made it much more seamless in terms of the integration of the physical world with the digital world.”
After concepting, storyboarding, creating the 3D elements, and adapting them for playback in the volume, there’s one more critical step: taking the time to prep on the stages prior to rolling the camera. While the environments themselves didn’t need to be rigorously timed for playback, there are moments in the video in which, for example, the lighting goes from daytime to sunset or the car goes into and out of a tunnel.
“Elements like that needed to be timed to the music as best we could,” Andrew says. “JAMM was able to focus on the visuals and the Unreal team made sure that we had the flexibility we needed when we started dialing in the specific lighting cues and movement speeds. Thankfully, they had built us a bunch of animation elements that showed how this would look in motion so that we had a strong reference going into timing them.”
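Timing cues like these comes down to mapping musical time onto playback frames. A minimal sketch of the arithmetic (the tempo, frame rate, and cue points here are invented for illustration, not taken from the track):

```python
FPS = 30   # wall playback rate
BPM = 100  # hypothetical tempo

def bar_to_frame(bar: int, beats_per_bar: int = 4) -> int:
    """Frame on which a given bar (1-indexed) of the song starts."""
    seconds = (bar - 1) * beats_per_bar * 60.0 / BPM
    return round(seconds * FPS)

# Hypothetical cue sheet: environment events pinned to bars of the track.
cues = {"sunset begins": 17, "enter tunnel": 25, "exit tunnel": 29}
for name, bar in cues.items():
    print(f"{name}: bar {bar} -> frame {bar_to_frame(bar)}")
```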
Idan can’t overemphasize the importance of preparation. “When shooting a virtual production it is so crucial that there is interdepartmental communication. The VAD (Virtual Art Department), art department, lighting, camera, and stage teams must all work in concert to pull off photorealism in camera. The other challenge was doing all that within the constraints of a non-volumetric virtual production stage,” he explains. “There is a common misnomer that all virtual production stages are called ‘volumes.’ This is only true, however, if the stage itself consists of a 360-degree wall and ceiling. Although XR Studios had the two stages, which allowed us to capture all our setups in one day, they didn’t have 360 degrees of LED wall.”
Which meant that his team also had to manipulate the lighting to meld the physical world with the virtual one. “We were tasked with continuing the walls of the stage with traditional lighting tools, which proved challenging both technically and logistically. We brought in and built numerous custom frames of muslin with Creamsource Vortex 8 lights that continued the lighting effects of our wall, while being short enough in height to not block the collection of witness cameras from tracking the Sony Venice. Having this array of Vortex lights allowed our programmer to simulate, and when necessary, animate lighting cues in sequence with the LED wall,” he says.
Tom specifically cites the space dance sequence as one example of how they pushed the technology to achieve astonishing illusions. “The stage that has the LED floor integrated with the wall is a very sophisticated technology where, from the point of view of the camera, it’s as if the floor and the wall disappear and you start to peer into the dimensionality of the 3D world. will.i.am and the dancers were on top of a physically constructed platform on top of the LED wall and from the perspective of the camera we were able to create the illusion that they were floating through space among the buildings and a huge city—all in camera.”
Cutting the right corners
If you’re keeping count, that’s four separate teams that needed to collaborate closely during the pre-production and production phases (Huffman/Andrew, JAMM for VFX, The Storyteller’s Desk for Unreal Engine, and XR Studios), as well as the talent and their management.
And then there was post-production, with Andrew himself editing, additional VFX work by Denhov Visuals, and color grading by Matt Osborne and the team at Company 3.
During every phase, Frame.io helped keep the teams on track. But in post-production, it played an even bigger role.
“We put a lot of the budget into the previs and set up and XR,” Andrew says. “The goal was to walk away from production with something that was almost there, so at the end I put together a smaller, scrappier team for post. I have a background in visual effects and the Adobe tools, and I know how to mix and match the software to make it efficient.”
It’s rare for a production to emerge without having encountered detours or roadblocks. In this case, the race car they were using was an earlier model. And because this video would also serve as a promotional piece for the racing entity itself, the team was required to update the car in post.
Denhov Visuals worked primarily in After Effects and Nuke to make these adjustments, and Andrew relied heavily on Frame.io to leave frame-accurate annotations and notes on the work with that team, as well as for sharing cuts and assets as he edited in Premiere Pro and DaVinci Resolve.
“There’s nothing else out there where I can upload all the raw footage and then add comments to it and still make it seem like something that’s presentable.”
“I love that I can draw on the frame to very specifically point out what needs to be removed, or what needs to be brightened or darkened within our actual edit. We were adding the car parts and the artists were able to grab screenshots and downloads from Frame.io at different aspect ratios and file sizes so that they could do previs on stills.”
Andrew adds, “Frame.io is also amazing for huge file dumps. There’s nothing else out there where I can upload all the raw footage and then add comments to it and still make it seem like something that’s presentable. And then as we got VFX cuts in and elements in from our post teams we’re able to again seamlessly reference old edits or notes, which makes the QC process so much easier. Because unless you have a massive team or infrastructure to manage all the files and assets, you need to have it all in one place so that everyone can get what they need.”
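For teams that want to script parts of this review loop, Frame.io also exposes a REST API. The sketch below is an illustration under stated assumptions, not the crew’s actual workflow: it posts a frame-pinned note to an asset via the v2 comments endpoint, with the token, asset ID, comment text, and timestamp value all placeholders.

```python
import requests

API = "https://api.frame.io/v2"
TOKEN = "fio-dev-token"   # placeholder developer token
ASSET_ID = "1234-abcd"    # placeholder asset (clip) ID

def leave_note(asset_id: str, text: str, timestamp: int | None = None):
    """Post a comment on an asset; timestamp pins it to a point in the clip."""
    payload = {"text": text}
    if timestamp is not None:
        payload["timestamp"] = timestamp
    r = requests.post(
        f"{API}/assets/{asset_id}/comments",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json=payload,
        timeout=10,
    )
    r.raise_for_status()
    return r.json()

leave_note(ASSET_ID, "Brighten the chrome on the rear wing here", timestamp=1152)
```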
Throughout the editing process, Andrew was also sharing cuts with will.i.am and his management team. “Sending them through on Frame.io gave them the chance to ask questions and to respond to my notes,” he says. “I also really love that instead of the traditional placing text on screen for explaining what’s going to change and what the effect will look like in the end, you can just drop that into the review section and they can see it right there.”
Present, past, and near future
If, as Andrew says, will.i.am exists in a near-future world, “Let’s Go” took the entire team to the near future of where motion picture technology is heading. Yet Andrew appreciated the aesthetic of pairing vintage glass with brand new digital cameras.
Along those same lines, there’s an aspect of working in virtual production that leans decidedly toward techniques that have been used in television for decades.
“The one thing to keep in mind about virtual production, especially when you work with LED walls, is to only trust what the camera sees,” Tom states. “We have come to a place where the wall is essentially an electronic signal and so is the camera sensor. So we’ve moved into the territory of broadcast and have to rely on the strengths of using scopes to know what the image is rather than our eye or a calibrated monitor. The habit is to fall back onto using meters but you can’t trust your eye. Trust what the camera sees.”
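In software terms, “trusting the scopes” simply means measuring the signal rather than judging it by eye. As a rough illustration only (a crude stand-in for a real waveform monitor, with a hypothetical frame grab as input), here’s a minimal sketch that reports Rec.709 luma statistics for a frame:

```python
import numpy as np
import cv2  # pip install opencv-python

frame = cv2.imread("frame_grab.png")  # hypothetical frame grab (BGR order)
b, g, r = cv2.split(frame.astype(np.float64) / 255.0)

# Rec.709 luma weights; a simplistic software "scope".
luma = 0.2126 * r + 0.7152 * g + 0.0722 * b

for pct in (1, 50, 99):
    print(f"p{pct} luma level: {np.percentile(luma, pct) * 100:.1f}%")
```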
From the cinematographer’s standpoint, there’s also a shift in mindset that comes with virtual production. “In a way, it’s reversing some lazy habits that the industry has adopted in the wake of the digital revolution. The ‘fix it in post’ mindset does not meld well with this process and to pull it off in its best form, the culture of filmmaking needs to return to committing to choices up front in prep,” Idan states.
“The more we can prepare and make decisions early, the better set up we will be to be inspired, react, and pivot on the day of the shoot.”
“VFX is still a huge part of this process but it is there to enhance and build upon choices made long in advance of production. This project pooled together a wonderful team of people committed to making those decisions ahead of time and seeing it come together was inspiring. Going forward, I’d like to double down on the ‘prep matters’ mindset as I find that the more we can prepare and make decisions early, the better set up we will be to be inspired, react, and pivot on the day of the shoot.”
And then there’s the way teams are collaborating in our new reality. “What once would have required all of us to be at JAMM’s office for sessions, then going to the Unreal team and then Will having to fly in to see things—as of 2020 that doesn’t really exist anymore,” Andrew says. “When you are remote and in our new 2023 world, it’s now possible to do a project of this complexity without being in person. It was definitely a very wireless workflow.”
Taking the lessons of the past to push technology forward? You could almost say it’s a little like building a modern, super-performant race car.