
Jay Stobie, writing at ILM.com, talks to ILM Visual Effects Supervisor Dave Zaretti about the company's contributions to The Lost Bus, the riveting film inspired by a compelling real-life story.
Inspired by the events of the 2018 Camp Fire in Paradise, California, and based on Lizzie Johnson’s 2021 book “Paradise: One Town’s Struggle to Survive an American Wildfire,” The Lost Bus (2025) follows Kevin McKay (Matthew McConaughey) and Mary Ludwig (America Ferrera) as they attempt to shepherd 22 children to safety aboard a school bus during a chaotic evacuation. Directed by Paul Greengrass (The Bourne Ultimatum [2007], Captain Phillips [2013]), who co-wrote the screenplay with Brad Ingelsby, the film puts the audience in the front seat on a harrowing ride through the awful inferno.
Industrial Light & Magic’s Dave Zaretti (WandaVision [2021], Willow [2022-2023], The Acolyte [2024]) generously took time to sit down with ILM.com and discuss his role as the project’s ILM visual effects supervisor. In a wide-ranging conversation, Zaretti touches on studying real-world references, crafting intricate assets and environments, paying respect to those who endured the tragic events depicted in The Lost Bus, and much more, detailing every level of ILM’s extensive contributions to the captivating film.
Real-world References
As the ILM visual effects supervisor, Zaretti operated out of the London studio and oversaw the company’s work on the project. “I think ILM had the lion’s share of the visual effects work for the second part of the film,” Zaretti tells ILM.com. “I supervised all of the London work, and we had work from ILM’s Mumbai studio that [ILM associate visual effects supervisor] Steve Hardy supervised. My daily role was to keep an eye on the overall vision of the show and check in with Charlie Noble, the client-side visual effects supervisor.”
Noble supplied ILM with what Zaretti describes as a nearly two-hour-long “megaclip” of reference material, which helped ILM ground their visual effects shots in realism. “I’ve never had so much reference on a show,” Zaretti emphasizes, “because this event occurred in 2018, and everyone had phones and cameras. It was a very filmed event. A pickup unit also went to Paradise and filmed loads of additional footage of the actual place itself. In terms of the environment and road layouts, we spent a long time on Google Maps ensuring we got all of the turns in the road correctly.”
ILM broke up the reference megaclip into separate asset types. “We had shots within the smoke cloud, shots outside of the smoke cloud, big fires, small fires, brush fires, trees burning, houses burning, and cars burning,” Zaretti recalls. “There were fire tornadoes; I didn’t even know that was a thing! We were spoiled with references to the point where the references began to contradict each other because there’d be wind blowing in two different directions in the same piece of footage. Those contradictions actually helped us in certain scenarios. For example, during the escape sequence when the bus is winding down this thin road, there are points where we had flames licking at it from both directions to portray the danger that they were in. So, having a real reference to back up your visual effects often helps.”
Assembling Assets
The abundant reference footage proved incredibly useful, although its varied nature also meant ILM needed to build a wealth of digital assets to fully represent the range of material. “We had about seven types of fire assets,” Zaretti says of the meticulous process. “You’d think fire would count as one thing, but no, we had small, medium, and large fires. We had traveling fire that was needed to show the spread of fire through the grass. There were several types of smoke because vegetation has a lighter smoke, while houses and cars have a darker, roiling, inky smoke.” Surprisingly, the minute embers seen amidst the chaos are burning or smouldering pine cones, reflecting the heavy debris encountered throughout the 2018 fire.
“We built a CG bus, as well as the digital cars we used on the multilane Skyway. We added to the cars they already had to create a sense of everything being jammed together, with people nudging and jostling.” Though a practical bulldozer was captured on location, the ILM team lobbied Paul Greengrass to supplement it with a fully digital version, which Zaretti notes “looked fantastic. Our team at ILM’s Sydney studio built that for us.”
ILM constructed large environments for the project, such as Roe Road and the Skyway. “We had to build all of that. There were other environments here and there, too. We had to augment Pearson Road’s falling power cables. Paul [Greengrass] did some practically, but we added more to get a chain reaction of the cables coming down and whipping around. We did nine or ten species of trees. We had variants of each species, and for each variant, we had different strength winds going through several levels of fire. We must’ve had hundreds of tree variants in total, as well as bushes and grass. One of the biggest technical challenges was the scale of the event,” Zaretti advises.
“When it came time to design the shots, we already had these component parts and could drop them in. Then, when we needed to do a hero simulation for something in the foreground, we could simulate that,” Zaretti continues. “Our effects team was amazing, and Billy A. Copley was our effects lead. [ILM digital artist] Matthieu Chardonnet devised a really cool fire setup where, once we had a look that we liked, we could essentially paint where we’d want the fire to be on the trees. Within a few hours, we’d have a first pass of how it’d look. Normally, doing that work would’ve required a couple of days turnaround.”
Dynamic Details
Zaretti and the ILM team considered their visual effects work as another tool the director could utilize to tell this immensely important story. “Despite the fact that some of the shots are perhaps 90% visual effects, they were always supporting what had been filmed. We viewed the fire, smoke, and embers as a character,” Zaretti explains, noting that Paul Greengrass felt the ember cam shot was the film’s “Jaws moment.” “The fire is coming for you, and there’s this impending doom. It is the protagonist in the film – it’s the baddie.”
The scene in question finds the camera weaving through the forest from the fire’s perspective, as the smoke and embers close in around the trapped bus. “That was a big operation. The nature of Paul Greengrass’s dynamic camera shots meant that you never focused on anything for too long. Suddenly, you come to this calm, slow, and long piece of track material through the woods. It was entirely digital because the filmmakers wanted to completely art direct where they went. We had an environments build for our forest, which we used for the rest of the show, but we increased the resolution for this particular shot and simulated all of the grass,” Zaretti declares.
Incorporating the original 2018 fire conditions increased the scene’s realism and sparked a somber thought for the ILM artists. “The reason the event was so devastating was because the extreme winds made the fire spread, so we always had to convey a sense of high-directional wind by simulating the grass being blown. The environment looked so good, and we wanted to show it off, but it needed to be dark. We had to bring it down and use the distant firelights to illuminate it. We wanted it to feel realistic, which was enough to make it terrifying. It’s difficult because I kept getting excited about our shots – but then I reminded myself that this actually happened and must have been absolutely horrifying,” Zaretti states.
Read the article in full here.
Images: ILM/Apple
The post Rendering a Rescue: ILM’s Dave Zaretti on the Visual Effects of ‘The Lost Bus’ appeared first on Jedi News.