LONDON (Reuters) – How do you make a computer-generated lion, warthog and hyena look real while they sing and talk?
That was the challenge a visual effects team of more than 1,000 people faced when working on Disney’s “The Lion King”, a remake of the 1994 animation, aiming to bring much-loved characters such as Mufasa and Simba to life against a stunning but entirely computer-generated African savannah backdrop.
Last year’s second-highest-grossing film, with global cinema takings of $1.6 billion according to tracking firm Box Office Mojo, “The Lion King” received a widely expected visual effects Oscar nomination on Monday for its “photo real” digital imagery that makes it look like a wildlife documentary.
A mammoth team worked on the Jon Favreau-directed movie, which was produced with computer animation, virtual reality, gaming technology and live-action methods. Visual effects company MPC, owned by technology and entertainment company Technicolor and with studios around the globe, was tasked with making the tale of lion cub Simba look as if it had actually been filmed with real animals in Africa.
“One of the biggest things you’re responsible for is breathing life into these characters … They have to behave real and they have to look real,” Adam Valdez, MPC visual effects supervisor who spent two and a half years working on “The Lion King”, told Reuters in London.
“And if the two things are ever out of whack with each other, it breaks the movie.”
Valdez, who won an Oscar for Favreau’s Disney remake of “The Jungle Book”, traveled to Kenya for research.
“We do a lot of painstaking research into how real animals move, how their muscles and skin behave … and then in the computer, we recreate all these things,” he said, adding the MPC team first created designs of the characters and landscape.
“An artist, like an animator, has to sit down and actually hand animate that eye, hand animate that face so that every little subtle nuance is represented.”
Work on the movie took place in London, Bangalore and Los Angeles, where a headset-wearing Favreau and his team worked on a virtual reality set, using gaming technology to direct the scenes.
“The game … that we wrote was about people walking around the savannah of Africa,” Valdez said.
“Instead of holding a game weapon they’re holding a camera, and they’re able to point the camera at the things they want to film and we’re recording all this information in the computer.”
Valdez said each shot was then carefully recreated to make it look as realistic as possible.
For scenes with dialogue, the team used recordings of the cast, which included Beyonce and Donald Glover, voicing their roles.
“We were very subtle with it … Every tiny adjustment to say just how much eyes open or close, how much eye whites you see on the inside or outside of the irises, they all have an emotional meaning for us as humans,” Valdez said.
“But if … you push it, you snap the connection of the photo realism of the image that you’re seeing with behavior that is of a different sort … It was a very tricky thing.”
Reporting by Sarah Mills; Writing by Marie-Louise Gumuchian; Editing by Diane Craft