We look at the strange case of CGI, which, for an array of reasons, appears to be getting worse. Why is shoddy CGI deemed acceptable in massive-budget movies?
It’s been 30 years since Jurassic Park really kickstarted a major fascination with computer-generated imagery. We’d seen some revolutionary CGI prior to that with films like Terminator 2: Judgment Day and the stunning liquid metal effects of the T-1000. Spielberg’s Jurassic Park also depicted dinosaurs in a way we’d never before seen. The possibilities of what could be designed and animated on a computer seemed limitless.
A decade later, The Matrix Reloaded's Burly Brawl treated audiences to the prospect of fully digital replacements for actors. Did it look entirely convincing? Nope, but we watched with half a mind on just what might be achieved within another decade. Let us not forget the original film's trademark bullet time, a clever blend of CGI and an elaborate rig of still cameras arranged to capture 360-degree coverage.
James Cameron has always been a technological pioneer at the forefront of cinema, and Avatar broke new ground in creating entirely digital worlds, further exemplified by the stellar work in the sequel. Here's the thing, though. Let's go back to the T-1000. There have been four Terminator sequels since, released between 2003 and 2019. That's 12 to 28 years after Cameron's trailblazing first sequel, and yet the second film in the franchise still has by far the most impressive visual effects. It's the same with the CGI in Jurassic Park compared to every single depiction of a dinosaur on screen since. Why?
There are a number of reasons, but a significant one is the less-is-more strategy employed in the early days of companies like ILM, often in unison with the demands of the production companies. This burgeoning technology could amaze, but we were rarely given enough time to study these shots and start seeing the flaws. Spielberg also returned to the approach he'd used on Jaws in 1975: he maintained suggestion and built a sense of anticipation before the moment you finally saw the shark.
In Jurassic Park there's a lot of build-up before we catch a glimpse of a dinosaur, with the carnivorous eating habits of the T-Rex teased before we ever see the big menace. Spielberg rationed out those moments of CGI and loaded the majority of them toward the back end of the picture. Ah, build-up, pacing: you know, those things of a bygone time. The film also brilliantly combined CGI with practical effects. So alongside those fully digital shots of the raptors, the T-Rex and the venom-spitting thing that killed Newman from Seinfeld (Wayne Knight to the layperson), we had superb animatronics to help reinforce the illusion that dinosaurs were walking the Earth.
The best exponents of early CGI used it sparingly and at key moments where an invested audience would be sure to be impressed. Step forward to the past decade and we've seen a massive shift in how CGI is used in films. There are, of course, still good examples, and those moments so stealthy an audience might be none the wiser, but usually these come in films predominantly built on the meat-and-potatoes approach of practical filmmaking. A recent stellar example might be Matt Reeves' grimy, down-and-dirty The Batman, which utilised CG backdrops displayed on screens on set (as opposed to the more typical green screen approach most films use).
The Batman was a film filled with texture: elements like fire and water and evocative noir lighting. It enveloped us in a grimy world that felt physical. The antithesis of that is many of the recent Star Wars films and TV shows, which rely too heavily on green screen and an overabundance of CGI and never really let us feel like we're in a physical place. CGI is now more often a point of ire than something that impresses. The Flash, alongside all its inherent controversy, was obliterated by viewers for its terrible CGI.
Studios now seem to operate on a more-is-more mentality. A CGI double of whichever comic book hero we’re watching will be performing rubber-limbed, slightly odd manoeuvres as they bound through the air not looking remotely like a physical entity. Often they’ll do this against a CGI backdrop that feels flat, or with a swirling ‘camera’ that’s more vertigo-inducing than engrossing. The sprites on the screen are doing too much. The camera is doing too much. The frame has been filled with too much and because of what we’ll discuss shortly, it all looks awful.
Maybe younger audiences more accustomed to this don't care, but I suspect the collective shock at Top Gun: Maverick's dazzling aerial feats suggests that newer audiences were pleasantly thrilled by how real those sequences felt. Guess what, kids: this is how films were made when I grew up. Indiana Jones having a train-top chase, or jumping from a horse to a tank, or outrunning a large boulder, or leaping a chasm, or…whatever…these were stunts captured in camera on locations or sound stages. Yes, it's safer to shoot a (moving) train-top chase on a green screen or just create an entire shot digitally, but it looks abysmal.
A lot has been said of Indiana Jones and the Dial of Destiny. Some people have labelled it one of the most expensive films of all time. Some have also been impressed by the de-ageing effects that showed Harrison Ford as he was in his 40s. The opening of the film, set just prior to the end of WWII, is an overlong sequence that never lets us feel like we're in the time and place.
Now, when Indiana Jones is on screen as the 40-year-old version, he looks okay in parts. If Indy stays very still and says nothing, it looks good. It's also clear that, of all the CGI throughout the movie, this is where the most time and care was dedicated. However, have him move, have his eyes shift or have him speak, and the illusion shatters; the result kind of haunts my nightmares. Yet this is the highlight, as what follows is a collection of horrible CGI which is often badly composited. It makes the film look cheap, and yet this is apparently one of the most expensive blockbusters ever made.
In The Last Crusade, the rollicking opening action sequence takes place on a train as young Indiana (played brilliantly by River Phoenix) is chased by bandits. It was filmed on a real train, with the actors and stuntmen on top of the moving carriages. It's exciting, comical and has a great payoff. Not so the train sequence in Dial of Destiny, which features shots of a rubbery and oddly short-looking CGI Indy bouncing awkwardly across a train top at night, with way too many atmospheric flourishes added to give the whole thing the dreaded modern motion blur you get from 90% of digitally shot big-budget blockbusters. It's a culmination of the digital photography, the CGI animation and elements, and the grading, and it shows how awkwardly these things can mesh together.
Yes, there's just way too much going on. We've also seen an increase in troubled productions, delays (COVID-related in some cases) and a host of CGI artists and companies complaining about being overworked (and underpaid). Disney, as one of the major monopolies of cinema, has harvested a large number of artists and spread them thinly across their mass of disposable content. Does Disney care if She-Hulk looks half-finished? Do they care if, even after delays and extra time in post, Indiana Jones' final adventure feels slipshod? As the recent strikes suggest, with much of their ire pointed firmly at Disney, of course they don't. Audiences only bemoan the poor quality in passing, perhaps accepting that this piece of content is going to be replaced by the next equally disposable piece. Does it matter if CGI peaked 10 years ago and is on a downward slope? Is the next fascination AI? Have I used up my rhetorical question quota?
Maybe it's just me who misses the old-fashioned approach to making spectacle-driven movies, where stuntmen were captured doing impressive things in shots that had a simple clarity to them. I'll say that I enjoyed watching Ford in his final fling as Indy, despite the technical flaws. What I really missed, though, was the rugged, dirty feeling of actually being in the locations where the movie takes place. In Raiders you felt the dust and sand and could see Ford physically present in brilliant locations and sets. Action scenes didn't overstay their welcome either, as is the case with so many modern blockbusters and their seemingly never-ending set pieces.
The question is: why have a 15-minute action scene with bad CGI that exhausts the viewer, rather than prioritising a five-minute sequence that more effectively blends practical work with CGI? The answer, evidently, is simple: just do what Tom Cruise or Christopher Nolan are doing. In the end, a film might cost $300 million, but bad CGI will make it look cheap, and this is becoming the tentpole norm rather than the exception. One of the films I wrote, Firenado, was shot for about $50,000. Its CGI isn't going to take home any VFX Oscars, but honestly it didn't look too far off some of what Disney deems acceptable on films that spent more than that on catering.
What are your thoughts on CGI in modern blockbusters? Let us know on our social channels @FlickeringMyth…
Tom Jolliffe is an award-winning screenwriter and passionate cinephile. He has a number of films out around the world, including When Darkness Falls, Renegades (Lee Majors and Danny Trejo) and War of The Worlds: The Attack (Vincent Regan), with more coming soon including Cinderella’s Revenge (Natasha Henstridge) and The Baby in the Basket (Maryam d’Abo and Paul Barber). Find more info at the best personal site you’ll ever see here.