The human eye is an incredible thing. In complete darkness, it can detect a single photon. Unfortunately, the last several decades of film are a sad series of underestimations of this ability. The overconfidence of filmmakers in new gadgets and tricks has led to a pox marking the face of cinema. I’m talking, of course, about computer-generated imagery (CGI). There has been, one must admit, decline.
Perhaps it’s just another example in an overall trend. Recently there’s been a lot of talk about modern architecture being pretty terrible, e.g., this analysis by Scott Alexander. Apparently there’s even a conspiracy theory, Tartaria, speculating that the beautiful cathedrals and art deco skyscrapers of the past were built by a more advanced civilization and the “elites” have covered up the evidence. Even if the theory is mostly a joke, it’s funnily effective precisely because it expresses a grain of truth. People do often prefer more classically-styled buildings, like the cathedrals of Europe, over today’s concrete and glass towers. Scott Alexander asks:
I think there’s a genuine mystery to be explained here: if people prefer traditional architecture by a large margin, how come we’ve stopped producing it?
Let me offer a comparable conspiracy theory. If you go back to films made before roughly 2000, give or take a decade depending on the title and genre, almost all of them look so much realer. It’s as if the camera is recording things actually happening in the world, and you’re watching those things happen through the portal of your TV. Older movies have a tangible quality that today’s lack. So my conspiracy theory is that, while we clearly once had the technology to make incredible practical effects and shoot on real locations, a shadowy cabal of directors and producers has abandoned those techniques in favor of image post-processing and CGI and green screens, all in order to cut down the cost and effort of making, you know, actual movies.
So a conspiracy theory about lost tastes and preferences, not civilizations. And perhaps not really a theory. More like a sad fact. For the grain of cinematic digital unreality, especially post-2000s, is a curse upon our culture, a perfect example of how a frog can boil without noticing it.
The problem is easiest to catch in the sci-fi and fantasy genres, where it is tempting to make use of the full horrors of graphics and post-production processing. It’s undeniable that remakes, from An American Werewolf in London to The Thing to A Nightmare on Elm Street, have vastly inferior CGI effects compared to the original practical ones. Indeed, this over-reliance on CGI has ruined some of the best genre series. Consider Peter Jackson’s decline from The Lord of the Rings to The Hobbit trilogy. Or The Matrix’s sequels, which eschewed the original’s acrobatic wire-work and practical fight scenes for the digital realm (a cosmic irony if there ever was one). The original Star Wars trilogy comes to us as dispatches from a gritty world—it is a tighter world, yes, the shots are necessarily smaller, more closely framed, but it all feels more viscerally real. Ever since the release of the Star Wars prequels, beginning in 1999, the actors stand apart from digital ghosts they can’t see.
And it’s not like filmmakers use CGI solely for what was impossible to do in the past. In The Force Awakens shot above, is that pipe on the upper left even real?
Consider Ridley Scott’s 1985 Legend, wherein Tim Curry’s onscreen presence as a horned devil is heavy, weighty—despite the fantastical role, his clopping around in hooves comes across as ontologically solid. Your eyes are not deceived. And moreover, under the slathered makeup and prosthetics, he is still acting—when his lip curls, that’s a real lip. Compare that to Steppenwolf, from the 2017 Justice League. After more than three decades of “progress” in cinema, he comes across as digital dust, a void where physicality should be. His superstrength is unthreatening, because deep in our occipital lobes we know Steppenwolf weighs nothing, impacts nothing. Therefore the entire movie has no stakes. Of course the bad guy will be defeated; he’s a hologram.
The incredible lightness of modern cinema worms its way onto screen even when filmmakers do their best to film in a real location, not just in front of a green screen. Consider director Denis Villeneuve’s 2021 Dune, which was released to critical and public acclaim. It was lauded for its special effects and the supposed realness of its world, and Villeneuve admirably went to actual deserts, like those of the United Arab Emirates, to film a number of scenes.
Yet despite his best efforts, I cannot help but notice that digital post-processing seems to have ruined Villeneuve’s more realistic vision. Lawrence of Arabia from 1962 looks so much more real than Dune, even when just comparing shots of people in Bedouin costumes hanging out in a desert.
Notice the color differences. Every movie now looks as if it takes place inside The Matrix, graded into deep greens or blues or gray shades, as if the complex colors of the world had been bleached out. And all those wrong little details. For instance, the shadows both characters cast to their right. In the 2021 Dune, the shadow is a monochrome chunk, despite the billowing fabric around the character. In the 1962 Lawrence of Arabia, the shadow of the fabric is actually see-through. It has dimensionality, thinness and thickness. Because it’s a real shadow. So despite both scenes being filmed in similar locations, one unconsciously triggers more belief in the ontology of the fictional world—the directionality of the light, the blue above the dune—while the other comes across as an ersatz reality.
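To make the “bleaching” concrete: a modern digital grade is, at bottom, a per-pixel transform applied in post. Below is a minimal toy sketch of my own devising (not any real studio pipeline) showing the kind of operation that produces the look, pulling every color toward gray and then tinting shadows teal and highlights orange:

```python
import numpy as np

def toy_grade(frame: np.ndarray, strength: float = 0.6) -> np.ndarray:
    """Crude 'teal and orange' grade on an RGB frame with values in [0, 1].

    Illustrative only: desaturates every pixel, then tints shadows
    teal and highlights orange -- the 'bleached' look described above.
    """
    # Per-pixel brightness (Rec. 601 luma weights).
    luma = frame @ np.array([0.299, 0.587, 0.114])
    # Step 1: mix each color toward its own gray value (the bleaching).
    graded = (1 - strength) * frame + strength * luma[..., None]
    # Step 2: push dark pixels toward teal, bright ones toward orange.
    teal = np.array([-0.05, 0.04, 0.06])
    orange = np.array([0.08, 0.03, -0.06])
    graded += (1 - luma)[..., None] * teal + luma[..., None] * orange
    return np.clip(graded, 0.0, 1.0)
```

Applied to every frame, a transform like this collapses the thousand incidental colors a real location gives you for free into roughly two.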
Let me put it like this: there is a sensation universal to all humans. It is that feeling you get when you’re standing outside somewhere and look up to see trees blowing in the wind, or stumble across a stream running over rocks. As you listen to that gentle creaking, or that low-tone burbling, what you feel at that moment is the deep satisfaction that comes from witnessing Being, in all its irreducible and incomputable eddies and swirls. Your eyes are satiated. There’s probably a German word for it, and if not, there should be. But whatever that feeling is, modern films are as distant as can be from it. For after partaking in the empty calories of modern cinema, your eyes are still hungry.
So why? Why don’t movies look like reality anymore? Is my conspiracy theory of a sinister cabal of cost-cutters true? Not quite. Because the new digital cinema doesn’t save any money. In fact, it’s more expensive. Peter Jackson is a perfect case study: how could the same filmmaker go from making a movie as good as The Fellowship of the Ring in 2001 to a movie as bad as The Hobbit in 2012? It’s not budgetary. The Fellowship cost $93 million to make, while The Hobbit cost $180 million. And this seems to hold true across the board: Hollywood budgets are bigger than ever.[1] Even accounting for inflation, Jurassic World in 2015 cost about 25% more to produce than Jurassic Park in 1993, and all the dinosaurs in the original feel ten times as tactile and threatening.
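If you want to check this kind of claim yourself, the constant-dollar comparison is one line of arithmetic. A sketch with deliberately made-up inputs; swap in reported budgets and real CPI figures for the years in question:

```python
def real_increase(old_budget: float, new_budget: float,
                  cpi_old: float, cpi_new: float) -> float:
    """Fractional increase of new_budget over old_budget in constant dollars.

    cpi_old / cpi_new: a price index (e.g., US CPI-U annual averages)
    for the two release years.
    """
    adjusted_old = old_budget * (cpi_new / cpi_old)
    return new_budget / adjusted_old - 1

# Hypothetical inputs, not verified figures: a $100M film versus a
# $215M film across a period in which the price index rose 72%.
print(f"{real_increase(100e6, 215e6, 100.0, 172.0):.0%}")  # -> 25%
```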
So I’ll offer a different answer than a capitalist cabal: laziness. Directors and producers are both squarely to blame, for they have become lazy in two respects. First, they ask “what shots do I want?” rather than “what shots can I do?” and build their cinematic universe around the answer. They brute-force it. Yet constraints breed creativity, as well as believability. Then there is the second, more fundamental laziness: it’s just so much extra work to actually make costumes and props and sets and get trailers for actors and run an entire production. It’s simply harder. Yes, it costs just as much, or even more, to film everything in front of a green screen and let post-production sort it out, but it sure is easier. It’s armchair filmmaking.
People keep waiting for CGI to get better. And assuredly, it is somewhat better than when it was first introduced. Perhaps the only way out is through. But as the capabilities of CGI have grown, such that entire movies are now shot in front of green screens, so too has the problem of directorial reach exceeding grasp. Once just a replacement for props and animatronics, CGI has now swallowed up all of cinema, all backgrounds as well—our skies are not our skies anymore. And even if in 2040 CGI is so good as to be indistinguishable from reality (I doubt it), we will still have lost a half century of films to bad CGI when it needn’t have been this way. What’s even crazier is that professional film critics don’t seem to care at all—the concern is routinely dismissed as another grumpy fallacy of “way back in the days of yore,” without taking stock of how absurd the situation has become (when the real fallacy is assuming that all change is good simply because it’s new).
Trends in cinematography are like fashion or architecture: almost impossible to see while you’re in them. We are always blind to our own age. Yet our era is dated very easily by the incredible unreality of its films, the fantasy of the whole thing, its emphasis on post-production and unreal colors and textures. It’s all just one long high-budget computer game cutscene. Your brain knows the difference, unconsciously, implicitly. And while we may have become accustomed to the digital slop we’re fed, the eyes of the future will pick it out, photon by photon, and judge.
[1] Perhaps CGI is actually cost-cutting, if you factor in that the same bureaucratic bloat that has affected academia has likely affected movie productions as well.
I recently rewatched The Thing. Its practical effects are, of course, astounding, but they do show that even practical effects have their limits. Yes, it feels like the actors are really there, but it sometimes has the B-movie feel of actors really being there, gathered around an unconvincing prop. Like with CGI, the effect is more convincing the less clearly we see it: the puppets and animatronics scurrying around in the dark and/or on fire are wonderful, but the autopsied Thing, just lying about on the table, is clearly a sculpture. But I share your preference: better the feeling that the characters are really actors on a stage manipulating props than the even worse disenchantment that the actors are wandering around an empty soundstage.
As an additional bit of speculation, I wonder if part of the explanation has to do with scheduling. Hollywood is wedded to the blockbuster model: big budgets need big-name actors, and big-name actors have busy schedules and are very expensive. So it’s simply more practical to work around their schedules by driving them out to a soundstage than to make more demands on their time by flying them out on location and having them potentially wait around for the practical effects to be assembled and placed.
It's true. Another thing is that mattes used to be painted in oils, and I think that medium is still the absolute high-water mark in terms of capturing reality across sci-fi and fantasy. The people painting the '70s Star Wars backgrounds just understood light and could replicate it better than modern CG artists.