Explaining CGI: Where reality and illusion blend


If you’ve watched a movie in the last quarter-century or so, you’ve seen computer-generated imagery, or CGI, in action. As technology has evolved, filmmakers have increasingly relied on computers and software to tell more exciting and engaging stories through visual effects (VFX). The most common type of VFX is CGI, enabling filmmakers to create characters, animations, or even entire worlds using only computer software.

Early forms of CGI have been around since the late 1950s, but the technology really started coming into its own in the late 1970s, perhaps most notably with “Star Wars” in 1977. Audiences were blown away by the special effects, and the film’s stratospheric success offered a glimpse of what the technology would become in the following decades.

Today CGI is mainstream, and every one of the 10 highest-grossing movies of all time uses it extensively. CGI has transformed filmmaking, transporting audiences to the edges of their imaginations and beyond with jaw-dropping visuals. However, it’s not without its detractors, who feel too much computer animation gets in the way of telling a story.

In this post, we’ll trace the history and evolution of CGI, its impact on modern filmmaking, and show examples of CGI work – both good and bad.

What is CGI?

CGI is the process by which a piece of visual media is created or altered using computer graphics. It’s a subcategory of visual effects (VFX), which is a broader term for any type of special effects added to a film during post-production. CGI has come to be used in everything from blockbuster movies and TV series to video games and even commercials. 

Simply put, CGI lets filmmakers make the impossible possible. It can be something relatively simple, like creating the effect that a live actor is flying. Or, it can be more complex, like an animated 3D character and the fully computer-rendered world they inhabit. CGI can be hyper-realistic, meant to trick the eye into believing something on the screen is really happening. Or, it can create something outrageous or fantastical completely from scratch, trusting the audience to suspend disbelief.

First usage of CGI

Even though the word “computer” is in the name, CGI actually predates the modern personal computer by decades. Starting in the 1950s, influential films in each succeeding decade pushed the envelope of what CGI was capable of.

The first live-action film to utilize CGI in any form was the 1958 psychological thriller “Vertigo,” directed by Alfred Hitchcock, but only in the opening credits. The film’s title sequence features a series of colorful, computer-rendered 2D swirls in the background, creating a disorienting effect. In 1968, Stanley Kubrick’s “2001: A Space Odyssey” used CGI to display animated wireframe vector graphics on the ship’s computer screens during the movie itself, marking another first for the nascent technology.

The dystopian sci-fi/western “Westworld” from 1973 was the first movie to mix CGI and live action. The film created a “pixel” effect to show the audience the point of view of its android gunslingers using digitally processed images. The movie was written and directed by Michael Crichton, author of “Jurassic Park,” which itself raised the bar on CGI two decades later.

How Star Wars set new standards for CGI and filmmaking

Then in 1977, George Lucas and his groundbreaking sci-fi epic “Star Wars” changed the trajectory of filmmaking and special effects. Most of the film’s stunning visuals were achieved through practical effects: matte painting backdrops, intricate handmade miniature models of ships, and computer-controlled motion photography.

Many top critics of the day were unimpressed with the story of Star Wars, but singled out Lucas’ attention to detail in creating VFX magic. Time magazine wrote in its somewhat backhanded review that the film was “a remarkable confection: a subliminal history of the movies, wrapped in a riveting tale of suspense and adventure, ornamented with some of the most ingenious special effects ever contrived for film.”

Achieving Lucas’ vision for Star Wars required special effects techniques that hadn’t been invented yet. In order to bring his ideas to life, he founded Industrial Light & Magic (ILM), a division of his production company Lucasfilm. ILM would go on to be at the vanguard of CGI for decades to come. After the blockbuster success of Star Wars, Lucas focused more of his and ILM’s efforts on computer graphics. In the years that followed, ILM created special effects for more than 300 films, according to Wired, including the Star Wars, Indiana Jones, and Jurassic Park franchises.

The first CGI character

At the dawn of the third millennium, the world was introduced to a hapless Gungan named Jar Jar Binks, and CGI would never be the same. Jar Jar was one of the main characters in 1999’s “Star Wars: Episode I – The Phantom Menace,” the long-anticipated prequel to the original Star Wars movies. 

Portrayed by actor Ahmed Best using advanced motion capture technology pioneered by Lucas and ILM, Jar Jar was the first main character in a movie to be entirely computer generated. But in 1999, the technology wasn’t quite there yet to achieve Lucas’ lofty SFX aspirations. Jar Jar represented a generational leap forward for CGI, but the execution made it difficult for audiences to connect with him. Pioneering status aside, he’s considered one of the most divisive characters in Star Wars history.

CGI’s impact on filmmaking

Love it or hate it, CGI has had a significant and lasting impact on filmmaking. It has become a key tool for directors, both for creating unique characters and for delivering richer, more engaging experiences for audiences. As it’s matured, CGI has shaken off most of its awkward Jar Jar Binks phase. Gollum from “Lord of the Rings,” Caesar from “Planet of the Apes,” and Thanos from the “Avengers” series are all widely beloved, 100% CGI characters. 

But movie fans are often divided on whether the proliferation of CGI has been a net-positive for the industry. Some believe it’s overused, or that it’s become a crutch to manipulate audiences with eye-pleasing visuals while telling flimsy, uninspired stories. Others feel CGI has ushered in the most exciting era for the movies since the advent of the blockbuster.

Both camps have valid reasoning behind their feelings, so it’s hard to say who’s right or wrong. Technological advances, just like movies themselves, tend to provoke strong opinions on both sides.

Why is CGI good for movies?

Based on box office numbers alone it’s obvious that CGI has legions of devoted fans who think it’s been a positive thing for movies. 

Cinematic-caliber computer effects are expensive to create, so CGI is often used to give audiences a visually thrilling spectacle of some kind. But it’s also seen (or not seen) in subtle enhancements, like de-aging Robert De Niro in 2019’s decades-spanning mob epic “The Irishman.”

Practical effects, such as fire or an exploding car, are difficult and time-consuming to reproduce over multiple takes. With CGI, editors can make tweaks and changes in post-production so the effect looks perfect in the finished product. 

In addition to enhancing practical effects, CGI also serves a wide range of support functions. With it, animators can alter backgrounds or simulate environmental conditions like rainstorms or snow. It also gives filmmakers finer control over details like textures and lighting, drawing viewers deeper into a scene.
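Background replacement and weather effects of this kind come down to compositing: blending a rendered element into live-action footage pixel by pixel. A minimal sketch in plain Python (not tied to any particular compositing package; the rain-streak numbers are illustrative) of the standard Porter-Duff “over” operator that underlies this blending:

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Porter-Duff 'over': blend a foreground pixel with coverage
    fg_alpha (0.0-1.0) on top of an opaque background pixel."""
    return tuple(f * fg_alpha + b * (1.0 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A half-transparent grey rain streak over a blue-sky background pixel:
pixel = over((0.7, 0.7, 0.7), 0.5, (0.2, 0.4, 0.9))
```

Stacking many such layers per frame (rain, fog, the CGI element itself, color grading) is what compositing software automates at scale.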

CGI also functions as an enormous creative sandbox for filmmakers to play around in, enabling them to tell stories in completely different ways. Big movie studios are locked in a permanent game of one-upmanship to show audiences something they’ve never seen before. That competition doesn’t come cheap, which often gives directors the budget and freedom to push the limits of film as a medium.

Why is CGI bad for movies?

For all of the praise CGI fans heap onto special effects-laden movies, a vocal minority of detractors feel filmmakers’ focus on effects in recent decades has done the medium more harm than good.

Real-life physics is difficult for computers to recreate convincingly. Our brains might allow us to believe a person can jump hundreds of feet in the air, but if the landing doesn’t look real, it can spoil our ability to suspend disbelief. It only takes a slight deviation from realism to break the illusion CGI helps create.
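The timing problem is concrete. Free fall obeys t = sqrt(2h/g), and viewers have a gut feel for it even if they’ve never seen the formula. A minimal sketch in plain Python (the numbers are illustrative, not taken from any film) of why a rushed landing reads as fake:

```python
import math

def fall_time(height_m, g=9.81):
    """Seconds for an object to free-fall height_m under gravity g."""
    return math.sqrt(2.0 * height_m / g)

def implied_gravity(height_m, seconds):
    """Gravity an animated fall implies if it covers height_m in the
    given screen time; compare against Earth's 9.81 m/s^2."""
    return 2.0 * height_m / seconds ** 2

real = fall_time(30.0)               # a ~30 m (100 ft) drop: about 2.5 s
rushed = implied_gravity(30.0, 1.0)  # the same drop animated in 1 s
                                     # implies 60 m/s^2 of gravity
```

Animate that landing even half a second too fast and the implied gravity is several times Earth’s, which is exactly the kind of mismatch audiences perceive without being able to name it.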

Excessive or poorly executed CGI can also take audiences out of the moment. Sometimes less is more, and that’s especially true in filmmaking. Throwing too many effects at an audience tends to overwhelm people, which lowers their overall impact. A commonly cited example of excessive CGI is in the Transformers movie series, helmed by the notoriously immoderate Michael Bay. 

CGI effects tend to look very much of their own era, and as a result often don’t age well. CGI-heavy movies from the early 2000s like “King Kong” (2005) and the infamous “Catwoman” (2004) have some CGI sequences that look bafflingly bad to modern eyes. A trend-bucking example is 1999’s “The Matrix,” which pioneered the famous “bullet time” CGI effect. Not only did the effect take over Hollywood action movies for the next 10-15 years, it still looks cool as hell today.

As we previously mentioned, CGI can also be extremely expensive. Budgets for special effects-heavy movies in the modern era can go well into the $200-$250 million range. This sky-high pricing typically confines its use to creators or studios with sizable budgets. The CGI “arms race” between studios has also had the effect of ballooning movie budgets to levels some fear are unsustainable.

Iconic movies that used CGI

Over the last 30 years, movie audiences have been treated to an incredible assortment of memorable stories, many of them punctuated by the creativity and artistry of CGI. In that time we’ve been scared out of our minds, whisked off to lands beyond the stars, and brought along on awe-inspiring adventures.

Terminator 2: Judgment Day (1991)

Directed by James Cameron, who would later make CGI bar-raising movies like “Titanic” and “Avatar,” T2 owned the summer of 1991 with a true special effects extravaganza. The movie’s T-1000 android villain was made of liquid metal that allowed it to shapeshift, or even squeeze through metal bars. The film won the 1992 Academy Award for Best Sound Effects Editing, and legendary SFX artist Stan Winston took home the Oscar for Best Visual Effects.

Jurassic Park (1993)

“Jurassic Park,” directed by Steven Spielberg, wowed audiences with its photorealistic CGI dinosaurs, seen both at a distance and running alongside the main characters in herds. The film immediately became the new gold standard for CGI, picking up three Oscars for visual effects and sound design, a BAFTA for Best Special Effects, and more than a dozen other awards. 

The Matrix (1999)

In “The Matrix,” directors the Wachowskis combined CGI with chroma key and wirework in innovative ways audiences had never seen before. Characters froze in mid-air, leapt off walls, and dodged bullets in slow motion in a now-famous CGI effect known as bullet time. The movie swept the 72nd Academy Awards, winning all four Oscars for which it was nominated, including Best Visual Effects. 

Avatar (2009)

The release of “Avatar” was highly anticipated specifically because of its bleeding-edge CGI effects. The film immersed audiences in Pandora, a lush, colorful world filled with bioluminescent plants and stunning alien life forms. The movie was approximately 60% CGI, with the remaining 40% live-action sequences. It took in nearly $3 billion at the box office globally and remains the most financially successful film of all time.

Life of Pi (2012)

A relatively simple film compared to other CGI groundbreakers, “Life of Pi” tells the story of a young man and an adult Bengal tiger stranded together in a lifeboat at sea. The tiger, named Richard Parker, was entirely rendered in photorealistic CGI, which helped balloon the cost of production to $120 million. Like many of its CGI-powerhouse predecessors, the film took home the Academy Award for Best Visual Effects.

Guardians of the Galaxy (2014)

The fan-favorite action-comedy space romp “Guardians of the Galaxy” kicked off a successful franchise for Disney’s Marvel. It also put even Avatar to shame when it comes to CGI. Ninety percent of the movie used CGI in some capacity, across a total of 2,750 separate shots. “Jurassic Park,” by comparison, used just 63.

Avengers: Endgame (2019)

Marvel Studios outdid itself once again with 2019’s “Avengers: Endgame,” one of the most expensive movies ever made, clocking in at an eye-watering $356 million. A big part of that production budget went to over 2,500 CGI shots (out of a total of roughly 2,700). Supervillain Thanos, despite being entirely rendered in CGI, managed to be evil and terrifying while also looking lifelike.

How is CGI done?

CGI is created using a variety of specialized software purpose-built for creating things like 3D models, lighting effects, and textures. Visual artists create those elements using animation or compositing tools, then combine them in a realistic fashion to achieve the desired effect. Because a lot of CGI aims to mimic real life, doing CGI well requires an in-depth understanding of everything from physics to light.
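As a tiny illustration of the “physics of light” part, here is a minimal sketch in plain Python (not any production renderer) of Lambertian diffuse shading, the textbook model for how bright a lit surface point should appear:

```python
def normalize(v):
    """Scale a 3D vector to unit length."""
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert_shade(normal, light_dir, base_color):
    """Lambertian diffuse shading: brightness falls off with the cosine
    of the angle between the surface normal and the light direction."""
    n = normalize(normal)
    l = normalize(light_dir)
    cos_angle = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(c * cos_angle for c in base_color)

# Light hitting a red surface head-on gives full brightness:
head_on = lambert_shade((0, 0, 1), (0, 0, 1), (0.8, 0.2, 0.2))
# Grazing light, 90 degrees off the normal, gives black:
grazing = lambert_shade((0, 0, 1), (1, 0, 0), (0.8, 0.2, 0.2))
```

Production renderers layer specular highlights, shadows, and global illumination on top, but calculations like this, run per pixel, are the foundation.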

Software used for CGI

Software is at the heart of CGI, but the right program for you depends on how you intend to use it. There are a plethora of options, ranging in price from free to more than $10,000. A few of the most popular CGI programs include: 

After Effects

After Effects is Adobe’s enduring digital visual effects, compositing and motion graphics application, which is often used for creating CGI effects in movies, TV, and video games. Like many Adobe products, After Effects is a powerful, feature-rich piece of software which requires some practice. Most beginners can explore and fumble their way through, but it takes time to become proficient.

Autodesk Maya

Usually shortened to just Maya, this industry-standard software lets you create incredible effects, complex characters, and realistic environments. Maya is on the higher end of the CGI software spectrum, geared more towards advanced users. It’s best if you have at least some background in 3D modeling before diving in.

Blender

Blender is a free, open-source suite of 3D CGI tools that offers advanced features and is well-suited to both beginners and seasoned professionals. Its interface is on the complex side and may take some getting used to, but there are ample resources available to help you get started.

Cinema 4D

Professional caliber 3D modeling, simulation, animation and rendering software, Cinema 4D is particularly great at motion graphics and is used by many visual media professionals. Compared to other CGI software programs, Cinema 4D is frequently praised for its gentle learning curve and high user-friendliness. It will still take some practice, but it’s probably the most beginner-friendly tool on our list.

Houdini

Offering a robust CGI toolset for both beginners and experts, Houdini is best known for its dynamic simulations and powerful procedural workflows. It’s widely used in the VFX industry. Some say Houdini has a steep learning curve, but it has a helpful community around it and there are a lot of self-guided learning materials and tutorials out there.

CGI’s undeniable mark on the film industry

Ultimately, there’s a wide range of opinions on the use of CGI in movies today. Some think it’s the greatest thing since the talkie, while others focus only on the less-than-stellar examples of the technology, such as Jar Jar Binks and the infamous “Air Force One” plane crash scene. But whether you’re for it or against it, it’s undeniable that CGI has made its mark on filmmaking.

In roughly 70 years, CGI’s capabilities have grown from a barely noticeable animation in a credit sequence to fully fleshed-out, immersive worlds created from thin air. Every genre of film, across all age groups, has gotten the CGI treatment, and a healthy percentage of those movies have earned wildly successful box office returns.

Sometimes telling a good story takes the right blend of the real and the fantastical. In the right hands, CGI can help strike that balance, keeping audiences on the edge of their seat without distracting or overwhelming them.