Are Video Games Art?

A still from Ninja Gaiden: The Dark Sword of Chaos, a pioneer in the field of video game cutscenes.

The late Roger Ebert set off a major controversy a few years back when he opined that video games were not art. Ebert’s argument is multifaceted and intellectual – he makes a well-deserved attack on Kellee Santiago’s pretentious TED talk, which presents the Lascaux cave paintings as mere precursors to art made in subsequent millennia rather than as unique masterpieces in their own right. The payload:

“One obvious difference between art and games is that you can win a game. It has rules, points, objectives, and an outcome. Santiago might cite an immersive game without points or rules, but I would say then it ceases to be a game and becomes a representation of a story, a novel, a play, dance, a film. Those are things you cannot win; you can only experience them.”

Cutscenes and visual novels
Granted, video games have gradually moved toward being experiences, starting perhaps as early as the oddly bloody (for strait-laced Nintendo) cutscenes of 1989’s Ninja Gaiden: The Dark Sword of Chaos. Lengthy animations, mini movies, tons of dialogue – technological advances have made it possible for video games to become, at least on the surface, more like accepted art such as cinema. And this mainstream progression, from Ninja Gaiden to Metal Gear Solid to Crysis, barely even touches upon genres such as the visual novel, which has flourished in Japan and on the (3)DS platforms, or FMV games like Gabriel Knight: The Beast Within from the early CD era, when developers did everything they could to take advantage of the newly expansive medium.

These games place far less emphasis on “winning” than on simply seeing the story unfold. The final case of Ace Attorney: Justice for All contains one of the best-scripted, heart-wringing climaxes of any story, video game or not. It makes you loathe the villain and experience the relationships between the characters, and in this sense I think it’s closer to “art” than any realistic pseudo-movie, since it makes the player almost secondary to the proceedings. You are, in a way, just some guy moving things along, as if you were clicking through someone else’s elaborate PowerPoint.

The problem of technological obsolescence
Ebert’s argument could have been even stronger, however. Video games have a unique weakness: they are dependent upon certain types of proprietary hardware.

How much will the first few Assassin’s Creed games matter in 20 years, when no one has access to the original consoles that can play them? Even with emulation, the experience isn’t the same. I wrote in my previous entry about playing Battletoads on an NES emulator. While doing so was a basic necessity – neither I nor anyone I knew at the time had both the game cartridge and a working NES console – it changed how I played the game on a fundamental level. I could use save states that, by merely existing, softened the original’s unforgiving difficulty. I had to use different input methods (mainly a keyboard).

Tons of classic NES and SNES games are nowadays playable mostly via emulation. Imagine if you could only watch The Thief of Baghdad or The Birth of a Nation by “emulating” (or actually using!) an early-20th-century projector and screen. Of course, that isn’t the case – you can watch either one on any device that has Netflix on it. Similarly, imagine if the works of Shakespeare could only be read on 17th-century folio paper and were essentially illegible in anything printed after that time. Such a reality would be absurd, but it’s basically the issue that plagues video games: their greatness, with precious few exceptions, isn’t transferable across eras.

No one will care about Call of Duty – or even be able to play it, without a techno-geek’s setup – in 50 years. Angry Birds will likely become a relic of an era of limited smartphone hardware, no more remembered than those aim-the-cannon games from early-90s PCs. Sure, you might say – no one can use an ancient Roman drinking vessel anymore and must simply look at it in a museum. But using it was never part of what made it “art” – its design was the key trait. When video games lose their interactivity over time (due to technological change), they more or less cease to exist.

These same technological constraints – call it the Video Game Disease – are what also make me fear for the longevity of ebooks and exclusive downloadable/streaming content. Making everything proprietary and tied to particular hardware or software is a great way to make money, but it demonstrates how the “digital age” (whatever that means), as it currently exists, is not amenable to long-lasting art.

Of course, none of this makes playing video games any less enjoyable. But it makes it hard to enter them into the “art” conversation. What I wonder, though, is why it’s so important that games be considered on par with cinema, rather than just enjoyed in their own right for the unique pleasures that they bring.