Thursday, November 30, 2006

Graphics and the importance of lasting design

So we're officially in the middle of the next-gen wars, and one of the battlefronts is the realm of graphics. The transition to HD is in full swing, and the A/V lingo is flying like watermelons at a trebuchet conference. HDMI. 1080p. 480i. Composite vs. component. Upscaling, downscaling and resampling. Sony fans smugly say that they're the only ones with true HD, then find out the PS3 does a shoddy automatic upscaling job. Wii fans angrily defend the top resolution of 480i, but don't even get that when Nintendo doesn't include component cables.

Why does it matter?

I mean, I don't want this to turn into another "gameplay is all that matters" argument, but I honestly don't understand the total reliance on graphics. I've seen multiple people say that not using component cables "makes everything look like shit," and a recent episode of the 1UP Show included the statement "Zelda makes me hate the Wii for not being able to have better graphics." I'm playing Zelda with composite cables fed through an RF adapter into a 12-inch Quasar TV, and I think the graphics are fine. So long as the gameplay is great, the graphics don't matter.

Except that the last statement isn't true. Graphics do matter, but not in a simple "the older they are, the more the game suffers" way. Go back and play some PS1 games. This was back when 3D models could first be rendered in real time (even if they consisted of 20 polygons). For the most part, these games look AWFUL now. Look at Vagrant Story, or Tekken 3. Yeesh, even FF7 looks really, really bad. According to GameRankings, these rank among the top 10 PS1 games ever. They were considered amazing games, but now they're barely playable.

But this doesn't hold true for many games that are even older. 3 in Three was made over 15 years ago, and it still looks great. Super Mario World was an SNES launch game, and its sprite graphics barely register as dated. Why the difference?

I think the answer is the same thing that happened to the movie industry. Special effects are at such a high point right now that anything that hasn't taken 100 hours on a render farm seems dated and laughable. Monty Python used to use a blue screen for some of their skits, back in the late '60s. I bet at the time it seemed incredible - "wow, it's like they're in Gilliam's drawings!" But now, all you see is the thick blue line surrounding them, and it seems amateurish. I could probably do a better job with a digital camera, a sheet and 2 hours in Premiere. But if you watch The Jungle Book (which came out around the same time) again, it's still a great movie, even if technically it can't compare at all to Cars or Shrek. How a work was produced shapes how accessible it remains.

Scott McCloud touches on this in Understanding Comics (I'm referencing movies, games and comics - a geek trifecta!). Art in general can span from iconic (or cartoony) to photorealistic (he also adds another vertex of abstraction to make it a triangle, but I'm only focusing on this axis). As an example, Peanuts is drawn in a very iconic style, while Mary Worth is much more realistic. McCloud postulates that one interesting effect of a work's position on this axis is how much the reader gets drawn in - more iconic faces (like a simple smiley face) are much easier to identify with, and thus more accessible. But more realistic drawings start triggering the uncanny valley effect, and you notice the differences more than the similarities.

You can see where I'm going with this. Cartoony, 2D games have a much longer lifetime than the "realistic" 3D games that came out for the PS1, and I'd say one of the main reasons is the iconic quality of the design. Luke Smith on 1UP said he thought that Wind Waker would actually last longer than Twilight Princess, since it has cartoony cel shading instead of detailed 3D models. I think he might be right.

To tie this back to the beginning: I understand that realistic games need to constantly push better and better graphics, or they'll be left behind. But I think there's another branch of design entirely that can remain timeless, no matter what resolution you view it in.

1 comment:

Anonymous said...

1) 480p > 480i. The Wii can do 480p, but only with component cables.

2) Not using component cables does not make everything look like shit on your TV because, honestly, you wouldn't know better on that set. On the other hand, on any HDTV I've seen, a low-res interlaced 480i signal is disgusting compared to even the small bump you get from 480p over component. Feel free to come on over here (long flight, I know) and see what happens when I boot up a PS2 game - I'll even dig out some composite cables to simulate a Wii (I haven't managed to snag one yet). Then you can see what the 360 looks like on there. And yes, HDTVs are expensive now, but HD is where things are going. Nintendo is ignoring this at their own peril. Honestly, I wouldn't mind if all they did was add 720p output, even if the games looked the same. I just want that high resolution.

3) It's not "total reliance on graphics". It's that a bad-looking game looks bad. Given a choice between two equally fun games, I'd rather play the one that looks good. Games that focus on graphics at the expense of fun don't do well. See Dead or Alive 3 and 4. Whereas games that have focused on graphics but also focused on fun do very well. See Gears of War.

4) You're totally right about PS1 games. I couldn't stand the way they looked when they were new, and they certainly haven't aged well.

5) I agree that stylized graphics help, but I think you're looking at the effect of maturing technology. Atari and NES games looked terrible (even though they were fun), but by the SNES era, sprite graphics had matured to the point where people knew how to use them and had the hardware to do it with. Look at where those same 2D sprite games have gone with things like Disgaea or many DS games - they look great because sprite graphics are a mature technology. The 3D of the PS1 days, even through most of the PS2's lifetime, was immature. But by the end of the last generation, 3D graphics (thanks largely to developers getting a better handle on the programmable pipeline) had matured. I don't think people are going to look back at Gears of War and say "ick, that looks terrible". Sure, there will be much better graphics available, but even though GoW didn't go the simplistic route, it's working with mature technology, and it looks good.


To respond more generally to your point, I am upset that a correlation has been drawn (largely by Nintendo fanboys) between graphics and fun. You can have fun, good-looking games, and you can have fun, bad-looking games. But making fun games isn't some radical departure, and having graphical capabilities that aren't up to par with the other systems of its generation is not a good mark for your console of choice.