Thursday, June 02, 2005

Reality Bites

DOOM 3 and Half-Life 2 have sat on shelves for six months or so, and we're becoming increasingly complacent about the enormous leaps in graphics technology that they represent. Game worlds are getting closer and closer to matching reality. This process is speeding up. The advent of Ageia's new physics accelerator, 512-megabyte video cards, and dual-core processors will give home computers the kind of awesome math power necessary to render a completely real-looking scene on the fly. The Holy Grail – game worlds that are indistinguishable from the real one – is finally in sight, but with it has come a peculiar problem. Since the rise of video gaming, all the power of the evolving medium has been focused on making games more and more “realistic.” But there's an obvious barrier there: once they match the real world, there's nowhere else to go.

One day we'll be able to plug ourselves into games that subtract us utterly from this reality and insert us in a new one. All five senses will be stimulated, and it will be like we're really there. For now, though, by “totally real” we just mean that it'll seem real on our computers – visually, physically, and aurally. True VR is a long way off; photorealism isn't. What will developers do when graphics and physics literally can't get any better, because the brain has reached the limit of its ability to perceive?

The answer is that games will have to innovate in different ways, and so far the business is innovating largely in technology. The new Xbox will utilize the power of HDTV. The Cell processor is so terrifyingly advanced as to be nigh-sentient. nVidia and ATI are both committed to dual-GPU graphics. What once seemed like an unambiguous boon, on closer inspection, represents a problem for developers. The smart move is to rally around advanced design initiatives and more impactful games, because nothing listed above will actually make games better.

In fact, they threaten to make them worse. Developers use advanced technology as a crutch to obfuscate the fact that games are, frankly, devolving. Gameplay is suffering. Innovation is frowned upon, and copycatting encouraged. We're going to see a thousand Gravity Guns this year, but we still haven't seen the game that imparts true emotional impact. The closest we've come is the occasional game that scares the bejesus out of us, and fear is a comparatively easy emotion to evoke. Once technology reaches the limits of the current delivery mechanism – and it's getting there, fast – games can only get better by more expertly affecting the player. Using fancy new graphics as a selling point will be simply impossible when all games are photorealistic, and that will cause a big problem.

Warren Spector, Brenda Laurel and Greg Costikyan all touched on this during the IGDA's Burning Down the House rant at GDC. Each said that improving the game experience is key to the long-term growth of the medium. And they all seemed to agree that advancing technologies – HD, Cell, whatever – can't make games better on their own. Chris Hecker reinforced the designers' arguments by pointing out that these new technologies will be mediocre performers when it comes to AI, reactive logic and similar features that actually would make games more immersive. Meanwhile, Anand Lal Shimpi noted in a CPU column that these advances, gonzo as they may be, are going to be hard to work with.

The new tech is weirdly retro, too. nVidia apparently learned nothing from 3Dfx; they honestly seem to believe PC gamers will shell out $700+ for multiple video cards. Cell is revolutionary and fascinating, but it's basically an Amiga on steroids. HDTV is… TV. Perhaps this is an early symptom of the wall that technology is about to hit. Where is the coprocessor that enhances a game's decision-making prowess, so it can build unique worlds on the fly or react correctly when a player does something unexpected? If we intend to blindly marry ourselves to ongoing technological advancement to the detriment of rich experiences, we should at least focus on advances that make games play better rather than just look better.

We are seeing the odd forward-thinking game here and there. Darwinia, Katamari Damacy, Ico and so on are good starts. They're also solid proof that innovative games can be big sellers. Narrative is on the upswing in some games as well. But on the whole, technology still dominates the market. The trick is to merge good tech and good design rather than hiding gameplay mediocrity behind normal maps. The Elder Scrolls: Oblivion might serve as an ambassador for this technique, assuming it's more Morrowind than Daggerfall.

This isn't a “technology sucks” argument. I love pretty graphics and neato physics as much as the next gamer, and I like to think that I'm not one of the people to whom Warren Spector was referring when he said that certain scholars have no conception of industry realities. I get that improvements to technology and realism are better selling points than innovative gameplay. When STALKER: The Shadow of Chernobyl finally ships in 2019, the wise decision will be to market it based on its kickass technology, not its (ostensibly) innovative gameplay. Developers, publishers – no one is “to blame” for this problem; the business is doing what sells. And yet a change does need to be made, because that angle is running out of steam.

A line from The Incredibles is telling: “once everyone is special, no one will be.” The games industry needs to be concerned about this. We live in a world where off-the-shelf chess programs consistently school grandmasters – imagine what computers will be capable of in six or seven years. Photorealism won't arrive tomorrow, but it will arrive, and rather than being the triumph we all originally thought it would be, it threatens to become a major creative albatross.

Matthew Sakey

1 comment:

Josh said...

I may have to do a follow up on this. Well argued. I really think it's not the technology that's a problem ... but the speed of the technology. We don't let our genres cool off and evolve quick enough before we shove so much new data into them. I'm working in 2D after playing with 3D for years and I've had more ideas now than ever...