We Have Moved

Pixel-Love is now at: pixellove.wordpress.com

A copy of the content remains here so links don't break, but everything is ported, so if you want to look through the archives we suggest you head over to the new site.

Wednesday, October 04, 2006

LGF Day 3: LucasArts talks Next Gen

LucasArts project lead Chris Williams has been telling Next-Gen.Biz about the firm’s plans for developing next generation iterations of Indiana Jones and Star Wars, as well as the move towards simulation-based gameplay…

Williams says, “It’s disappointing to see the number of next generation games out there that are really current generation games with better graphics. We are committed to next gen gameplay, to experiences that are different. The tent poles of that are doing innovative things with character and with story.”
He adds, “Simulation based gameplay is challenging some game conventions and systems that are very canned, routine and scripted and replacing them with advanced technology that allows them to be simulated.”

There are three pillars to this plan, best summed up by the company’s relationships with three entities – internally with Industrial Light and Magic and externally with character AI specialist NaturalMotion and with ‘digital molecular matter’ physics house Pixelux…

He demoed Indy, showing a fight with some San Francisco hoods repeatedly in real-time, and indeed the characters react differently each time, depending on how hard they are hit, where they are hit and what they bump into. The graphics are still shaky, but it's the behavior that's important at this point: merging animation with AI-based combat sequences.

He also showed a demo of Indy on a rope bridge (mentioned again later) that is being shaken in real-time. Indy tries to keep his balance as the bridge moves in random ways. There are no animation frames involved. Indy looks stiff at this point, but it's impressive to see how the character reacts based on 'taught behavior' instead of animation-based scripted routines.
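
To make the contrast concrete, here is a toy sketch in Python of what 'taught behavior' looks like next to keyframe playback. It is ours, not LucasArts' or NaturalMotion's; every class name and gain value below is invented. A simple feedback controller nudges a single 'lean' value back toward upright, so the recovery depends on the actual shake rather than on stored frames.

# A toy illustration, not LucasArts' or NaturalMotion's code: the class
# and the gain values are invented for this post.

class BalanceController:
    """Proportional-derivative controller that leans the character back
    toward upright instead of playing back a canned animation."""

    def __init__(self, kp=60.0, kd=8.0):
        self.kp = kp  # how strongly to correct a lean (proportional gain)
        self.kd = kd  # how strongly to damp the correction (derivative gain)

    def corrective_torque(self, lean, lean_velocity):
        # The response scales with the actual disturbance, so a gentle
        # sway and a violent jolt produce different reactions.
        return -self.kp * lean - self.kd * lean_velocity


def simulate_bridge_shake(controller, disturbances, dt=1.0 / 60.0):
    """Integrate a single 'lean' value as the bridge jolts the character."""
    lean, lean_velocity = 0.0, 0.0
    history = []
    for push in disturbances:
        torque = controller.corrective_torque(lean, lean_velocity) + push
        lean_velocity += torque * dt  # treat the character's inertia as 1
        lean += lean_velocity * dt
        history.append(round(lean, 4))
    return history


if __name__ == "__main__":
    controller = BalanceController()
    print(simulate_bridge_shake(controller, [4.0, 0.0, 0.0, 0.0]))
    print(simulate_bridge_shake(controller, [0.5, -2.0, 1.0, 0.0]))

Two different disturbance patterns give two different recoveries, which is the whole point: the reaction comes out of the simulation, not out of a clip.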

Next there was a Star Wars demo showing R2D2 being thrown against a wood panel. The wood breaks and splinters in different ways according to how the droid hits the panel. Likewise a glass panel and a compound object made of stone and crystal.

Finally we see Jar-Jar carbonized. Splendid.

Next-Gen: What is LucasArts getting out of the new, closer relationship with Industrial Light and Magic?

“We are building our entire next generation game pipeline on top of ILM’s toolset. They have a proprietary framework that they use for their films, so we are building our game editor and tools on top of that as well. It’s a massive engineering effort.

“The right way for us to develop is to work back from what seems impossible and figure out how to do it on next generation systems, rather than the incremental way of saying ‘this was how we did it on previous systems so let’s build on that’.

“ILM’s knowledge base means we can try to do things in preposterously advanced ways, for example the way that water is simulated or lighting techniques that have been pioneered by ILM. Although they can’t be done in real-time even on next gen systems, the fact that we are working hand-in-hand with them means we can solve the problem of finding that sweet spot where we are scaling back what they have pioneered so it works on the hardware.

“It’s a great way to do R&D. All ILM’s problems are driven by single projects, so they need to solve individual problems. But they never just develop technology for the sake of it. It’s a great model to follow for game development. Companies have got into trouble over the years by having a core technology group that just goes off and builds engines without a specific application in mind. We’re developing technology with the specific goal of servicing our next generation Star Wars and Indiana Jones games."

Next-Gen: Tell us about NaturalMotion and your move away from animation-driven character development, towards simulation and physics?

“NaturalMotion are experts in human character behaviour. We partnered with them because we shared a common vision of biomechanical AI; authoring characters through behaviour rather than through pre-set animations. We have built up a system that works with our animation and game AI.

“If I’m a consumer why do I care? What is the benefit? We are simulating character responses to stimuli. They have awareness and adapt on the fly to their surroundings. So every time you punch someone or throw someone you will get a different result.

“The days of the limp rag doll flopping round are largely over. What we have are characters that will react as you’d expect when they are punched in the gut, but as they go crashing into other elements in the game, they react with an entire sequence of behaviours that we have taught that character, rather than drawing on a library of animations."
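
To illustrate the distinction Williams is drawing, here is a rough Python sketch with every name invented for this post, none of it from the actual game: the old approach maps a stimulus straight to a pre-made clip, while the new one picks the next taught behavior from whatever the simulation reports about the character's state.

# A rough sketch of the distinction, with invented names throughout:
# an event-to-clip lookup versus choosing the next behavior from the
# character's simulated state.

# The "old" way: every stimulus maps to one pre-made clip.
ANIMATION_LIBRARY = {
    "punch_gut": "clip_double_over.anim",
    "thrown": "clip_ragdoll_fall.anim",
}

def scripted_reaction(stimulus):
    return ANIMATION_LIBRARY[stimulus]  # same result every single time


# The "new" way: chain behaviors off whatever the simulation reports.
def next_behavior(state):
    """Pick a taught behavior from the character's current physical state."""
    if state["airborne"]:
        return "brace_for_landing"
    if state["impact_force"] > 50 and state["contact"] == "wall":
        return "crumple_against_surface"
    if state["impact_force"] > 20:
        return "stagger_and_recover"
    return "keep_balance"


if __name__ == "__main__":
    print(scripted_reaction("punch_gut"))  # always the same clip
    print(next_behavior({"airborne": False, "impact_force": 35, "contact": "floor"}))
    print(next_behavior({"airborne": False, "impact_force": 80, "contact": "wall"}))

Two hits with different force and contact produce two different reactions, where the lookup table can only ever return the same clip.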

Next-Gen: Does it work?

“We have it running in our Indiana Jones demo. We are not totally satisfied with the quality; we know some of it is quite rough, but we also know that the underlying technology works.

“We have Indiana Jones on a rope bridge and the player controls the bridge not the character. We wouldn’t do that in the game but it shows how you can jostle him around and how he reacts. He tries to keep his balance and he hangs on. There is no animation. It’s entirely behaviour based.

"It’s a digital character in real time keeping his balance on a rope bridge. We see that as a quantum leap in game design. It will change the way people interact with games. You don’t feel like you are interacting with a system any more. When you play a combat game you know that this punch results in that animation. But in this, every time you grab a guy and throw him, the result is a surprise. Who knows what will happen? Games are a series of player choices and we are offering a series of great pay-offs that don’t keep repeating."

Next-Gen: So different characters react in different ways?

“We are not down to that level of detail yet. We are still on the core behaviours like protecting yourself, being able to keep your balance, being able to pick up things, but we are excited about that in the future. This makes the character seem smart and there is always something new and surprising."

Next-Gen: It must present some serious challenges in terms of overall game design…

"Previously you would enter a room and undertake a series of actions that would trigger a series of animation and you’d leave the room. Now, you enter the room and unpredictable things happen. We are not scripting. We are simulating.

"But we ultimately have to create a narrative so we see it as a series of expanded simulations that are then funnelled back down to take them into the next area. There is still an element of guiding the player’s experience but in that moment you are getting a lot of gameplay and a lot of fun per square inch. It’s no longer about completing an area to see what the game designer had planned in the next area."

Next-Gen: Can we talk specifically about the games?

"We are two years into this. A lot of the core technology has been sorted out. This isn’t just an idea we had. But we are not talking about release dates. However, the tech demos that have been shown give some clue to the style of the games. We have two next gen development teams working in parallel, which is nice because we are seeing a lot of areas where we can overlap but we are not giving too much away yet."

Next-Gen: Tell us about your work with Pixelux Entertainment.

"They have created something akin to digital molecular matter which is true material physics. Wood splinters, metal dents, rubber bends, ice shatters. All these materials are data driven and incredibly realistic.

"In current development when you want to break things you build a regular version of, say, a table. Then you build a broken version and so when something hits the table you swap in the broken version for the intact and you go to a rigid body simulation and a particle effect and that’s it. It’s highly predictable and repetitive. Breaking the first crate is the same as breaking the 500th crate.

"But breaking stuff is tremendous fun. We want that level of unpredictability and surprise that we’re bringing to characters. This technology means that every time you throw something or bang into something you’re getting an interesting result.

“It has massive implications for gameplay because you are building a truly simulated world. If we build things that aren’t structurally sound, they won’t hold. Our designers are becoming structural engineers. It’s very conducive to Indy because it allows you to be spontaneous and react to the things available in the environment.

“So with the simulated character and simulated environment together you get the core of a truly interesting game experience. The character learns to grab onto something when he falls, but that is also a simulation so he has to react to the fact that it’s breaking. It’s never the same thing twice, which I think sums up what we are trying to achieve.”
