[In this Intel-sponsored thought piece, company veteran Roger Chandler looks at why trudging through the Uncanny Valley to create realistic interactions with virtual creatures may lead to upsides in AI, special effects, physics, and more.]
One night several months ago, my buddies and
I logged on to play a newly released, online
role-playing game. The environments were lush,
the details were rich, and the monsters were stunningly
rendered. I was quickly sucked into it all...until I ran into
a small log lying in front of me. It literally stopped me in
my tracks.
I know a lot of great game developers and
designers, and I understand the difficult design decisions
they must make when bringing a title to market, but
this log really surprised me. It was richly textured and
accurately modeled, which was nice, but it "behaved" like
a brick wall.
Despite being a powerful warrior who had
just slain three ogres single-handedly, I could not raise
my in-game foot 18 virtual inches to pass over this small
log. So I walked around it and we continued on. But that
experience stuck with me.
In 1970 roboticist Masahiro Mori introduced the
Uncanny Valley hypothesis: as robots and other
representations of humans begin to look and act
almost, but not entirely, like actual humans, they
provoke a response of revulsion among human observers. The
"valley" refers to a precipitous drop in the viewers'
positive response to the near-human entity as it gets
closer to realism, due to the dramatically increased
expectations around behavior and other subtle
human-like details.
The Uncanny Valley has drawn
much attention from the gaming industry in recent
years, and while it definitely applies to human-like
characters, I think the effect extends to
the environment as a whole.
As a first-hand
witness to our industry's ongoing graphics
arms race, I believe it takes more than
beautiful scenery to engage a gamer. We live
in a world where every object has distinct
physical properties and every creature
has unique behavioral characteristics.
As
developers continue to improve how these
objects and creatures look in-game, they
must also meet the players' heightened
expectations regarding how these objects
and creatures will act.
For example, I do not
expect a blocky, pixelated tree to sway in the
wind or splinter realistically when I blow it to
bits with a rocket launcher. But if that tree
looks nearly identical to the one in my front
yard, then it will be a noticeable distraction if
it does not act like the real thing.
One of the things I love about my job is
hearing developers' thoughts on how they can
create new levels of interactive realism with
the increasingly powerful and programmable
multi-core processors Intel has on its roadmap.
Certainly developers are doing great things
with today's generation of multi-core
processors to make games look and act more
real. Here are a few of their efforts:
- Threading to improve overall framerate (a minimal sketch of this pattern follows the list)
- Accelerating asset loading to make scene
transitions more seamless
- Utilizing procedural content generation
to ease the burden on the artists and to
dynamically populate vast worlds with
rich environments
- Applying particle effects for more realistic
smoke, fire, and weather systems
- Enhancing artificial intelligence for
in-game characters
- Improving game physics to ensure
objects interact with each other
and, more importantly, blow up more
realistically
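
To make the first of those efforts a bit more concrete, here is a minimal sketch, in modern C++, of one common threading pattern: per-frame updates that do not share mutable state run on worker threads while the main thread continues with work that must stay serial. The ParticleSystem and AiDirector types are hypothetical stand-ins for illustration, not code from any particular engine.

    #include <cstdio>
    #include <thread>

    // Hypothetical subsystems whose per-frame updates are independent of
    // each other, so they can safely run on separate threads.
    struct ParticleSystem {
        float simulated_time = 0.0f;
        void update(float dt) { simulated_time += dt; }  // stand-in for particle integration
    };

    struct AiDirector {
        int decisions_made = 0;
        void update(float) { ++decisions_made; }         // stand-in for path planning, target picking
    };

    void run_frame(ParticleSystem& particles, AiDirector& ai, float dt) {
        // Hand the independent updates to worker threads.
        std::thread particle_job([&] { particles.update(dt); });
        std::thread ai_job([&] { ai.update(dt); });

        // The main thread would handle serial work here: input, submitting
        // the previous frame's draw calls, and so on.

        particle_job.join();
        ai_job.join();
    }

    int main() {
        ParticleSystem particles;
        AiDirector ai;
        for (int frame = 0; frame < 60; ++frame) {
            run_frame(particles, ai, 1.0f / 60.0f);
        }
        std::printf("particle time: %.2f s, AI decisions: %d\n",
                    particles.simulated_time, ai.decisions_made);
        return 0;
    }

In a shipping engine the per-frame thread creation would normally be replaced by a persistent task pool so threads are not spawned and torn down sixty times a second, but the division of work across cores is the same idea.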