Planet of Sound: Talking Art, Noise, and Games with EA's Robi Kauker
April 13, 2009 -- Page 3 of 5

How does Max/MSP work into your production process?

RK: The MSP stage came out when I was still in graduate school, and it was the first step to digital processing without a big expensive system. So going back again to that creative sandbox that Mills kind of fostered for us, that's where we started playing with it.

Now, in The Sims world, and in the Spore world as well -- Kent Jolly should get all the credit for kicking this off in both worlds -- we took these problems we were having with recorded voice. The Sims voice-recording problem is this: we have 12,000 to 15,000 animations that need to have dialogue recorded for them. That dialogue can't be directly repetitive, and it needs to be emotional, so we need voice actors. The other problem is we can't tell the player what the game's about, because our players define the game.

So that's why Simlish comes into being. In a standard digital audio work session, it takes a minute to load up a video, name a track, and get it set up to record; then you hit record and record a three-second take against the video. You record five variations of that. Okay, you spent 15 seconds recording and a minute setting it up. We looked at this problem, and we went, "Yuck."

So, we solved it by using Max/MSP to build apps. We built a recording app and a companion editing app, both driven by scripts. It's very, very fast, it's very efficient, and it takes advantage of all the techniques that Max/MSP lays out for you. It's fairly simple. Jitter was the video component of it. That was the key that enabled us to build the tools that we did.

So it makes our recording pipeline very fast, very efficient, and very customizable. The Tiger Woods PGA Tour team then took what we did and modified it: they literally took out a huge chunk of it and put in a simpler recording model.

The fundamental problem remained the same: a way to record quickly through a script, without the huge overhead of an alien interface designed for general-purpose work.
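To make the time savings concrete, here is a hedged sketch of the kind of cue list a script-driven session might generate. This is not EA's actual tool (which was a Max/MSP/Jitter app); the function name, parameters, and timing numbers are illustrative assumptions drawn from the figures Kauker gives above (roughly a minute of setup versus ~15 seconds of recording per line in a manual DAW workflow).

```python
def plan_session(animation_ids, takes_per_line=5, record_secs=3.0, setup_secs=0.5):
    """Expand a list of animation IDs into a flat cue list and estimate
    total session time. A scripted pipeline pays a fraction of a second
    of setup per line instead of ~60 s of manual DAW setup per line.
    All names and timings here are hypothetical."""
    cues = []
    for anim in animation_ids:
        for take in range(1, takes_per_line + 1):
            cues.append({"animation": anim, "take": take})
    # Scripted cost: one small setup per line, then the raw recording time.
    scripted_secs = len(animation_ids) * setup_secs + len(cues) * record_secs
    # Manual cost, per the interview: ~60 s setup plus the same recording time.
    manual_secs = len(animation_ids) * 60.0 + len(cues) * record_secs
    return cues, scripted_secs, manual_secs

cues, scripted, manual = plan_session(["sim_wave", "sim_cry"])
# 10 takes total; the scripted session spends almost all of its time recording.
```

Scaled to the 12,000-plus animations mentioned above, shaving a minute of setup per line is hours of studio time per session, which is the "Yuck" the team was reacting to.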

In The Sims 2, the voice of the robot, which is essentially a character in the game, is generated with a plug-in I wrote for Max/MSP -- a VST/Audio Unit. Nothing fancy, not rocket science. But it was something I could make that could be used in a VST or AU audio chain.
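The interview doesn't describe how the robot-voice or Simlish pitch-shifting actually works internally, but the simplest possible form of the idea can be sketched as naive resampling: reading a buffer faster raises pitch (and, unlike the duration-preserving methods real plug-ins use, also shortens the sound). A minimal sketch, assuming mono audio as a plain list of samples:

```python
def pitch_shift(samples, semitones):
    """Naive resampling pitch shift. A shift of +12 semitones reads the
    buffer twice as fast, doubling the pitch and halving the duration.
    Production plug-ins use duration-preserving techniques (PSOLA,
    phase vocoder); this only illustrates the core resampling idea."""
    ratio = 2 ** (semitones / 12.0)  # frequency ratio per equal temperament
    n_out = int(len(samples) / ratio)
    out = []
    for i in range(n_out):
        pos = i * ratio            # fractional read position in the source
        j = int(pos)
        frac = pos - j
        a = samples[j]
        b = samples[min(j + 1, len(samples) - 1)]
        out.append(a + (b - a) * frac)  # linear interpolation between samples
    return out
```

Wrapping a kernel like this in a VST/AU shell is what makes it usable anywhere in a standard audio chain, which is the point Kauker makes: nothing fancy, just packaged where the rest of the pipeline can reach it.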

We use it all the time now to mock up prototypes of any sort of game scenario we want. We have USB game controllers mapped to it, and then we go to that setup any time we need a prototype.


EA/Maxis' The Sims 3

Does that work in the runtime or do you have a different system?

RK: It can work in the runtime. I don't do that in The Sims world. I know some of the other teams are mapping it into the runtime, because it becomes a great mixer for it. You can lay out your custom mixer -- your virtual layout of the mixer -- over an Ethernet cable, or however you like to work.

I tend to build a very simple implementation pipeline, so those tools become kind of overkill for The Sims world. And by simple, I mean it is very complex on the engineering side but not on the implementation or the content side. That's what I see as the conundrum of video games. Where do you put the complexity -- do you put it on the designer or do you put it on the engine? Well, engineers are usually smarter than designers. [Laughs]

We use it for prototyping a whole lot. The new MySims character dialogue is a simulation, actually. We made a plug-in which simulates what we do in game -- a pitch-shifting algorithm -- and we made that into a VST/AU plug-in for prototyping and demoing. Every actor's voice that we work with also has to work in this very weird pitch-shifting algorithm. We have great actors that I'd love to work with, who audition for us, and we're not able to use them because their voice breaks up in this highly efficient pitch-shifting algorithm. We have to know that quickly; we can't wait to get that content in game. So we've modeled it with this pitch-shifting plug-in.

And that stuff pays off for me not just on the front end of game development, when I'm auditioning talent, but also when I'm doing marketing later on. I have all this content that needs to be matched up for marketing, and that marketing may not be done by us -- it may be done by an advertising agency.

If we have external developers working on websites or something like that, we give them this plug-in: here's the sheet, and these numbers correspond to these voices in the game. So that pitch shifting has actually become a character in the game. We have eight actors in that game, but we have forty different character voices. Just that simple bit of technology pays off huge dividends for us.
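The "sheet" Kauker mentions can be pictured as a small lookup table: each in-game character voice is an actor plus a pitch-shift setting, which is how a cast of eight fans out into forty distinct voices. All names and numbers below are invented for illustration; the real sheet and its values are not in the interview.

```python
# Hypothetical version of the sheet handed to external developers:
# character voice -> (source actor, semitone shift to apply).
VOICE_SHEET = {
    "cheerful_neighbor": {"actor": "actor_1", "semitones": 4},
    "child":             {"actor": "actor_1", "semitones": 9},
    "gruff_shopkeeper":  {"actor": "actor_2", "semitones": -5},
    "robot":             {"actor": "actor_2", "semitones": -2},
    "mayor":             {"actor": "actor_3", "semitones": 0},
}

def voices_per_actor(sheet):
    """Count how many distinct character voices each actor covers,
    showing how a small cast multiplies into a larger voice roster."""
    counts = {}
    for voice in sheet.values():
        counts[voice["actor"]] = counts.get(voice["actor"], 0) + 1
    return counts
```

Because the table is just data, an outside agency can reproduce any in-game voice from the raw actor recordings without touching the game itself, which is the payoff described above.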

