Postmortem: Q-Games' PixelJunk 4am
April 3, 2013
A reprint from the September 2012 issue of Gamasutra's sister publication Game Developer magazine, this article is a postmortem of Q-Games' IGF-nominated interactive music title, PixelJunk 4AM [YouTube trailer]. It includes new commentary from the developers especially for this Gamasutra version.
You can subscribe to the print or digital edition at GDMag's subscription page, download the Game Developer iOS app to subscribe or buy individual issues from your iOS device, or purchase individual digital issues from our store.
At the end of 2010, right after finishing PixelJunk Shooter 2, Q-Games president and founder Dylan Cuthbert pulled me aside for a chat.
"So we've kind of got this music visualizer using the PlayStation Move called lifelike on the back burner," he said. "You should make it happen."
I asked for some more details about the project. "Well, there's music," Dylan said, "And there's a PlayStation Move. Off you go."
PixelJunk 4am was released in spring 2012 on PSN. It's not a game in the strictest sense of the word -- it's a Move-exclusive audiovisual composer, where all your performances are broadcast live around the world on PSN.
Music is created using the Virtual Audio Canvas, which is an actual physical 3D space carved out in front of the player. It contains more than 190 sound samples, a wide variety of DSP effects, and the ability to dub loops into your own unique groove. We also released a free Live Viewer, allowing anyone on PSN to stream performances live and give real-time feedback.
It's easy for a game designer to say, "Hmm, this boss is still a little easy. It should breathe more magma!" Knowing when a 4am event is finished, on the other hand, starts venturing into the realm of music production -- and there were no similar games to use as points of reference. The control scheme is completely unique to 4am, and the experimental social gameplay we included also seemed pretty far-fetched at the time we were developing it.
Normally, those are the points at which someone high up says, "Stop smoking so much and make something solid!" Yet throughout development, Dylan supported every new crazy idea by saying, "If it's fun, put it in!" Our U.S. publisher Sony Santa Monica was also super supportive despite the amount of wild new design that 4am was pushing. The faith (and massive balls of steel!) of these fine people is what ultimately allowed us to release a unique experience that will hopefully be remembered fondly for a long time by its players.
What Went Right
1. Fearless Hardware Experimentation
Going into the project, I didn't exactly have a glowing perception of the Move -- it had always felt gimmicky to me. On the first day, though, we played around with some of the SDK samples and were surprised to discover that the Move was actually pretty robust, and ripe for creating wild stuff with tricks that neither the Kinect nor the Wii could pull off. (It occurred to me at the time that the lack of crazy new Move titles might not be the hardware's fault -- maybe people just weren't making games that pushed it. Since we started on 4am, we've seen some other notable Move experiments, which warms my heart greatly!)
With a skeleton team of two programmers and one designer, we prototyped at least 12 different control methods for 4am, all using the Move to control music in space. These ranged from casting musical "spells" in the air to replicating an eight-way arcade stick and inputting Street Fighter commands.
Regardless of how ludicrous each idea seemed, though, the only test that mattered was whether we could ask, "Can I make music, and is it fun?" and honestly answer yes. Dylan supported our willingness to experiment without regard for whether something felt "normal" or "gimmicky," all the way up until we found 4am's distinctive Virtual Audio Canvas. Without that support, we might never have pushed the Move far enough to get there. It would have been a hell of a lot easier to just implement a menu and pointers, but it wouldn't have been 4am!
2. Discarding Old Tools
When I picked up the lifelike project in January, there were already legacy tools and an editor. Unfortunately, the editor was largely unusable: it had been developed and tested primarily on PC, the PS3 version wasn't optimized, and we needed to make a quick decision about how to move forward.
GameMonkey (our scripting language) was running computationally intensive generic behaviors with a vast array of tweakable properties. The framework gave us flexibility, but at the cost of unoptimized generic code that ate up a lot of processing time. Even though the artists and designers could only build rather generic-looking visualizers, our frame rate was still suffering.
Shortly after the controls prototype was finished, we scrapped all the generic behaviors and started rewriting the rendering code with specific visuals in mind. It gave us the freedom to optimize each visualizer independently (scalpel over sledgehammer) and expose only the most interesting elements to the artists. Ditching support for the PC build also let us offload a lot onto the PS3's SPUs, and freed our lone visualizer programmer from the burden of maintaining both versions of the editor. This boosted our frame rate and made the artists feel more empowered to make cool stuff.
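None of 4am's actual code is public, so purely as a hypothetical sketch of the tradeoff described above: a generic, data-driven behavior pays a lookup cost for every tweakable property on every update, while a purpose-built visualizer bakes the design decisions into code and exposes only the knobs the artists actually need. The class and property names here are invented for illustration.

```cpp
#include <map>
#include <string>

// Hypothetical generic behavior: every tweakable lives in a string-keyed
// property map, so each update pays lookup costs for its flexibility.
struct GenericVisualizer {
    std::map<std::string, float> props;  // dozens of tweakables in practice
    float update(float bass) const {
        return props.at("base_scale") + props.at("bass_response") * bass;
    }
};

// Hypothetical purpose-built visualizer: the base scale is now a design
// constant, and only the one knob the artists asked for is exposed.
struct PulseRingVisualizer {
    float bassResponse = 2.0f;  // the single tweakable that mattered
    float update(float bass) const {
        return 1.0f + bassResponse * bass;
    }
};
```

For matching settings the two produce identical output, but the specialized version has no per-frame map lookups, a far smaller tuning surface, and can be optimized (or moved to the SPUs) independently of every other effect.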
We also sat the visualizer programmer right next to the relevant artists, so the visualizers could be iterated quickly as the artists figured out the look they wanted. This paired-nucleus approach was definitely smoother than having a programmer write blanket variables toward an invisible goal.