Saturday, September 10, 2005

Simple Tools, Powerful Results

I second my wife's comments. I loved my old StarTac. Simple and elegant, it did the things I needed it to do, and did them well.

Many companies fail to appreciate the "less is more" rule when it comes to product design. I work in the software industry, which is perhaps more guilty of loading additional, marginal features into its products than any other. In fact, it's such a common sin that Google is able to generate an amazing amount of press coverage by the mere fact that its homepage features fewer than 50 words.

What the designers of Google and of the iPod understand is that their products are tools.

When we think of a tool, we think of a device that helps us gain a mechanical or mental advantage in solving a problem. Use of a tool is a deliberate act; we select a tool appropriate for a given situation and apply it. The best tools are therefore ones that are obviously suited to a particular task and whose performance can be predicted.

Most software and many electronic gadgets fail this test miserably.

Friday, September 09, 2005

Just a Phone, Please

Convergence is so over-rated.

I've been reading the buzz on Apple's most recent "BIG" announcement - the Motorola Rokr phone that plays iTunes.

Horrible idea. The whole reason I listen to music is to disconnect from my information-overloaded world. My fellow commuters wearing the familiar white earbuds are probably thinking the same thing. For a brief 20-30 minutes while the train takes them to work, they get a little respite from electronic bombardment, not to mention enough distraction to forget they are being squished into a metal subway car like sardines.

Why must my mobile phone play music, take pictures, play games, send email, or browse the web? What is so wrong with just a phone?

While there certainly is a market for people who want the everything device, I'm not in it. I want my phone to be a good phone. I want good reception, a nice address book, and a simple interface with buttons I don't have to sharpen my nails into a fine point to push.

I like my iPod because it is a music player and no more. I can change songs, adjust the volume, and pause so easily that I don't even need to look at it. It is an ergonomic pleasure.

My phone, by contrast, is a convergence nightmare. It does not have a camera. (I hunted high and low for this "feature"; in my professional environment, people do not take too kindly to handheld cameras, so if your phone has a camera, you are asking to check it at the door of every building you walk into.) It does, however, offer infinite possibilities to play games, change ring tones, change the wallpaper, browse the internet, and who knows what else. I spent endless hours trying to configure this thing to hide all that and make it easy to do the only three things I want my phone to do: answer calls, look up addresses, and check voicemail. These days, you can hardly find a phone that doesn't also perform these other functions (which is why they are now "Mobile Devices").

But in all honesty, all I want is my old Motorola StarTac back. It doesn't play Bananarama's greatest hits; it doesn't have to. It was just a really good mobile phone.

Thursday, September 08, 2005

Graphics and the Adventure Game

One of the genres that gets very little attention as a result of the hit-clustering mentality of the gaming industry today is the adventure. The adventure game has a very long pedigree and is closely aligned with an interesting offshoot of literature called interactive fiction. It also has a small group of fans that will buy virtually anything that's published in the field. I know; I'm one of them.

When I got my start in computer gaming, adventures were everywhere. There were parser-driven adventures, such as Zork and King's Quest. There were point-and-click adventures with verb-noun interfaces, such as Maniac Mansion and The Secret of Monkey Island. For a time, adventure games were the bread and butter of big publishers like Sierra and LucasArts. What happened? Was Infocom eaten by a grue?

No, it was eaten by Activision, which later happened to secure the rights to make a sequel to a hot little property called Wolfenstein 3-D. Maybe you've heard of it.

You see, what happened to the computer game industry was a revolution. The first-person shooter phenomenon shook the industry to its roots. For a long time, playing a game on a computer was a secondary consideration. The primary purpose of a computer was to get work done, strange as that may seem now....

What happened was that first-person shooters were tremendously exciting. They put the player in the middle of the action, which unfolded in real time rather than in the turn-based scenarios of earlier games. The FPS combined the immediacy of the console with the superior graphics of the personal computer.

Adventures suffered from a few drawbacks that made them very, very uncool. From an industry perspective, they were uncool because they didn't push the technology. The primary determinant of the quality of an adventure game was its story, characters, and puzzles. The primary determinant of an FPS's quality was its immediacy and the player's immersion. That meant a whole series of new technologies was needed: graphics accelerator cards, sound cards, monitors, mice, etc. The FPS was a killer app for a range of technologies.

FPSs also had an advantage in that, for a long time, it was easier to make a better game simply by making a better-looking or better-sounding one. This is quite different from adventures, where to this day you can stir up trouble in message forums with a "which Monkey Island do you like best?" question. Adventure games rely on their storytelling, which puts a premium on expensive, difficult-to-manage creative types. Developing an FPS requires a team of down-to-earth engineering types to focus on details like getting a virtual wooden crate to appear to fall toward the earth due to gravity.
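For the non-programmers out there, here's roughly what that crate problem boils down to. This is a toy sketch in Python with invented numbers, not code from any real engine:

    # Make a crate fall believably under gravity, one frame at a time.
    GRAVITY = -9.81      # meters per second squared, straight down
    DT = 1.0 / 60.0      # one frame at 60 frames per second

    def step_crate(height, velocity):
        """Advance a falling crate by one frame (simple Euler integration)."""
        velocity += GRAVITY * DT
        height += velocity * DT
        if height <= 0.0:                # the crate hits the floor
            height, velocity = 0.0, 0.0
        return height, velocity

    h, v = 5.0, 0.0                      # crate starts 5 meters up, at rest
    while h > 0.0:
        h, v = step_crate(h, v)

Multiply that by every object in the game world, make it run sixty times a second, and you can see why FPS development rewards engineering discipline over storytelling flair.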

Most of all, the adventure genre faded because its developers attempted to play the game on FPS turf. Though Grim Fandango was critically acclaimed and used 3-D, from a gameplay point of view it was a step backwards. Rather than spending time on the puzzles or dialog, gamers spent most of their time driving their characters around, WASD style, something that had nothing to do with the point of the game. It led to platform-itis, a dumbing down of the genre into a mere hunt for collectible items in the game world. And how much fun is that, after the age of ten?

Well, quite a bit, given the success of the MMORPG. But while the experience is fun, it just doesn't have the depth we old-schoolers remember. So sure, I'll finish this post and grind out a few more levels in World of Warcraft, but secretly, I'll wish I could type "open mailbox" into a stark, black screen again.

Wednesday, September 07, 2005

Gaming History

My wife played a few of Sierra's Quest series games back in the day, as did I. Actually, we both still remember the heyday of the text adventure. (We didn't need no stinkin' graphics back then.) That's why I was surprised to learn that, despite our shared love for adventure games, she'd never played any adventures by LucasArts.

I know, I know, it's shocking.

So I determined that she needed to experience all these fantastic games, if only so she could understand some of the odd things I said at the oddest moments. Digging through my archives, I managed to find copies of Curse of Monkey Island and Escape from Monkey Island, both of which still run under Windows XP. We're midway through Escape now. But what of all the older LucasArts gems? For those, I had to turn to ScummVM. I found out about it at Ron Gilbert's blog, Grumpy Gamer. (Ron Gilbert was the mastermind behind the original SCUMM engine, which, in various incarnations, drove classics like Maniac Mansion, the Monkey Island series, several Indiana Jones games, and others.) Suddenly, this vast treasure trove of gaming goodness opened up for us.

This made me realize that, in addition to helping me relive my childhood, the ScummVM team, and the emulator crowd in general, are doing us a great favor when they port older games to new platforms. This is gaming history. These earlier games, crude though they may seem at times, helped to define the genres we know today. Ten, twenty, or one hundred years from now, people will ask, "How did all this get started?" The firsts of other great media revolutions have been lost in the mists of antiquity. It's good to know that our computer, arcade, and console gaming history will not be lost as well.

Sunday, August 07, 2005

Enough with the Undead, already!

I realized that in my last few posts about games, I've been talking almost exclusively about first-person shooters (FPSs). I think my point applies to other genres as well, though. The role-playing game genre, for instance, also relies on the twin precepts of otherworldly creatures and very little upstairs. Substitute kobolds or orcs for zombies, and a +2 Sword of Penultimate Butt-kicking for the BFG-9000, and my comments still apply.

One of the things BioWare and the (now sadly departed) Black Isle Studios had to do to the D&D ruleset to make their games work on computers was to ratchet down the experience points (XP) gained for every creature killed. Why? Because they threw hordes of very stupid monsters at you. Rather than have small bands of intelligent opponents, they opted for waves of enemies. It's simply easier to do. And hey, one orc looks just like another, right?
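Some back-of-the-envelope arithmetic shows why the ratcheting is necessary. The numbers below are invented for illustration; they aren't taken from any actual ruleset:

    # Invented numbers, purely for illustration.
    XP_PER_ORC = 50          # a tabletop-style per-kill award
    TABLETOP_KILLS = 6       # a small band of intelligent opponents
    CRPG_KILLS = 300         # waves of enemies over the same stretch of play

    print(TABLETOP_KILLS * XP_PER_ORC)        # 300 XP: reasonable
    print(CRPG_KILLS * XP_PER_ORC)            # 15000 XP: levels fly by
    print(CRPG_KILLS * (XP_PER_ORC // 10))    # 1500 XP: the ratcheted-down award

If every one of those three hundred orcs paid out like a tabletop orc, the party would rocket through levels; scaling the per-kill award down keeps the leveling curve sane.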

Now, I appreciate a good hack-and-slash RPG; my wife and I are girding ourselves to tackle Dungeon Siege 2 soon. It just seems strange that even the less hack-y and slash-y entrants in the genre rely so heavily on combat with hordes of identical clones. With all the time game developers put into creating expansive skill trees, why is killing someone the default solution to every problem? When critics regularly abuse the same tired one-man-versus-an-army premise, why do game publishers keep pushing the same old story? It only makes sense if one takes the view that constructing a game this way must be easy, while other avenues are more difficult.

There are, of course, pockets of innovation out there. I salute the courage of those game developers and publishers. I just wish there were more of them.

Sunday, July 31, 2005

Return of the Living Dead

I thought this would be a two-part series, but I realized I had more to say on the subject of the zombification of the gaming industry. Here are links to the previous posts regarding zombie graphics and zombie AI.

It's not that I have anything against zombies themselves. (Other than that those rotting, lumbering, infernal abominations are trying to harvest my mind, that is.) It's that we've seen game designers use these tricks before.

The hordes of bad guys used to be aliens or demons. They're a much easier task for the graphics and animation department. We don't expect to be able to read the emotions on the puss of a slobbering green monster. In fact, in many games, these creatures never change expression at all. Contrast that with how we'd react to a human-looking opponent whose face betrayed nothing but a perpetual Zen-like calm in the midst of horrific carnage.

We cut aliens and monsters slack in the behavior department as well. We assume that the monster just isn't smart enough to realize that charging directly into the barrel of a BFG-9000 might not improve its chances of survival. We expect aliens to follow their own, alien logic. We don't expect them to have human reactions. Besides, they're far from home and in unfamiliar territory. It might not occur to an otherwise highly advanced alien that standing next to a primitive barrel of flammable liquid is a bad idea.

In earlier generations of computer and console games, the excuse was that the technology simply wasn't capable of these feats of graphics and gameplay. To return to zombies, aliens, and demons in the current era of gaming is simply laziness. Game designers have exploited these tired old story devices ad nauseam. We should demand more.

Thursday, July 28, 2005

Brains! We must have brains!

As discussed in my previous blog entry, the wave of zombie films and games stems from two limitations in computer technology. The first has to do with graphics and animation, and the second with artificial intelligence. The latter is the subject of this entry.

Game designers fall prey to the zombie temptation because virtual actors have to move and adjust to the world around them in real time. Conscious beings do this quite easily. Natural selection weeds out the animals and human beings that fail to adapt to their surroundings. When we see a dog or a human, we expect them to react to the world around them.

Zombies are, by definition, unconscious. Their skulls contain decaying, rotting, stinking grey goo. So it's to be expected that a pack of the living dead won't react -- or won't react quickly -- to the fact you've just tossed a grenade in their direction.

The game player will forgive a zombie if it can't figure out how to jump over a low obstacle or open a door. After all, it's just a shambling corpse. You don't really expect it to pick up one of the numerous weapons lying strewn about the floor and start firing back at you. You would expect that from a living, breathing human being, however.
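In code terms, the difference looks something like this. It's a deliberately crude sketch; every name in it is hypothetical, and no real game engine works exactly this way:

    # Why zombie AI is cheap: one rule versus many.

    def zombie_think(zombie_pos, player_pos):
        """One rule covers everything: shamble toward the player."""
        return "shamble_left" if player_pos < zombie_pos else "shamble_right"

    def soldier_think(soldier_pos, player_pos, grenade_nearby, has_weapon):
        """A believable human must notice and weigh its surroundings."""
        if grenade_nearby:
            return "dive_for_cover"
        if not has_weapon:
            return "grab_nearest_weapon"
        if abs(player_pos - soldier_pos) > 20:   # too far away to shoot
            return "advance_to_range"
        return "attack_player"

    print(zombie_think(0, 10))                                          # shamble_right
    print(soldier_think(0, 10, grenade_nearby=True, has_weapon=False))  # dive_for_cover

The zombie needs one line of decision-making; the soldier needs a case for every situation a player might throw at him, and the sketch above barely scratches the surface.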

So, the game designer thinks to herself, zombies it is! It neatly explains the unnaturalness of the animation, the lack of facial expression, and the poor AI found in most games today. After all, it's only a zombie! It has no brain. That's why it wants yours.

The Grinning Mask of Death

So there's a wave of zombie-themed computer and video games on the horizon. This naturally follows a recent wave of zombie movies. Why are the undead all the rage?

I have two complementary theories that I'm going to discuss in this and the next blog entry. The first has to do with graphics and the second with the artificial intelligence (AI) that determines behavior. First, the visual aspect of the zombie craze.

Computer graphics have improved to the point where we can, fairly accurately, re-create human faces in 3D. The problem is, human beings are acutely aware of other human faces. We can read lips, sense small changes in expression, and most of us can remember a single face among thousands of others for years, even after we've forgotten everything else about that individual. Millions of years of evolution and social pressure have hardwired facial recognition into the human brain. We're finely tuned for it. We subject the human face to such intense scrutiny that we can sense even the smallest lapse in fidelity.

The more something looks like a human face, the more these instincts kick in. While we forgive cartoons and caricatures, we react badly to a human-looking face that rings false. Ironically, our quest for realism in computer graphics can actually make in-game characters seem less real. The problem is recognized in the robotics field and has a name: the Uncanny Valley.

A good discussion of this phenomenon, and its relation to anthropomorphism, can be found in Dave Bryant's essay. In it, he considers primarily the visual aspects of the uncanny valley, but I'd like to argue that the valley is more than just visual; it also extends to motion.

This is, after all, what makes acting an art and a discipline; it's not simply a matter of a human actor memorizing a bunch of lines and saying them on cue. An actor must get the body language right as well, and facial expression is the loudest part of that body language. Most audiences can tell whether a smile is real or faked. It takes a really, really good actor to get the whole package -- movement, voice, and expression -- working together to fool the audience into thinking they're looking at something real.

We can't do this in the computer world convincingly; at least not in real time. It requires teams of artists and animators to get the look. It requires a well-written script and a talented voice actor to get the sound. It requires a director who can put both these things together. It's really, really hard to get right.

What do you get if you get it wrong? Something unnatural and waxen. Something that doesn't look quite real, or quite alive. You get the undead. Enter the zombie game or movie. We see these characters because they are easy to do with current technology. It's easy to animate a stiffly moving character. If your computer-generated extras are going to walk like they have rigor mortis anyway, why not call them zombies?

Sunday, July 24, 2005

Games

Both Paula and I are big fans of games: board games, console games, computer games, and old-fashioned running-around-in-the-outdoors games. We spend time on Board Game Geek. We read GameSpot and PC Gamer. We gather with friends to play games on- and offline. Now, generally, I'd rather be playing games than writing about them, but I'm going to give this crazy blog thing a try. Over the next few days, I'm going to post a series of articles about game-related topics. Maybe I'll get in the habit of posting, and maybe I'll actually write a few things worth reading. Then again, I've never kept a journal for longer than a week. I blame those darn games.

Friday, March 04, 2005

Speak.

Hey Dean, I thought it was finally time to get a blog. Instead of emailing distractions, we can post distractions online for the world to see. I always wanted to be published... We can start with last week's distraction about speech. It was a cool site.
-Paula
http://www.pbs.org/speak/
Carnegie Mellon gets mentioned in regard to its drama department's efforts to teach Standard American English. The "from sea to shining sea" section includes articles about Pittsburghese and the differences between R-ful Southern and R-less Southern.