BACK TO THE HERD

GAMING: PURE FUN (?)

For the longest time, videogames were considered at best a small corner of the mass media realm. Years after gaming became a huge industry, some standard Intro to Mass Media textbooks considered it only in passing, usually as part of the chapter on the Web.

 

That was unfair, because for decades the videogame industry has been bringing in billions of dollars every year. A popular game can turn profits as big as a popular movie's and far, far bigger than the average book's. And as we'll see, games are an important part of big media companies' marketing mix.

 

Sure, they’re mostly about entertainment. But so are movies. So is music. So are a lot of television shows.

 

So welcome to full-fledged mass medium status, gaming. You’ve earned it.

 

 

Boop. Boop. Boop. Boop.

 

Though it wasn’t technically the first videogame ever, Pong was the first game to make it big both in arcades and in homes.

 

By 21st century standards, Pong sucks. It’s basically a slow-moving, two-dimensional version of Ping Pong. If you’re playing against the computer rather than an actual opponent, the machine doesn’t miss unless the difficulty level is set so low that it deliberately lets you score. Thus either it beats you every time or you beat it only because it’s programmed to lose.
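To see what "programmed to lose" means in practice, here's a minimal sketch of the logic in Python (purely illustrative; the real Pong was built from dedicated hardware circuits, not software, and these names and numbers are invented):

import random

# Sketch of a Pong-style computer paddle. Hypothetical: the original
# machine used hard-wired logic, not code like this.
def move_paddle(paddle_y, ball_y, max_speed, miss_chance):
    """Track the ball perfectly unless the difficulty setting
    tells the paddle to deliberately drift off target."""
    if random.random() < miss_chance:  # low difficulty = high miss_chance
        target = ball_y + random.uniform(-30, 30)  # wander off the ball's path
    else:
        target = ball_y  # perfect tracking: unbeatable
    # Move toward the target, capped at the paddle's top speed.
    delta = max(-max_speed, min(max_speed, target - paddle_y))
    return paddle_y + delta

With miss_chance set to zero the machine never misses; raise it and the paddle "decides" to lose for you. Either way, the outcome is the machine's choice, not yours.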

 

Things improved a bit when game designers started producing variations on the theme, such as Hockey (more than one paddle and a smaller goal area) or Breakout (the ball destroys bricks in a wall). But even these wouldn't exactly pose much of a threat to the latest installment of Halo or Grand Theft Auto.

 

In 1972, however, it was innovative stuff. Allan Alcorn, an engineer for a small startup called Atari, designed Pong as a practice exercise. The company saw marketing potential in the game, which made several key improvements on previous "table tennis" computer games (such as a bounce angle that changed depending on where the ball struck the paddle). It was a modest success in the arcade and bar markets. And after Atari struck an exclusive deal with the Sears retail chain, Pong struck it big in the home market.

 

The age of videogames was born.

 

 

You are eaten by a grue

 

For the first three decades of their existence, computers had no graphics. At first they didn't even have screens (output was printed on paper) or keyboards (input came from punch cards). Even when they did start using monitors, they could only display the crudest of pictures made up of alphanumeric characters, sort of like big, elaborate text emoticons. And of course animation was a complete impossibility.

 

But that doesn’t mean there weren’t any computer games. Text-only computers were ideal for text-only games, also known as interactive fiction. The game begins by describing your location, and you type in commands. For example, the game Adventure started something like this:

 

Computer: You are standing at the end of a road before a small brick building. Around you is a forest. A small stream flows out of the building and down a gully.

 

You: Enter building

 

Computer: You are inside a building, a well house for a large spring.

 

Adventure – also known as Colossal Cave – was the first text game, written in FORTRAN (no easy feat, since FORTRAN was designed for number crunching rather than handling text) by defense programmer Will Crowther and later improved by Stanford grad student Don Woods. Like many "experimental" programs, it was open source and distributed for free.

 

In 1979 a group of programmers from MIT founded Infocom, the first successful company specializing in text games. The company's first offering, Zork, was a lot like Adventure. It too involved exploration of a large, complicated cave populated by trolls, dragons and vicious little monsters known as grues. But Infocom's parser – the software that interpreted commands typed in by players – could understand sophisticated commands such as "pick up the red book and put it in the bag" rather than simple, two-word entries such as "get book."
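To get a feel for the difference, here's a minimal sketch of the simple two-word style of parser in Python (hypothetical code; neither Crowther's FORTRAN nor Infocom's actual command interpreter looked like this):

# A bare-bones "VERB NOUN" parser of the "get book" variety.
VERBS = {"get", "drop", "go", "open", "read"}
NOUNS = {"book", "lamp", "door", "bag", "sword"}

def parse(command):
    words = command.lower().split()
    if len(words) != 2:
        return None  # anything longer than two words is rejected
    verb, noun = words
    if verb in VERBS and noun in NOUNS:
        return (verb, noun)
    return None

print(parse("get book"))  # ('get', 'book')
print(parse("pick up the red book and put it in the bag"))  # None

Infocom's parser, by contrast, had to break that second command into multiple actions, resolve "the red book" against the objects in the room and figure out what "it" referred to: a much harder problem.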

 

The company eventually expanded its offerings to dozens of games, including a version of The Hitchhiker’s Guide to the Galaxy co-written by Douglas Adams, the author of the novel. Marketed as an alternative to mindless arcade blasters, Infocom games stressed imagination as the ultimate graphics system. The strategy worked well into the 1980s, when the company was bought up and eventually closed down by Activision.

 

Even in today's graphics-intensive market, interactive fiction remains popular with a niche audience.

 

 

The big crash of ’83

 

The videogame industry got off to a rocky start. In 1977 it experienced an industry-wide "crash," basically because there were so many different versions of Pong on the market that no one company was making money selling theirs. Game manufacturers dumped their products at bargain basement prices, which of course made the whole business unprofitable for a while. When the dust settled, only Atari and Magnavox remained.

 

Then in 1983 the industry crashed again. Once more part of the problem was over-saturation. Atari dominated the market and assumed that it could therefore sell consoles at a loss and make up for it with game cartridge sales. That left the company vulnerable to any glitches in the market. Atari underpaid its designers. And the games, well, I don't want to say straight out that they sucked, but they certainly didn't measure up to their counterparts in arcades. The home version of Pac-Man proved to be especially disappointing.

 

But then along came E.T. Steven Spielberg’s movie was a mega-hit at the box office (it made back its production budget in its opening weekend and still holds the number six spot on the list of top-grossing movies in the domestic market). Seeing the potential for a profitable tie-in, Atari paid $25 million for the right to turn the lovable alien’s story into a videogame.

 

Then the company gave designer Howard Scott Warshaw only five weeks to create the game (typical production cycles at the time were closer to six months) so it could be on shelves in time for Christmas. Predictably enough, it was a colossal failure. It sold briskly at first, but of the 4 million copies produced, 3.5 million were returned to the company. Atari eventually bought space in a New Mexico landfill and buried the whole mess.

 

In a more robust game industry, this would have been an unfortunate glitch. But with the novelty of home gaming wearing off and Atari's multi-million-dollar profits turning to multi-million-dollar losses, E.T. was the final nail in the coffin. The console business in the United States was essentially dead until the Nintendo Entertainment System, already a big hit in Japan, successfully crossed the Pacific two years later.

 

 

Going mobile

 

The marketing potential was obvious. Prior to the age of the cell phone, people on the go frequently found themselves on subways, in waiting rooms or between classes with nothing but time on their hands. If videogames could somehow be made portable, the demand would be huge.

 

The challenge was to create something that people would actually want to play. When gaming was new, hand-helds tended to be pricey, and each played only one game. Further, graphics were often limited to small grids of lights, requiring a lot of imagination to get "football" or "alien attack" out of them. Such games were popular, but not huge like console games.

 

Nintendo changed the market in 1989 with the Game Boy, the first commercially successful portable game unit with interchangeable cartridges. The original Game Boy came packaged with Tetris, but players could buy an ever-expanding set of other games and swap them out, just as home systems had done for years. Later improvements included more compact cases and color screens.

 

The Game Boy remained Nintendo's handheld platform until 2004, when it was replaced with the dual-screen DS. That same year Sony came out with the PlayStation Portable, marking the first time Nintendo faced serious handheld competition from another company.

 

 

The hungriest man in show biz

 

As with Pong, there isn't much to Pac-Man. The player guides a yellow circle with a mouth around a maze, trying to eat all the dots without being captured by ghosts. Eat a power-up, and the ghosts become the prey for a few seconds.

 

However, the little yellow guy blazed trails both in and outside his maze. The game was the first to include power-ups, and the use of non-violent “stealth” rather than the random blasting of games such as Space Invaders helped popularize it with women, younger players and parents.

 

But beyond the videogame world, Pac-Man was a genuine cultural phenomenon. Embraced even by people who normally didn't play games, he became the first instantly recognizable videogame "character": the unofficial mascot of a new tide of gaming popularity that was eating so many quarters in arcades that the United States briefly experienced a quarter shortage. He also inspired many licensed products, including "Pac-Man Fever," a concept album whose title track hit the top ten.

 

 

Plumber power

 

Originally he didn’t even have a name, just another anonymous videogame guy trying to save a princess from a giant barrel-flinging ape. But by his next game appearance, he’d acquired a name (Mario), a profession (plumber), a brother (Luigi) and a following.

 

Mario followed in Pac-Man's footsteps (if Pac-Man had feet, that is). He's appeared in dozens of games over the years, from the original side-scrolling platform hoppers to more elaborate 3D adventures and racing games.

 

However, Mario had a couple of advantages over his pizza-shaped competitor. For starters, he was a person rather than a puck, which made him easier to license for things such as a live-action movie. And unlike Namco, which marketed Pac-Man to game systems from several manufacturers, Nintendo kept strict control over the Mario franchise. If players wanted Mario, they could only get him from one company.

 

Further, gamers everywhere owe a debt of gratitude to the plucky little plumber. In the wake of the Videogame Crash of 1983, Mario’s success in Japan crossed the Pacific and helped revive the U.S. game market.

 

 

Lair of the dragon

 

Game graphics in the early 1980s were a big improvement over the blocky pixels of Pong. But nobody would mistake the flat mazes of Pac-Man or the spidery vectors of Asteroids for the image quality of even the cheapest cartoons.

 

In 1983 Hollywood made the first move to change the way videogames approached visual content. Veteran Disney animator Don Bluth hit on the idea of building an arcade game around a laserdisc (a larger forerunner of the DVD) rather than computer-generated graphics. The result – Dragon's Lair – was a combination of interactive fiction and animation.

 

The player steers Dirk the Daring through a series of animated sequences, using a joystick to choose the character's actions. Making the right move at the right time advanced the animation to the next sequence. A wrong turn resulted in gruesome animated death.
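Under the hood, that structure is just a branching list of clips. A rough sketch of the idea in Python (entirely hypothetical; the actual cabinet drove a laserdisc player with dedicated hardware, and the scene names here are invented):

# Each scene names one correct input, the scene it leads to,
# and the death clip played for any wrong move.
scenes = {
    "drawbridge": {"correct": "up",   "next": "hallway",
                   "death": "bridge_death_clip"},
    "hallway":    {"correct": "left", "next": "treasure_room",
                   "death": "hall_death_clip"},
}

def play(scene_name, player_input):
    scene = scenes[scene_name]
    if player_input == scene["correct"]:
        return scene["next"]  # right move: the animation advances
    return "play " + scene["death"]  # wrong move: gruesome death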

 

Because the game showed animated clips rather than using computer-generated graphics, game play was limited. Learning to tell the right moves from the wrong ones could cost a few quarters, but once the player had the pattern down the game wasn't particularly challenging.

 

On the other hand, the adventures of Dirk the Daring proved that there was a demand for games that went beyond simple pixel graphics, paving the way for visuals that looked more and more like the real world (or at least the movie version thereof).

 

 

Stealin’ artifacts and lookin’ good doing it

 

By the mid 1990s game graphics were sophisticated enough to at least vaguely simulate a three-dimensional, real-world environment. But this new generation of games lacked an iconic character, a Pac-Man or Mario that people could instantly recognize.

 

Enter Lara Croft. She made her debut as the protagonist of the first game in the Tomb Raider series in 1996. Eidos Interactive took a risk by releasing a game with a female player character, but Croft was designed to appeal to men as a physically attractive woman and to women as independent and intelligent. In a market full of games in which women often played roles no more significant than being kidnapped by giant apes and lying around until rescued by plumbers, Croft was an innovation.

 

She also became arguably the world's first virtual sex symbol. She appeared on the covers of several celebrity magazines, her status as a cultural icon helping to sell games and popularize licensed products such as the Tomb Raider movies. That status has also caused controversy, as many critics argue that her physical "attributes" overshadow her potential as a positive female role model.

 

 

A universe of alter-egos

 

The Sims began life as tiny dots in the large, player-created cities of SimCity. The original game and its sequels appealed to wannabe urban planners, but for a big hit Maxis (and later Electronic Arts) needed to get closer to “street level.”

 

In The Sims, players control people with at least some connection to the real world. Alien invasions and other excuses for mass carnage are few and far between. Instead, the game simulates life in the suburbs. Players buy homes, clean house (or not), fix and eat dinner, go to the bathroom – just about anything real people do. Other characters in the game can be generated by the game or controlled by other players via the Internet.

 

Expansion packs allow players to expand their options, get pets, go out on dates, even perform magic. The game also has three sequels and a medieval spin-off.

 

The Sims appeals to a new audience: casual players. Its "ordinary life" environment is accessible to people who aren't interested in running through dungeons killing everything that moves. The Sims strategy has even crossed over into social media with combination game and networking sites such as Second Life.

 

 

Madden mania

 

For the longest time videogames and professional sports simply didn’t mix. In the early days the limits of game design caused major problems. Trying to get little pixel lumps to move around a screen was nowhere near as good as watching an actual football game.

 

Even when graphics started to improve in the mid 1980s, sports games lagged behind their counterparts in the shooter and platform-hopper realms. They tended to be elaborate, imaginary rotisserie-league exercises in team management with game simulation tacked on as an afterthought. The result was like neither watching nor playing in an actual game.

 

Electronic Arts ushered in a new era of sports gameplay with John Madden Football, a franchise of games first released in 1988. A former NFL coach turned broadcast commentator, Madden insisted that any EA game with his name on it be as realistic as possible.

 

Though the development process began in 1984, the first version of John Madden Football didn't hit shelves until four years later. Limitations on home computing power initially made it impossible to put more than five or six players per team on the field, while Madden's insistence on realism required the full 11.

 

Over the years the realism steadily improved and game play became more sophisticated. In 2003 the company began the practice of releasing a new version every year, which allowed for up-to-date player rosters as well as additional features such as franchise mode (playing an entire season or even multiple seasons rather than just single games). As soon as audio technology allowed, Madden himself as well as his real-life broadcast booth partners recorded commentary to go with the games, a practice he kept up until his broadcast retirement in 2009.

 

And of course graphics steadily improved as well. Indeed, in answer to the you-are-part-of-the-game visuals in Madden and other sports games, TV broadcasters have changed their camerawork to include views of the field never before possible.

 

 

Wow, those birds are angry

 

Games are expensive. Small wonder, considering that they tend to require developers to create entire elaborate, three-dimensional worlds full of sounds and images. Actors have to be hired for voice work, which can be pricey if celebrities are required. Programmers and other technicians don’t come cheap, either. And on top of everything else, successful games have to do something that other games don’t. Thus high prices tend to be the rule rather than the exception.

 

Exceptions can be found, however, particularly in the world of “Flash games” (named for the software used to generate many of them). These games are simple, typically two-dimensional throwbacks to the early days of gaming. Game play doesn’t require hours to master or involve complicated button combinations. Celebrity voiceovers are hard to find. But then so are the large price tags. Many of these games can be found online for free or for small fees (usually less than $10).

 

The current champion in this category is Angry Birds, created by the Finnish game company Rovio. Game play is extraordinarily simple: the player uses a slingshot to fling birds at structures full of egg-stealing pigs. It was originally designed for the iPhone, though it was subsequently released for several other mobile devices.

 

The Birds currently have more than eight million followers on Facebook, and they've inspired many pop culture adaptations, from green pig shower shoes to an illegal knock-off theme park in China.

 

 

Down at the arcade

 

Many media begin life in public before finding their way into homes, and videogames are no exception. Indeed, a spot was reserved for them before they even existed.

 

Starting in the 19th century, penny arcades provided customers with access to a wide variety of coin-operated amusements. Big draws included pinball, automatic fortune tellers and redemption games such as shooting galleries and skee ball. Needless to say, eventually the “penny” part got dropped from the name.

 

When videogames first hit the consumer market, they required too much hardware for home use. But their cabinets fit in nicely next to the pinball machines in arcades. As game quality improved, they swiftly became the main draws. The age of the video arcade was born. The era lasted from the late 1970s to the mid 1990s, when home game system quality equaled and then surpassed what could be offered in a stand-alone system.

 

However, cabinets still offer advantages over home systems. Some games require – or are at least more fun with – special equipment such as rifles, punching bags or light-up dance floors. Stand-alones also continue to flourish in bars and other social settings where video golf and trivia games appeal to people having fun in a group.

 

 

The world in a box

 

Early on, game manufacturers produced home units. Trouble was, at least to start they weren’t very good. They played Pong and … well, they played Pong. Once the novelty of having an “arcade” in the living room wore off, the console got stashed in the closet and forgotten.

 

But then along came the “second generation” of home gaming consoles, most notably the Atari 2600. Rather than suffer the limits of whatever games were built into the “motherboard,” these new boxes accepted cartridges. Thus when you got tired of one game all you had to do was pop the cart out and replace it with something else. It was a considerable savings over buying a whole new system just to change games.

 

The new technology also introduced a new business model to the gaming industry. Companies frequently sold their consoles for less than the boxes cost to develop and manufacture. Then they’d make up for the loss by taking a percentage of the money paid for all the games created for their systems. Though this “loss leader” scheme hasn’t always worked flawlessly (it contributed to the depth of the Crash of 1983), companies still use it today.
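The arithmetic behind the scheme is straightforward. A back-of-the-envelope sketch in Python (all figures invented for illustration):

# Loss-leader math with made-up numbers.
console_loss = 100.0    # dollars lost on each console sold
royalty_per_game = 8.0  # console maker's cut of each game sold

# Games each owner must buy before the console maker breaks even:
break_even = console_loss / royalty_per_game
print(break_even)       # 12.5 games per console

If owners stop buying games, say because the games aren't any good, as happened in 1983, the up-front loss on the hardware never gets made up.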

 

The big limitation of early cartridge systems was memory. Atari 2600 carts packed only 2 KB of ROM (by comparison, the entry you’re currently reading, if saved as a Word file, would be nearly 40 times that size). Thus games were a bit clunky, often merely slow, unattractive shadows of their relatives in arcades. But by the late 80s technology had improved enough to make home games worth playing, and by the mid 90s they equaled and then passed arcade games in the quality department.

 

Discs eventually replaced cartridges. And then discs were largely replaced by Internet downloading, which of course allows for file sizes larger than discs can hold. So for the first time since the 1970s the limit on game size, quality and complexity isn’t the transfer medium but the console itself.

 

 

Games on computers

 

“Two hundred dollars for a lousy game? Do you think I’m made of money? Why don’t you just play Monopoly with your little sister? You can do that for free.”

 

Every kid in the United States has at one time or another fallen victim to the Classic Parent Videogame Block. Even if everyone else in the universe already has the latest and greatest, a box that does nothing but play videogames isn't always at the top of the budget list for a family trying to make ends meet.

 

A computer, on the other hand, has more than one use. Mom and Dad can buy one for serious grownup stuff, and when they’re done the kids can use it to play games. Or let’s be honest: grownups can buy computers to do work but end up playing games themselves.

 

Games have been available on personal computers for as long as there have been personal computers. Indeed, some of the earliest home computers such as the TRS-80 (known affectionately by its users as a Trash-80) were popular among electronics hobbyists specifically because of their ability to play simple videogames.

 

As PCs gained popularity, PC games surfed the wave. But problems arose due to the non-uniform technology of Windows computers. Many games – especially those with complicated graphics – required specific hardware configurations not present in all PCs. Macintosh hardware was more uniform, but because the Mac market was smaller it was less attractive to game marketers. Many games available for the Mac were “ports” of PC games, which could also cause technical problems.

 

In the 21st century PC prices have come down far enough that even semi-serious gamers can purchase a machine customized with faster graphics cards and other hardware mods to make game play more enjoyable. Indeed, players with computers customized for gaming can gain a significant advantage over opponents using general-purpose machines.

 

 

Massively multiplayer

 

Not entirely satisfied with your life? Wish there was an alternate universe out there someplace where you could lead a more exciting existence? Then MMORPGs (or MMOs for short) may be the place for you.

 

Massively Multiplayer Online Role Playing Games are the Internet descendants of paper-and-dice games such as Dungeons and Dragons. The bridge between the "dead tree" games and the online versions was the MUD – the Multi-User Dungeon – popular on bulletin board systems for a while.

 

Probably the most famous MMO is World of Warcraft. WoW players create characters – called avatars – for themselves. Then they use their avatars to interact with other users in the game’s online environment. Non-Player Characters controlled by the game set up quests for players, sending them in groups on missions to kill bad guys, recover objects and amass virtual wealth and experience.

 

A successful MMO can be a massive moneymaker for its creator. Companies typically charge not only a one-time purchase price for the software to play the game but also a monthly fee to stay connected to the world. Further, in-game economies can be connected to real-world funds, allowing players to exchange actual money for virtual cash or other valuables in the game.

 

Multiuser games have even crossed over into the realm of social media. The Second Life world is set up like a game with avatars and virtual environments. But players use it more for social interaction and information exchange than for dungeon crawling or dragon slaying.

 

Thus MMOs have become a high-dollar corner of the gaming industry. They bring in more than a billion dollars per year. WoW alone currently has around ten million subscribers.

 

 

From concept to console

 

Good game design is a tricky business. In many ways, development starts out the same way in the game world as it does in movies. The project needs characters: good guys and bad guys. The good guys need a goal of some kind, and the bad guys get set up to keep them from reaching it.

 

Obviously complexity will vary. Tiles in a puzzle game don’t need a lot of character traits. But the characters in an adventure game need even more personality than their movie counterparts, because players will spend a lot of time interacting with them.

 

Once the idea is solid, bust out the software. Big money game design companies employ legions of programmers, modelers, artists, writers and so on. Smaller operations require a jack-of-all-trades approach. Fortunately, creative types can use game design software packages to do a lot of the “low level” code work.

 

Once the “rough draft” is complete, extensive testing has to be done. Testers check not only for errors in the system (glitches that can cause the logic to fail or the game to crash) but also for playability. Does the game move smoothly from challenge to challenge? Do the characters play their parts well? Is it fun?

 

Assuming the game survives testing (and many don’t), the marketing folks take over and try to figure out how to convince potential players that the new game is the greatest thing ever.

 

 

Game genres

 

In the gaming world, players (and companies) differentiate by genre. Unlike movies (where genre is determined by what kind of story is being told), games divide up based on how the game is played. Eight quick examples (of the dozens out there in the market):

 

Shooters – Back in the Space Invaders era, most games were simple shooters. Now the genre is divided into two parts. First Person Shooters (FPS) show the world from the point of view of the player’s character, while in Third Person Shooters (TPS) the player character appears on the screen (usually toward the bottom) and the viewpoint follows him or her around. No matter what the type, the main goal of a shooter is – surprise surprise – to destroy things.

 

Role Playing Games – RPGs require more complex interaction with the game’s environment than a shooter does. Behavior in the game is more like actual human conduct in the real world. And in MMOs, the player interacts not just with computer-generated characters and situations but also with other players.

 

Puzzles – Though they don’t generally feature a lot of plot or character development, puzzles are great for when you’re stuck in the waiting room at the dentist’s office with nothing but a smart phone for company. Game companies like puzzles as well, because they don’t generally cost as much to develop.

 

Platform hoppers – These games aren’t as popular as they used to be. As the name implies, players advance to the next level by jumping from spot to spot.

 

Simulators – Starting with flight trainers back before computers even existed, simulators try to at least semi-realistically reproduce some aspect of the real world. Flight simulators are the most common.

 

Sports – All kinds of sports have been turned into videogames. Big professional sports (NFL, NBA, MLB) tend to be the most popular. Increased graphics quality in recent console “generations” has greatly improved the realism of game play. If nothing else, the players now look at least a little like their real-world counterparts.

 

Music – Games such as Guitar Hero invite players to follow along with popular music, using their controllers to roughly approximate playing instruments. Some games even come with instrument-shaped controllers.

 

Strategy – Found more often on computers than on consoles, strategy games require complex manipulation of elements. They can be as "simple" as a game of chess or Risk, or they can be more complicated recreations of cities, businesses and battles.

 

 

Sequels, franchises and licensing

 

What makes a game easier to sell? Connections to something else people are already buying. Thus was born the world of sequels, franchises and licensing.

 

Big media companies love synergy: ties between one product and another that help them both sell better. So if a corporation's movie division is getting ready to come out with a big summer blockbuster superhero movie, you may rest assured that the videogame division will have the game version on shelves in time for Christmas (if not before). Even if it isn't the best game, it will still likely sell well.

 

Even absent ties to other media, new releases often have connections to older games. As of this writing, the Angry Birds franchise has a dozen titles across half a dozen platforms. Expansion packs also extend the life of games that are already popular.

 

Popular games can also be licensed to other markets. Thus the Angry Birds games have spawned books, a big-box-office movie, plush toys, candy and so on.

 

 

The Big Three

 

Overall the videogame market has some room for small developers. Particularly in the post-disc world of game delivery, marketing games doesn’t necessarily require huge budgets. However, large corporations dominate the portions of the industry that require large investments. No big surprise there.

 

The console market is a perfect case in point. It’s dominated by the Big Three, large companies with little competition other than each other.

 

The “elder statesman” of the consoles is Nintendo, a Japanese company that began life in the 19th century manufacturing playing cards. Emerging from the ashes of early 1980s industry shakeups, the company’s NES box was the industry’s first long-term success. It also had a string of hits with portable devices from the Game Boy to the DS.

 

To be sure, the company hit a bump or two along the way (anyone remember the GameCube?). But Nintendo’s latest, the Wii, is doing okay (though not quite keeping up with the competition).

 

After decades of success in other corners of the media world, Sony entered the game market in 1994 with the original PlayStation. Its follow-up, the PlayStation 2, is the best-selling game console in history. Its portable PSP and latest console (the predictably named PlayStation 4) have also been big victories for the company.

 

Washington-based Microsoft is the only non-Japanese member of the Big Three. Fortunes from sales of the Windows operating system and other personal computer software bankrolled the company’s effort to develop and market the Xbox. The Xbox One X is the company’s current entry in the console market.

 

 

A team effort

 

As already noted, the game development divisions of large companies tend to hire specialists, people who are great at one particular part of the design process rather than merely good at everything.

As most big company games start more or less from scratch, they require programmers to write the code. This specialty can be divided into sub-specialties: coders who specialize in game play, special effects, environments and so on.

 

Before the coders can get started, people on the creative side have to come up with a concept. This requires writers to invent plots and characters. It also requires artists to decide what game elements will look like.

 

Once the story and visual “models” are in place, techies get busy making it work. Modelers and other CAD specialists create computer versions of the characters and environments. And then of course the programmers build the game’s structure incorporating the artists’ designs and the writers’ plots.

 

When the game is complete, professional testers make sure it works the way it’s supposed to. And then marketing pros take over and (hopefully) get it to sell.

 

Oh, and just for what it's worth, some folks out there actually get paid to play videogames. As in other professional sports, you have to be good at what you do in order to succeed. Unlike in other professional sports, pro gaming isn't necessarily a quit-your-day-job career option.

 

 

MSG CASE STUDY: GENDER

Gamergate

 

App designer Zoë Quinn got an unpleasant surprise after she released Depression Quest, a work of interactive fiction. Though the game itself received mostly positive reviews, it sparked a backlash that soon blew up into the mess called "Gamergate."

 

The problems started when an ex-boyfriend posted a long, harsh blog entry about her. Among other things, the post claimed that her game received positive press from a reporter she was dating (even though in truth he never wrote anything about the game).

 

From there things spiraled out of control. Using the #gamergate tag, trolls began harassing her. They doxed her (publishing personal information such as her address and phone number). They swatted her (using 911 to make fake reports about crimes being committed at her home). Many messaged or called her directly, making threats ranging from rape to murder.

 

Gamergate spread from there, targeting other women in the game design and journalism businesses. It called attention to serious gender issues in the gaming industry, including the under-representation of women in the ranks of game developers and the vulnerability of women in general to online harassment. These weren’t new issues, and they continue to be serious problems. But Gamergate called public attention to just how nasty misogynist trolls could be when they put their tiny minds to it.

 

 

 

 

Software development kits

 

If you’re trying to break into the hardware world as an indie by developing your own console system, well, good luck with that. On the software side, however, there’s more room for folks working on their own or in small businesses.

 

Indeed, the Big Three actually encourage small game creators by providing software development kits. These kits help people who have a good idea for a game but may lack the low-level programming expertise to actually make that idea a reality. Even the best SDK won’t exactly create the game for you, but with a little tech savvy and a strong work ethic, game development is at least within the realm of possibility.
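To give a flavor of what an SDK handles for you, here's a minimal sketch using the open-source Pygame library (a stand-in chosen for illustration; console SDKs are proprietary and differ in their details). A few lines produce a window, a timed game loop and a moving object, with none of the low-level graphics plumbing showing:

import pygame

# A minimal game loop: the library supplies windowing, timing and drawing.
pygame.init()
screen = pygame.display.set_mode((640, 480))
clock = pygame.time.Clock()
x = 0

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:  # player closed the window
            running = False
    x = (x + 4) % 640                  # slide a square across the screen
    screen.fill((0, 0, 0))
    pygame.draw.rect(screen, (255, 255, 255), (x, 220, 40, 40))
    pygame.display.flip()              # show the finished frame
    clock.tick(60)                     # hold the loop to 60 frames per second

pygame.quit()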

 

That isn’t to say that even a great concept is an instant ticket to easy street. After you get your game designed and debugged, you’ll still face stiff competition from all the other software designers trying to sell their games, some of whom are big media corporations with a lot of money for advertising and other promotions. Even though you may not get a four-course dinner, at least you can sit at the table.

 

SDKs are also available for popular computer operating systems, such as Windows and Mac OS. And of course if you’re ready, willing and able to learn more about writing computer code, you may be able to achieve “custom” results not readily available with an SDK.

 

 

The Tetris Effect

 

How much is too much? It’s a question that’s plagued humanity since the days of the ancient Greeks. And for folks who don’t care for videogames and want to see them restricted, the question of addiction is a popular line of attack.

 

Gaming can be addictive, which makes it the same as just about everything else in the world. However, videogames have some traits that pose specific problems, one of which is informally known as the Tetris Effect. Extended periods of game play can lead to minor psychological disturbances in which elements of the game (such as the tiles in the popular puzzle game Tetris) can still be seen even after the player stops playing. In some cases they even show up in dreams. These minor-league hallucinations aren’t dangerous, but they do tell us something about the psychological depth of gaming.

 

Weird psychology aside, some games are just plain, old-fashioned addictive. While some games are “throw-aways” (play ’em once and you’re done), others invite multiple play-throughs. And others – especially MMORPGs – don’t really have a beginning, middle or end. So players can keep coming back again and again, getting a new experience every time.

 

So how much is too much? Mental health professionals disagree on the answer, as you’ll find in some of the links below. Personally, I’m no mental health professional. However, I did read a story about some folks at MIT who designed a “World of Warcraft Hut.” This booth-sized structure includes all the computing power a serious gamer can use. It also has a stove and a fridge. And the seat doubles as a port-a-potty. For my tastes, letting the game dictate when and where you eat is a yellow-light situation, and not getting up even for long enough to take a bathroom break moves you right into red.

 

 

Attack of the jerks

 

Interpersonal communication depends heavily on nonverbal cues. Though words are important, facial expressions and body posture also do a lot of the job. Even if a friend tells you she's happy, the angle of her eyes or the tone of her voice can tell an entirely different story.

 

Such cues are absent in the online world. With little to go by except text messages and avatars, people interact differently in online environments such as MMORPGs than they do in the real world.

 

Most people have little trouble adapting to the new environment. But for some, the online world frees up their inner jerk and they become known as "trolls" or "flamers." They insult other players, treat even members of their own teams with disrespect, and just generally behave in ways that would draw a swift, negative response in the real world. Some even use software hacks that allow them to cheat, beating opponents badly on an uneven virtual playing field.

 

To be sure, a certain amount of abnormal behavior is expected online (how many bad guys have you stabbed to death at work this week?). But many players agree that online jerks cross the invisible line. Especially in games that require players to cooperate in order to achieve goals, a bad apple can really spoil the experience for everyone.

 

So what's to be done? Some critics suggest using fees to do the work that social cues do offline. If everyone you're playing with agrees that you're being an ass, the game automatically charges you extra. Or conversely, if you're well-liked the game might give you a discount.
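Nobody has settled on a formula, but the idea is easy to sketch in Python (entirely hypothetical numbers and scoring):

# One possible reading of the "charge the jerks" proposal.
def monthly_fee(base_fee, ratings):
    """ratings: peer scores from -1 (jerk) to +1 (good teammate)."""
    if not ratings:
        return base_fee
    reputation = sum(ratings) / len(ratings)  # average peer opinion
    # A unanimous jerk pays 50% extra; a unanimous favorite gets 50% off.
    return base_fee * (1 - 0.5 * reputation)

print(monthly_fee(15.0, [-1, -1, -1]))  # 22.5: everyone agrees you're a jerk
print(monthly_fee(15.0, [1, 1, 1]))     # 7.5: well-liked, discounted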

 

 

The ESRB

 

The gaming industry is big. Its products appeal to both adults and children. And as we’ll see, game content isn’t subject to a lot of government regulation. Such an environment is ripe for intense industry self-regulation. And the Entertainment Software Rating Board is just that.

 

Back in the early days, concern about sex and violence in games such as Pong was fairly minimal. But with the advent of games such as Mortal Kombat – in which a “cheat code” unlocked a feature that dramatically increased the level of blood and gore – detractors began to grumble. Initially the industry response was less than uniform. But eventually the big companies got together and set up the ESRB.

 

The board’s job is to assign ratings to games. The ratings are divided into two parts. The “big letter” rating helps consumers distinguish between games aimed at children, games designed for everyone, and games intended only for older players such as teens and adults. The system also includes an “AO” rating that indicates content inappropriate for anyone under 18 years old (though critics accuse the ESRB of applying this rating only to games with sexual content without regard to the amount of violence in the game).

 

The second part of the rating is “content descriptors.” These provide more specific information about controversial content ranging from sexual violence to smoking.

 

The ratings process is tricky. Companies pay a fee and submit video of the most extreme parts of the game, and the board turns the footage over to three trained raters. If the company doesn’t like the rating the game gets, it can redo the game and resubmit or appeal the rating to a committee composed of “entertainment software industry representatives.”

 

So why do companies bother to pay the $800 to $4000 to get ratings slapped on their games? Because many big retailers won’t carry games unless they have an ESRB label on the cover. Further, the AO rating is a kiss of death outside limited niche markets. Retailers such as Amazon and Wal-Mart won’t carry AO games, nor will the Big Three license such games for their consoles.

 

 

MSG CASE STUDY: SEX & VIOLENCE

Hot Coffee

 

Rockstar Games’ Grand Theft Auto series already had a well-deserved reputation for mature themes even before the release of the seventh game in the series. But GTA San Andreas introduced a new wrinkle.

 

The game received an M rating from the ESRB, thanks in part to “coffee” missions in which the player could go on a date ending with an invitation to come inside for coffee. If the player accepted the invitation, the mission would end with a view of the exterior of the house and the sound of sexual intercourse.

 

However, the game also included a hidden mini-game called “Hot Coffee.” Players who knew how to access the secret part of the game could follow the character into the date’s house and control him as he had sex with her.

 

At first Rockstar claimed that hackers created Hot Coffee as a modification not included in the game as rated and sold. But further investigation proved that the mini-game was part of the game itself, not an after-market hack.

 

The ESRB changed the game’s rating to AO until Rockstar re-released it with Hot Coffee taken out.

 

 

 

 

Schwarzenegger vs. Free Speech

 

Unsatisfied with the effectiveness of the gaming industry’s self-regulation, a handful of states have tried passing laws restricting game sales to children.

 

California’s experience is typical. In 2005 the state’s legislature passed a law imposing a $1000 fine on anyone caught selling a violent videogame to someone under the age of 18. The law defined as “violent” any game “in which the range of options available to a player includes killing, maiming, dismembering, or sexually assaulting an image of a human being.” Loosely construed, such a restriction would apply to just about every M game on the market and a fair number of T games as well. The law further required all “violent” games to bear a two-inch-square warning sticker on the cover.

 

Gaming professionals had a sarcastic laugh when Gov. Arnold Schwarzenegger – the star of several R-rated, violence-packed movies – signed the bill into law. Then game producers sued the state, alleging that the new law violated their First Amendment right to free speech.

 

The United States Supreme Court agreed with the game makers. Observing that violence is a part of many childhood media experiences – such as Grimm’s fairy tales – the court found that games couldn’t constitutionally be restricted any more than any other medium of mass communication.

 

Passing such obviously unconstitutional laws can cost a state serious money. In addition to the expense of passing a new statute and defending it in court, the government can end up ordered to pay the costs incurred by the game companies during their legal challenge. In Illinois the state actually had to take money away from cash-strapped government programs in order to pay a large penalty incurred as a result of passing an anti-game bill.

 

In 2005 a handful of US Senators tried to pass a similar law, the Family Entertainment Protection Act. It died in committee, and it's worth mentioning mainly because one of the bill's co-sponsors was 2016 Democratic Party presidential candidate Hillary Clinton.

 

 

The suits people file

 

Like any other multi-billion-dollar industry, videogames are the subject of a fair number of lawsuits. In the early days (and even today) suits frequently revolved around copyright problems, when one company had a big success with a game and a competitor came out with something that was just a little too close for legal comfort.

 

As games became more profitable and began to draw celebrity endorsements, the legal hassles with endorsement deals weren't far behind. Several athletes and musicians have sued over characters in games that "borrow" their names and images without their permission (or to be more specific, without paying for permission). But endorsement deals are two-way streets. When Beyoncé allegedly backed out of a deal to put her name on a new music game, the developer sued her, claiming her move had cost them millions in potential sales.

 

And of course some plaintiffs always stand ready to accuse the media of ruining their lives. Complaints have ranged from “a game encouraged someone to shoot me” to “the company should have warned me that playing the game would cause me to develop a time-consuming addiction.”