The most superficial item on this list is perhaps also the most visible. As the gaming industry has grown, it has increasingly been subjected to games based on movies -- more often than not, terribly over-simplistic recreations of movie characters and scenes, meant mostly just to carry the name of the movie and sell copies for cash. There are, of course, numerous exceptions to this rule, but there's a reason why the latest 'Madagascar 2: Escape to Africa' game isn't mentioned in any best-game-ever discussions. But as the industry has grown, the trend has reversed course as well, spawning several movies based on video games. The Wizard, a 1989 movie, put video games in basically a co-starring role, but the movement truly started in the early 1990s with the campy Super Mario Bros. movie -- perhaps also the movie that gave game-based-movies their reputation for being as bad as their movie-based-game counterparts. But the wildly popular Street Fighter and Mortal Kombat adaptations would largely resolve that, finding commercial success, while the Pokemon movies that followed quickly thereafter cemented game-based-movies as a viable medium. The genre would really take off three years after the first Pokemon movie, though, with the release of Lara Croft: Tomb Raider. This was the first game-based-movie to actually star a big-name actor, and the first to reach a non-gaming audience. Starring Angelina Jolie in the title role, Lara Croft: Tomb Raider may have been panned by critics (but since when do we care what they say?), but it became the highest-grossing game-based-movie of all time. It spawned a sequel (with another one rumored), and opened the door for more recent game-based-movie blockbusters like Resident Evil, Silent Hill and Max Payne.
Moving past that, on to where games have made a bit more of a practical difference in people's lives, starting with education. Video games as an educational tool would be listed higher on this list if their adoption percentage were higher, but counter-intuitively, educational games have largely been on the decline over the past several years. Back in the earliest days of PC gaming, though, there was not the distinction between 'educational' games and 'entertainment' games that exists today -- the two were largely grouped together, which is why games like Math Blaster are still listed in the GameFAQs annals. In terms of adoption, Oregon Trail is largely recognized as the most widely-used educational game of its time, although its educational value might be somewhat dubious compared to games like Reader Rabbit and Super Solvers. There's good news on the horizon, though: there's an enormous field of research being conducted into identifying how games can be utilized in educational systems to engage learners in ways that are currently impossible. We're not talking about installing Gears of War in every high school classroom -- but it's been shown that students learn and retain far better when they are "discovering" rather than being "informed". Why not use games to put the student in the role of a scientist, or an explorer, or a citizen of another civilization? Instead of informing them what causes rivers to be polluted, why not put them in a virtual world where it's their responsibility to discover why the virtual river is being polluted? Instead of telling them what life in the colonies was like, why not let them explore a colonial virtual world? Instead of dictating to them how the explorers mapped out the new world, why not give them a virtual world to map out? These aren't pipe dreams -- all three of these scenarios have been created and tested on students, showing great results compared to the standard 'lecture' method.
When discussing the sensitive subject of school shootings in the United States (or around the world), one specific example always comes to mind: Columbine High School. Since the Columbine tragedy, there have been dozens more shootings in the United States, but aside from perhaps the more recent Virginia Tech murders, none sticks out as much as Columbine. In the days following the tragedy, numerous reports and speculations about the motive for the murders emerged -- and one prominent suggestion was that the shooters were inspired by the violent first-person shooter Doom. One of the shooters was known to be a fan of the game, as well as a prominent level designer in online communities, and among the ill-informed speculations born shortly after the shooting was the idea that he had actually mapped out the school in the game to practice. With time, however, this speculation -- like most others attempting to draw a connection between violent behavior and violent video games -- proved to be false. I'll acknowledge here that I'm not unbiased on the subject, but in my view the majority of those who attempt to draw a connection between actual violence and gaming violence (Jack Thompson most prominently, but also notable professor and author David Grossman) do so based largely on anecdotal, intuitive evidence. Many academically sound studies have been conducted, not with the goal of proving the link one way or the other, but aiming to find an answer from an impartial starting point. These studies have been conducted by Harvard University, several prominent psychological and medical journals, and the United States Public Health Service, and all have come to the same conclusion: there is no causal relationship between video game violence and actual violence.
But the speculation that there might be, the initial blame laid on video games, and the ongoing suspicion that surrounds the industry still represent one of the most intriguing relationships between video games and the outside world.
There's a close relationship between this example and the previous one, but with a very important distinction. The previous item described speculation that individuals were inspired to real-world violence by virtual violence -- this item describes instances where individuals really did resort to real-world violence in response to something that occurred in the virtual world. More directly: "Why did you hit him?" "Because he killed my character!" It's the virtual equivalent of "why don't we take this outside?" Such instances are surprisingly common. In Novosibirsk, Russia, a teenager beat a friend to death over the result of a Counter-Strike match. Elsewhere in Russia, in-fighting among members of an online Lineage II clan led one of the clan members to beat another to death. And elsewhere in Russia again, two rival clans met in real life, leading to the death of one member. This isn't solely relegated to Russia, either: in Korea, a player in an internet cafe stabbed another player to death for killing his character in an older MMO. These tragedies reflect a broader trend in any competitive activity -- there's always a temptation for things in a competition to get dicey, for fights to break out ("I went to a fight and a hockey game broke out!"), and for participants to resort to physical violence to compensate for losses within the game. But there's some element of video game rivalry that appears to lead to a more violent form of retaliation. It could be the inflated level of importance as individuals take on roles of virtual near-royalty, bestowing a higher sense of entitlement. It could be that the very nature of the violence in the games carries over to a more violent real-world counterpart (though the studies above might argue against this).
But personally, I suspect that it's a result of the 'new' rivalry theater (the real-world violence) differing more substantially from the original medium -- whereas fights in sports are physical altercations resulting from an equally physical competition, real-world gaming-inspired brawls take it to a new dimension where the participants are not entirely sure what the heck they're doing, so things are much more prone to get out of hand. But that's just my blind speculation.
Over the course of human history, we can see a very common and very simple trend among human interests: interests lend themselves to hobbies, and hobbies in turn become professions. As a particular hobby becomes more and more popular, those recognized as the best in the world at it are elevated by the hobby's other fans to a level where they can actually support their livelihood solely through their hobby, either by advising others or by performing for spectators. This trend is everywhere: there are numerous professions in the world that don't improve society in a practical, tangible way, but are rather just entertaining and enlightening -- professional athletes, fiction authors, artists of all kinds -- all representing instances where a niche interest became mainstream enough that its expert practitioners could support themselves solely through the hobby. In recent years, video games have become no different: there are individuals who actually are good enough at certain video games to secure an audience to watch them play, and money from sponsors who want to reach that same audience. With the emergence of leagues like the Cyberathlete Professional League and Major League Gaming, playing games professionally is becoming more and more a real method for earning income. Today, perhaps the best-known example is the professional Halo 3 tournaments and leagues that exist in various forms around the world -- but numerous other games can be played professionally as well, the most well-known among them being StarCraft (still wildly popular in Korea, arguably the birthplace of true professional gaming) and Madden, which sees a yearly tournament televised on ESPN.
Income garnered from these activities varies widely: sponsors often pay for ad placement on the players' clothes; tournaments and such often award substantial cash prizes to the winners; and, more and more, leagues and teams are starting to pay players a true traditional salary for their skills.
There's a small but extremely important distinction to be made between this item and the previous one. Both professional gaming and practices like gold farming represent a similar real-world development: the ability of individuals to actually support their livelihood because of the existence of a video game they had no part in creating (as opposed to a programmer who is paid to create the game). However, in professional gaming, gamers are recognized for their skill alone and are observed, praised and paid for their ability, like professional athletes. Gold farming, while also providing a means for individuals to support themselves through video games, comes in a drastically different form: these are not people you watch and sponsor for playing, but people who can make a real, tangible difference in the play experience of other gamers -- for a nominal fee, of course. If you're unfamiliar with the term, gold farming refers to players, almost always in an MMORPG, who spend lots of time gaining in-game money, which they then sell to other players for real-world money. That might seem like an incredibly unprofitable business -- after all, who is going to pay minimum wage for someone to earn them World of Warcraft money? But gold farming's viability hinges on a simple real-world concept: exchange rates. For $10, would you buy from another player the amount of gold it would take you a full day of playing to earn? In China, that $10 has the buying power of roughly $80 -- so what is for you a quick convenience is for them a viable business strategy. Of course, Blizzard (and other MMO companies) tries to prevent the practice while China (and other countries) tries to tax it -- but the very nature of the task makes it very difficult to determine when a player is 'gold farming' and when they're legitimately playing and trading.
Which raises the question: has anyone ever thought of putting World of Warcraft on some computers in poorer African countries, where $10 is enough to feed a family for a week? It sounds crazy, but why not?
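The exchange-rate logic above can be sketched with a bit of arithmetic. The $10 sale price and the rough $80 in equivalent local buying power come from the discussion above; the eight-hour farming day and the purchasing-power multiplier of 8 are illustrative assumptions, not real figures.

```python
# A rough sketch of why gold farming can be a viable business.
# Figures: $10 sale price and ~8x local buying power are from the
# discussion above; the 8-hour farming day is an assumed, illustrative value.

def farming_value(sale_price_usd, ppp_multiplier, hours_of_farming):
    """Return (nominal hourly wage, purchasing-power-adjusted hourly wage)."""
    nominal = sale_price_usd / hours_of_farming
    adjusted = (sale_price_usd * ppp_multiplier) / hours_of_farming
    return nominal, adjusted

nominal, adjusted = farming_value(10.0, 8.0, 8.0)
print(f"Nominal wage: ${nominal:.2f}/hr; local buying power: ${adjusted:.2f}/hr")
```

Under these toy numbers, a day of farming nets only $1.25 an hour nominally, but the equivalent of $10 an hour in local buying power -- which is the whole asymmetry the practice depends on.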
In recent years (up until the release of the Nintendo Wii), video games took a lot of criticism for allegedly playing a role in the increase in childhood obesity. According to the theory, children are playing indoors with their video games -- a decidedly non-physical task -- when they would otherwise be running around, shooting hoops or playing hopscotch or whatever it is that kids would do nowadays if they didn't have those gosh-darn evil video games. But to a certain extent, that changed with the release of Dance Dance Revolution in the US in 2001. It didn't single-handedly change the video game landscape, but it did show that a video game could be just as entertaining (if not more so, judging by the popularity Dance Dance Revolution sky-rocketed to) while still requiring the player to move their body. In its early days, though, Dance Dance Revolution represented a somewhat tough trade-off -- yes, kids were finally getting their exercise, but they were paying upwards of $5 just to get it. Konami had to love that, but for everyone else it was a pricey proposition. That changed drastically in 2006, when Konami partnered with West Virginia public schools to roll Dance Dance Revolution units out into every school in the state, to be used in gym classes and to be available to students during free periods. From the students' perspective, this was awesome -- video games in schools, what could be better? But the schools saw it differently -- here was a way to get students exercising without mandating it in a classroom format. Since then, Dance Dance Revolution systems have been incorporated into numerous other schools across the country and around the world, and in many ways paved the way for the eventual release of the Nintendo Wii's recent suite of exercise-inducing games like Wii Fit and Wii Sports.
The majority of the items on this list thus far have really been trends rather than specific, discrete events. Most of them are marked by a certain turning point (the founding of the first professional gaming league, the first video game movie), but were really long-term changes. The top three on this list, though, are truly discrete events, starting with a well-publicized plague in World of Warcraft. A new dungeon was released featuring an enemy that could afflict characters with a disease that would spread from person to person. The disease, called Corrupted Blood, was intended to stay only within the dungeon, but through various methods it got out and started infecting entire servers. That's where the real-world aspect came in: researchers quickly noticed that the pattern of spread for the disease mirrored the spread of diseases in the real world. It started in a 'remote' portion of the world, and was spread by only a handful of characters entering inhabited areas. From there, it spread like wildfire person-to-person -- all behaviors that near-exactly mirrored trends in real-world epidemics. At a more granular level, researchers also noticed (mostly from anecdotal evidence, as health researchers typically aren't avid World of Warcraft players) players taking the roles that have been observed in the real world: some players quarantined themselves, while others volunteered to put themselves at risk to try to heal other players, and still others took the presumptive attitude of "If I'm sick, others should have to be sick too!" and intentionally spread the disease.
In the end, a relative lack of available data prevented any true conclusions from being drawn about epidemic modeling based on the results of the Corrupted Blood incident -- but that didn't stop numerous research papers, talks and demonstrations from exploring the possibilities for modeling real-world epidemic situations using online games like World of Warcraft, suggesting that perhaps the solution to some future disease will be born from gaming.
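The pattern of spread described above -- a few infected characters entering a populated area, infection taking off, then burning out as players recover or die -- is the shape epidemiologists capture with the classic SIR (Susceptible-Infected-Recovered) model. Here's a minimal sketch; the population size, infection rate and recovery rate are invented for illustration and are not drawn from any Corrupted Blood data.

```python
# A toy discrete-time SIR model of the kind of outbreak described above.
# All parameters are illustrative assumptions, not Corrupted Blood figures.

def simulate_sir(population, initial_infected, beta, gamma, steps):
    """beta = infection rate per contact, gamma = recovery rate per step."""
    s = population - initial_infected  # susceptible
    i = initial_infected               # infected
    r = 0                              # recovered
    history = [(s, i, r)]
    for _ in range(steps):
        new_infections = beta * s * i / population
        new_recoveries = gamma * i
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# A handful of carriers entering a 'city' of 10,000 characters:
history = simulate_sir(population=10_000, initial_infected=5,
                       beta=0.8, gamma=0.1, steps=60)
peak_infected = max(i for _, i, _ in history)
print(f"Peak simultaneous infections: {peak_infected:.0f}")
```

Even this crude model reproduces the qualitative behavior players saw: a slow start from a handful of carriers, an explosive middle phase, and an eventual burn-out -- which is exactly why researchers found the in-game incident so suggestive as a modeling testbed.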
While the Corrupted Blood incident was decently publicized (especially in the gaming world), this incident is substantially less known, though arguably even more interesting. Second Life, if you aren't aware, is just what its name describes -- a second life for the player. The player takes on the role of a normal person (an 'avatar') in a virtual world -- there is no plot, no monsters to battle, no level-up system: you're just a person. Players meet, go on virtual dates, and get virtually married. Sometimes, they get virtually divorced. That was the story with a woman in Japan who logged on to Maple Story (a game she treated, in this instance, more like Second Life) one day only to find that she -- or, rather, her virtual avatar alter-ego -- had been divorced by her virtual husband. Hell hath no fury like a virtual woman virtually scorned, and so the woman killed her virtual ex-husband's avatar. Based on a combination of factors, including the means by which she accessed and killed her virtual ex-husband's avatar, she was actually arrested, transported 620 miles to be detained in the police precinct with jurisdiction over her virtual ex-husband, and held for trial. It's important to note that the main crime here is illegally accessing her virtual then-husband's account -- so the "primary" accusation is sound. But there are two other interesting elements to the crime and the reaction: first, while the primary accusation is set, there is speculation about whether additional charges can be leveled for what the woman did once she actually accessed her virtual husband's account. Should killing a person's virtual avatar be treated like destroying any other personal possession? An argument could certainly be made that breaking in and killing the man's avatar was tantamount to destroying any other worldly possession -- but then comes the question of assigning a monetary value to the avatar, something that would clearly be very difficult to do.
The other, more interesting aspect of the case is the details of her incarceration -- she was transported 620 miles, clear from one tip of Japan to the other, to be held in the district of the victim. Should online crimes be subject to real-life jurisdictions? A similar case can be seen in the United States -- though without the legal aspects: a woman caught her husband cheating with his Second Life avatar and went on to hire a Second Life detective to track details of her husband's Second Life avatar's life. These instances show a stunning intersection between the real world and the gaming world, especially when the emotions and conventions typically reserved for real life are transplanted into a non-real virtual world.
The top item on this list represents a truly unprecedented level of union between the real world and the gaming world. Released in 2002, America's Army was a recruitment initiative funded, developed and released by the United States Army. The intention of the game was quite simple: give prospective recruits a realistic glimpse of what life on the battlefield is really like, in order to head off applications from recruits who were clearly unfit to serve. Because, unlike nearly every other game, America's Army is a means not of profit but of military information and recruitment, it contains one feature that largely distinguishes it from other first-person shooters: it's free. On a technical level, it can compare (though not compete) with the more advanced first-person shooters available today, but it costs nothing to obtain and play. As with pretty much anything the United States Army does, America's Army has come under a substantial amount of criticism as well -- much of it fully justified. While the stated intent of the game is to give players a realistic impression of life in the Army, there are gameplay elements that make it extremely clear that the game is geared toward giving a favorable impression. Others criticize the game for focusing on the "how" of war, not the "why" of war, suggesting that the game is geared toward finding and recruiting soldiers who will not question orders. And a vocal contingent also criticizes the game for appealing to a "vulnerable" audience that is more likely to be swayed by gentle coercion, yet is itself unfit to serve in the military. Despite these criticisms, however, America's Army has proven to be a critical success, both as a game and as a recruitment tool. It has won numerous awards and is easily recognized as one of the greatest free games available, and its frequent updates have won it awards for the depth and breadth of its available content.
More important, though, is the success the game has met for the United States Army's purposes. Its intention, according to its developers, is to re-introduce a military career to America's youth as a viable career option, as it once was -- not to coerce or convince them to join, but rather to remind them of the option. The Army's in-house research indicates that 28% of visitors to the game page also visit the recruitment page, indicating the game is serving its purpose of increasing the visibility of the Army's recruitment branch. But at the juncture of its success as a game and as a recruitment device is the one element both goals hinge on: its realism. According to many accounts, America's Army is one of the most realistic first-person shooters ever released. Its missions reflect realistic objectives; its gameplay reflects realistic knowledge of the environment; and its battle system reflects realistic responses to attacks. It also actively seeks to reflect the real values of the United States Army -- while some first-person shooters encourage selfish behavior, America's Army ranks players primarily based on Honor. Specific training elements in the game also reflect their real-world counterparts well: multiple soldiers have cited the medical training they received in America's Army as the place where they learned skills they later used in real combat. Over time, America's Army has become less of a recruitment tool and more of a training tool, as its effective portrayal of real combat has value not only to potential soldiers, but to current ones as well -- representing possibly the greatest intersection between video games and the real world that we've ever encountered.
There is no question that games have assumed a stronger and stronger role in everyday life. Like movies and books, video games now receive the same media attention around their releases and inspire the same sorts of real-world developments that their creative predecessors have long enjoyed. But there's something that separates video games from the rest of the pack in terms of their potential as a real-world device: their interactive nature. That interactivity means that, with improvements in our simulation and modeling abilities, games can more and more be used to accurately model real-world situations. In doing so, they allow the player to take a role they could never take in real life, and learn the lessons that go along with it. The greatest example I've encountered comes from combat medical training: it has been shown quite conclusively that individuals learn much better from experience than from instruction. Unfortunately, that experience usually comes in situations where it would be preferable for the individual to already have the experience they're gaining -- for example, the only way to gain experience with emergency medical care is to be in those situations, but when a person's life is on the line, you want someone who already has that experience. More and more, however, video games and their legitimized counterparts in the field of interactive narrative are supplying this experience when no one's life is at risk. But that's only one example: more and more, people are meeting and forging actual relationships through online games. There's a resurgence of educational games' usage in schools on the horizon. Laws are increasingly taking into consideration the virtual counterparts to real-world crimes. With the trends taking place now, we're on a course toward a culture where gaming is not simply an industry, but a cornerstone of everyday life.
List by DDJGames (08/28/2009)