Bubble Bobble
Hollywood has had mixed success...making movies based on computer games. And even less success making computer games based on movies.
But that has not stopped directors, actors, writers and others in the movie business from enviously eyeing the video game business in hopes of tapping some of its energy and riches for themselves. Computer games represent one of the fastest-growing, most profitable entertainment businesses. Making movies, by contrast, is getting tougher and more expensive, now costing, with marketing fees, an average of $103 million a film. That is one reason, among others, that those with power in Hollywood are avidly seeking to get into the game business while also reshaping standard movie contracts so they can grab a personal share of game rights....Hollywood has developed a free agency system in which the most famous and powerful actors and directors routinely demand millions of dollars and receive lavish perks for lending their talent and name to a project....Hollywood may...have an exaggerated sense of the riches to be made from video games. For one, costs will rise as more directors try to get a piece of the action.

Am I the only one who smells a bubble here? Hollywood pursuing potential profits in the computer games industry; the computer games industry welcoming an inrush of capital from Hollywood's deep pockets; Hollywood realizing that there is less (maybe much less) money to be made in computer games than it had hoped and divesting; the sudden outflow of capital from previously swollen coffers imploding the computer games industry. Is it just me?
The history of movie/game "synergy"* only lends credibility to my apprehension. By the early 1990s, desktop computers had grown sufficiently powerful to accommodate full-motion video and digitized sound. Game makers turned away from computer-generated characters to the use of real actors. The computer games industry touted its new "interactive movies" and spoke of synergy with Hollywood for the first time. The "interactive movie" model was undone by the use of mostly bad (bad, bad, bad, bad) actors and the increasing versatility and power of entirely computer-generated graphics. Only digitized audio survived the "interactive movie" debacle through the continued use of recorded sound effects and voice talent (though to this day "voice talent" often means a warm body grabbed from the nearest cubicle).
"Tomb Raider" was not the first game to be made into a movie (I believe that dubious honor belongs to "Super Mario Bros."[shudder]) but it was the most egregious example of what happens when a game is perverted into a marketing phenomenon and Exhibit A in the case for building a Chinese wall between Hollywood and the computer games industry. "Tomb Raider" began as an innovatively third-person perspective platform game starring a computer-generated heroine with massive norks and an ass you could bounce a quarter off of. An outside non-gaming observer might conclude that the reason for the success of "Tomb Raider" was the sexualization of its heroine Lara Croft. Madison Avenue certainly thought so and jumped on the novelty of a computer-generated It Girl. Core Design, the developer behind "Tomb Raider," was not about to turn down free publicity. Periodicals as diverse as the Financial Times and The Face Magazine featured Lara on their covers. (It was a Financial Times front page image of Lara that kicked off the whole phenomenon.) "Tomb Raider" publisher Eidos Interactive, parent company of Core Design, shifted gears to strike while the iron was hot. Eidos published a "Tomb Raider" game every Christmas season from the 1996 (the original "Tomb Raider") to 2000 ("Tomb Raider: Chronicles"). Computer games are technology in the service of aesthetics; a business model forced to knock out sequels every Christmas is not conducive to full and proper development of either. Quality deteriorated as a result, followed soon after by sales. There was a two-and-a-half year gap before the release of "Tomb Raider: Angel of Darkness." Unfortunately, the game was compromised yet again to meet a deadline imposed by outside considerations, this time the release of the "Lara Croft Tomb Raider: The Cradle of Life" film.
No single factor has consistently caused more bad games than rushing development to meet the Christmas season or a movie release date. Exhibit B: "Enter the Matrix." The Wachowski brothers deserve credit for their ambition. Rather than having the game follow the plot of the movie, they instead used the opportunity to tell a parallel story contemporaneous with the events of the film: "Rosencrantz and Guildenstern Are Dead" as a computer game. "Enter the Matrix" contains cutscenes that expand on the story told in the films. Not only do cutscenes add variety to the narrative of a game, they also serve as a reward for progress. In the case of "Enter the Matrix," part of the motivation to continue playing is that the cutscenes will make the horrible gameplay stop. For a game set in a world rich with visual possibilities, the graphics of "Enter the Matrix" surprisingly lack texture, a result of being released prematurely to meet the opening date of "The Matrix Reloaded." "Enter the Matrix" is a successful storytelling experiment and terrible, terrible gaming.
In the end, it is neither Hollywood's money nor its dilettantism that is the real danger to the gaming industry. The danger is posed by the strong likelihood that, as the two industries increase their mutual involvement, Hollywood will use its greater financial clout to impose its business model on an industry with which it is incompatible. The logistics and economics of filmmaking are fundamentally different from those of game making. In order to stay competitive, game makers are obliged to keep up with the latest technology in a way that filmmakers are not. Games typically take a couple of years from inception to release. Technological changes during that time could well force game makers to start over from scratch more than once or risk releasing a game that will not sell because it is based on outdated technology. Though some films take years to complete, most take only months, of which the most expensive phase, shooting (unless there are extensive postproduction special effects), takes only a fraction. Filmmakers have a pretty good idea of what their films will ultimately look like in the editing room. Games, by contrast, must undergo repeated rounds of playtesting to discover bugs and gameplay issues that would otherwise compromise the final playing experience.
Even studios, which have long regarded video games as just another licensing opportunity like coffee mugs and T-shirts, are looking for an opening into the business....Within the movie business, even the most enthusiastic video game supporters warn that Hollywood will succeed in making games only if it sheds some of its own hubris. "No one is going to make any money, no matter how famous," Cody Alexander, an agent at William Morris, said, "unless they make a game that people want to play over and over and over again."

If Hollywood wants to make money off of computer games, it must recognize that computer games are not merely a licensing opportunity. Making a quality game demands infinitely more effort than stamping a movie's logo onto a mug or a t-shirt; making a quality game demands a level of effort comparable to that of making a film, possibly more. One could form a small mountain with all of the sh*tty movie tie-in games released over the years. At its apex would be "E.T.: The Extra-Terrestrial" for the Atari 2600, both a cheap movie cash-in and a rush job knocked out in time for the 1982 Christmas season, a game still remembered over twenty years later for its sheer awfulness. Time is a critical ingredient of quality, perhaps even more so for games than for movies. "GoldenEye" for the Nintendo 64 was released in 1997, two years after its movie namesake; it became a bestseller nonetheless because of the crack-like addictiveness of its gameplay. Movie-based games are often too much "movie" and not enough "game"; games succeed when they play to the strengths of the medium. The Star Wars game "Knights of the Old Republic" is a case in point. The player is repeatedly faced with choices whose moral consequences affect the outcome of the game. Many Star Wars enthusiasts even feel that "Knights of the Old Republic" is a more worthy contribution to the Star Wars legacy than the recent prequels. George Lucas the filmmaker may have gotten rusty in the sixteen years between "Return of the Jedi" and "The Phantom Menace," but George Lucas the game publisher has been releasing great games since 1982.
*Excuse me, but "synergy"? Isn't that just a buzzword that dumb people use to sound important?
UPDATE: The Guardian's caught on to this trend as well. The Guardian placed a greater emphasis on history, mentioning the recent "Everything or Nothing" (which I neglected to mention), LucasArts and "Knights of the Old Republic" (noting, as I did, the stark contrast between the superlative excellence of KOTOR and the crapulence of the prequels), "Enter the Matrix" (though they ignored its poor reception among longtime gamers) and the seminal place of "GoldenEye" as one of the first, if not the first, really good movie tie-in games. The Times, though, placed the phenomenon into economic context more clearly.
UPDATE: Interesting...another case where the New York Times and the Guardian publish similar articles within days of each other. Hmmm...