Technological Amnesia and Digital Ghosts
This is a post that I wrote back in the fall of 2014 for The Critical Is. When setting up this blog, this felt like as good a post as any to port over, given that 1) I’m proud of it and 2) the issues it raises have not gotten any better in the intervening year.
When old technology all but disappears in the face of the new, how do we value remembering?
The first few weeks of September, 2014 were a hard time to play video games. Specifically, to play video games from any cultural position other than that of a straight, white, young, middle-class male. On some level, we knew this already. I wish I could say that I've been surprised by the rampant, unmitigated vitriol, no longer merely "trending towards" violent threats but actively resorting to them, directed against Anita Sarkeesian since she posted the most recent video in her Tropes vs. Women in Video Games series, and against female developers and critics guilty of no crimes save daring to exist, let alone work, in the games industry. But the sound has turned to noise. I remain disgusted, appalled, and enraged, but not surprised: we all know too well by now the history of a particular subset of gaming culture's hostility to any perceived attacks on its "territory," even when those attacks are nothing more than daring to be. Prominent women, LGBT, and non-white developers and critics have been quite literally forced from their homes by online attacks that, in their organization, militancy, and ideology, can only be described (I wish I were being hysterical, but I am not) as terrorism: this is, horrifically, business as usual.
There are a number of ways to read these fraught, terrifying moments, and I'm thankful that gaming journalism, journalism in general, and individual bloggers have stepped up over the past few weeks to provide numerous sane, intelligent, and necessarily excoriating takes. I'll link to some here, and more at the bottom of this essay. The bulk of these pieces have focused on the cultural side of things: Leigh Alexander, in Gamasutra, writes of how the sociological classification of "gamers" is dying off; in Buzzfeed, Joseph Bernstein maps the synecdochic fallacy of "gamers" onto cinema; Ian Steadman in The New Statesman eviscerates the supposed "critique" of Sarkeesian's work from her violent detractors as "anti-feminist nonsense." In essence, each essay describes the same anthropological moment: a subculture, confronted with obsolescence, lashes out. It's primal—a battle over perceived territory and relevance aided by the anonymity of social media and abetted by a long history of companies prizing or denigrating communities in the marketplace. "Real gamers are X," went the marketing canard for well over a decade, "and never Y." This historical relic collapses, as each of the aforementioned essays shows, in the face of today's wide(r) variety of games and gamers. "Games," as a monolithic cultural category encompassing only a narrow slice of experiences, falls apart; the Call of Duty center cannot hold.
I'm loath to simplify this trajectory, though, or to frame it as inevitable that the stereotypical "gamers" (and we need a better name for this subculture, given that, as Bernstein shows, "gamers" is as specific and critically useful as "moviegoers") will disappear, or assimilate peacefully. To do so obscures the fact that the ongoing abuse and harassment of subaltern voices in the games community will not simply stop over time; stopping it requires, like any other attempt at social justice, solidarity, vigilance, and action. I'm worried, as well, about the ways that the current technological and economic superstructures of capital-VG Video Gaming as a medium actively restrict our collective abilities to maintain a kind of critical historical memory that can inform and sustain us against future injustice. We shouldn't seek to overcome the strictures of our pasts only to turn around and erase them from our memory — and I'm concerned that the games industry itself (like any technology industry) is invested, has always been invested, on the level of hardware and code, in this kind of forgetting. Furthermore, our attempts at engaging with video games in thoughtful, critical, and reflexive ways—the kinds of ways that Sarkeesian seeks to engage with games—will continue to come up short when faced with this kind of encoded amnesia.
By "forgetting," in this context, I want to address what I see as twinned issues: first, the idea of an archive as central to any critical apparatus; and second, the ways that the games industry actively substitutes nostalgia for historicity, and in doing so encourages the existence of a subculture of "gamers" (a category that, as these tweets from games developer Patrick Lindsey point out with no small amount of invective, is first and foremost a corporate identity) that perpetuates, on some level, this violence. These are broad issues that I'm interested in teasing out over future blog posts, so in the spirit of concision, I'd like to explore some of the technological and material issues at play here, and see how they link up to the economic and cultural apparatuses surrounding them.
First, let’s talk about backwards compatibility.
The games industry, like the technology industry as a whole, thrives on the idea of planned obsolescence. Apple is its king: it strategically makes key features exclusive to each subsequent iteration of iDevice in order to encourage you to upgrade as frequently as possible. In the games industry (and here I'm restricting my view, for a moment, to consoles), obsolescence works very simply: with each new console generation (Super Nintendo to Nintendo 64; PlayStation 3 to PlayStation 4), you generally have to buy a whole new crop of games, and bury your old ones in the closet. Frequently, new games will be on entirely different physical formats — Blu-ray discs instead of DVDs; cartridges in radically different shapes — that send the obvious message, "You can't fit us into that new box." It works on the level of software, too, through iterative series like Mario Kart or FIFA. With each passing year, we usher in the new and force out the old: this, we have accepted, is the tidal churn of progress. You can see this progress, as a consumer, in the physical shape and composition of the media containers themselves. To take the progression of my own childhood as an example, there was something compelling about seeing the boxy edges of the Super Nintendo's cartridges round off for the N64, before eventually moving to discs (a real sign of technological progress — the kids across the street, who were a PlayStation family, had had discs for years!). The physical media educates the consumer to relate to her games as ephemeral, as soon-to-be-superseded, as imminently obsolete.
Backwards compatibility works against this churn. It's the basic idea that the software of the past be executable by the hardware of the present. It's the expectation that the less-powerful games I bought last year can be played on the more-powerful console I buy next year. It makes logical sense: obviously, I can't play new games on old hardware because of Progress, but why can't I play old games on my new, shiny system? After all, I paid for them, didn't I?
Unsurprisingly, the console industry isn't a huge fan, given that planned obsolescence makes a positive impact on revenue. Moreover, since major publishers' profits in recent years depend on major tent-pole releases in "evergreen" series like Call of Duty, Grand Theft Auto, or Assassin's Creed (and Hollywood has co-opted this view as well), there's a pressing drive to convince the consumer that she needs to purchase the newest iteration, that all of the old iterations were incomplete compared to this, the newest one. For these reasons and more, backwards compatibility has traditionally been altogether absent on consoles, although it thrived on Nintendo's handheld Game Boy and DS platforms, where the previous generation of games was always playable on the current hardware (one imagines for marketing reasons as much as for Nintendo's particular niche in the industry as a purveyor of nostalgia), though revisions of each piece of hardware eventually excised these features as well, ostensibly in the name of portability.
This had a two-fold effect on me as a child, and, honestly, continues to have one on me as an adult. The first is a kind of frustration coupled with resignation coupled with consumerist delight as I swipe my card every four years or so — a familiar feeling, now, to anyone who owns a smartphone. In a way, this feeling of being forced, by the hand of industry, to dispose of last year's possessions has been normalized by Apple, Samsung, et al. — as video games have reached a broader audience, frequently through the smartphone as a platform, so too have its industry's obsolescence strategies.
The second is an urge to collect, to store, to have a physical memory of systems past. I've made it a point to hold onto most of the systems and games of my childhood, for two reasons: first, because I still find the games fun to play; and second, because of a twinned feeling of nostalgia for the past and anxiety that the past will someday disappear, that the software will stop running, that the hardware will break down, and I'll be left with the shell of a talisman, unable to summon the spirit. This is already happening to some of my games. Pokémon fans of a certain age and a certain temperament will recognize the moment: you put in an old cartridge, say, from 1998 or so, and find that the game runs fine, but will not save, because the small battery burrowed away in each cartridge has finally run out. You can replace that battery, sure, with solder and gumption, but it's a concrete object lesson in the impermanence and entropy of all physical systems. How long until the battery runs out again? How long will you keep replacing it? When do you decide your memory is enough?
One can understand, on some level, the territoriality this inspires in fans of this, or perhaps any, computational medium. It evokes a fascination with a very specific kind of past — imagined to be your own past — and fetishizes it as perpetually on the verge of erasure. Because, in the context of the market, it is. Games publishers have begun to realize this in recent years, and have increasingly made very specific games from the past available for contemporary consumers, generally in the form of an HD re-release (akin to a remaster, and always altering some key pieces of the game, whether upgraded graphics or reworked gameplay) or, as Nintendo and Sony have done with the Virtual Console and PlayStation Plus, respectively, developing digital storefronts through which consumers can (re)purchase old games to play in emulation on their current systems. (No word, though, on whether or not these games will be portable to future systems, or how many times consumers will have to re-purchase what is in essence their own, personal archive.) In independent games development, "retro" has become an aesthetic, code for punishingly difficult platformers with blocky, 8-bit graphics. It's the memory of a very particular demographic, useful to those selling the games, but not particularly useful to those playing them, or writing about them, or even making them. All it does is allow the monolithic corporations of the games industry — not even the smaller developers on the ground, many of whom are doing genuinely interesting, innovative work — to control the terms of their own histories, to map the terrain of our memories.
The obvious answer might be developing robust games archives.
This is not a revolutionary claim by any means, and many spaces within and outside academia, like the NYU Game Center, the MoMA's new video games collection, CU-Boulder's Media Archaeology Lab, the Internet Archive's Console Living Room, and the JSMESS emulator, are taking incredible steps towards a future of critical gaming. The archive is an attractive solution, particularly for games and technology more generally, given the complex technical and logistical requirements of born-digital media preservation. It's the uphill battle of digital media, in that there's always more and more stuff, and it all requires exacting, particular systems to get it right, whether it's figuring out the logistics of a collection of physical hardware and media, or working out the precise systems that go into emulating software. None of what I've pointed out above begins to grapple with the large community of indie developers working on platforms like Twine, or with what to do with all of the games from the 2000s that were written in soon-to-be-obsolete Flash. (Although, for thoughts on that issue, you might want to check out Zach Whalen's talk from MLA 2013 on the seminal, though now un-executable, piece of electronic literature, Talan Memmott's Lexia to Perplexia.) The attractiveness of "build archives" as a solution is that it makes all of these questions, on some level, somebody else's problem — someone with The Skills, The Knowledge, and, most importantly, The Time. We need to reconfigure (or at least, take a fresh look at) our relationship to electronic media, to the open-source and proprietary systems we use to create and execute that media, and to our relationship with the larger economic interests that, at practically every level, manage and condition these relationships. I'm not calling for the End Of All Games Corporations. I might be calling for the End Of Games Corporations As Their Own Historians, though.
There's something insidious about asking (allowing!) games publishers to be their own archivists. Doing so denies players and critics access to a more complete picture of gaming history. Like so many of the larger questions of sexism in the technology industry (think of Ubisoft's claim that it couldn't find the resources to program and design a female protagonist in the upcoming Assassin's Creed: Unity because of budgetary limitations), it cloaks a sociological problem as an engineering problem, as if framing it all in questions of numbers and code would absolve publishers of a humanistic, moral, and cultural reckoning.
This post’s title is partially misleading. There is nothing, arguably, about the medium of gaming (understood as an abstraction) that predestines historical amnesia. We don’t have to have computing systems, on some level, predicated on erasing the past with each forward step — and turning on that erasure more and more frequently with each passing year. Maybe that would slow down progress, or make innovation more difficult. I don’t pretend to have the breadth of technological knowledge and experience necessary to determine that. But looking at the past few weeks (months, years, decades) of rage, violence, and injustice, I don’t know if slowing down and cultivating a better sense of the fullness of our past would be such a terrible thing.