
Saving (and Re-Saving) Videogames:

Rethinking Emulation for Preservation, Exhibition and Interpretation


DOI: https://doi.org/10.33008/IJCMR.2019.08 | Issue 1 | March 2019

Author: James Newman, Bath Spa University

Abstract

The rapidly growing number of museums, archives and galleries turning their attentions to videogames reveals both the level of public and scholarly interest in the histories of design and play and, rather more worryingly, that we currently possess extremely limited methods for exhibiting, interpreting and accessing games and gameplay. Strategies based around the collection, maintenance and use of original hardware and software satisfy purists in search of authenticity but are necessarily short-term as technological obsolescence and media deterioration inevitably render systems and games unusable. As such, for most game preservation practitioners, the only viable long-term approaches make use of emulation – that is, the creation of new software environments that mimic the operation of obsolete hardware and allow old games to be run on new computing platforms, thereby sidestepping the need to keep old systems up and running. However, where most critical commentary on emulation centres on evaluating the accuracy of audiovisual reproduction, here I argue that the fixation with authenticity and the fetishisation of the recreation of the original experience leaves the interpretative potential of emulation untapped. Accordingly, in this article I propose a rethinking of the preservation and interpretation functions of emulation. This approach actively embraces the transformative nature of emulation and makes positive play of the innovative ways in which games and gameplay can be altered. Focusing in particular on the unique affordance of the ‘savestate’ that allows games to be arbitrarily suspended and recalled, this article explores how a reconceptualisation of emulation offers curators, scholars and players methods and tools for interpreting, accessing and navigating gameplay in ways hitherto impossible.

Introduction: Videogames in the Gallery


Over the past decade, a growing number of museums and galleries have turned their attentions to videogames as part of their collecting and exhibition activities. Organisations such as ICHEG at the Strong National Museum of Play in Rochester, NY are dedicated to documenting the histories of gaming in both analogue and digital forms in the broader context of play, while European Federation of Games Archives, Museums and Preservation Projects (EFGAMP) members such as the Computerspielemuseum in Berlin and Vigamus in Rome, along with the Nexon Computer Museum in South Korea and the Museum of Soviet Arcade Machines, are dedicated more squarely to the domain of digital gameplay and the industries and technologies of gaming. In the UK alone, public-facing exhibitions of digital games available in 2018/19 include those at the National Videogame Museum (NVM) in Sheffield and the V&A’s Videogames: Design/Play/Disrupt, while the Barbican’s Game On/Game On 2.0, like the Australian Centre for the Moving Image’s Game Masters, continue to tour, demonstrating the global audience for game history.


However, curators and exhibition designers face a number of significant challenges when considering how videogames could and should be displayed for both expert and non-expert audiences. In this article, I wish to share some of the perspectives that underpin and emerge from my work as a researcher and curator at the UK’s National Videogame Museum (NVM), and to report on some of the practical exhibit designs I am currently undertaking as part of the redevelopment of the galleries following the NVM’s relocation to new premises in Sheffield in late 2018. In exploring these links between critical game studies scholarship and creative and curatorial practice, my principal aim here is to propose a rethinking of the uses and utility of software emulation strategies in the contexts of videogame preservation, interpretation and exhibition. My central thesis in this article and in my ongoing curatorial work is that, by fixating on questions of aesthetic and technical authenticity and the slavish reproduction of an often illusory original experience, digital game studies scholars and museum practitioners alike are presently failing to maximally exploit the affordances of the rich interpretative tools at their disposal. It is for this reason, I argue, that current videogame exhibition practice is unambitious and game studies and game history scholarship more broadly remains in search of an effective means of describing, sharing and citing the moments of gameplay that constitute the objects of its enquiry.


What I advocate in this article is a fundamental shift in thinking that asks not how well contemporary software environments or applications are able to emulate the audiovisual, tactile or haptic elements of obsolete hardware, but rather how these tools can provide new ways of accessing games and gameplay so that they might be interrogated, interpreted and studied in novel and innovative ways. Instead of assessing the putative authenticity of reproduction, which seems to me a very limited and limiting approach to emulation, I am interested in exploring what we are able to do with videogames running under emulation that we could not when they ran on their original hardware. How, then, do the new affordances that are revealed when games are run in contemporary emulation environments allow us to experience games in ways previously impossible or in manners unintended in the original implementations? How does emulation help us provide accessible means of play that are otherwise available only to the adept, proficient player? In short, I wish to move our shared discussion away from encountering and conceiving of videogame emulation tools as software simulacra. I wish to shift away from the obsessive concern for the details of authenticity in reconstruction and the operation of code, graphics and sound chips that currently dominate discussions. I wish, by contrast, to move to a position that champions the experiential and interactive differences that arise when games are replayed under modern emulation frameworks and to consider what opportunities these bring. It is these vital areas of divergence from the original operation of the codebase that are most exciting and profitable for scholars, curators and players of the future.


Ultimately, my position is a simple one that proceeds from the acceptance that emulation is necessarily transformative and that, accordingly, the search for the original, authentic experience leads us to an experiential and technological dead end. However, by embracing rather than seeking to ignore or conceal this transformativity, we free ourselves conceptually and practically to harness contemporary emulators as a suite of interpretative tools that allow us to creatively (re)configure videogames as malleable material for exhibition, study and play.


To make this case, I will focus on one seemingly minor functional aspect of the contemporary videogame emulator – the provision of arbitrary savestates. Ostensibly existing to allow the interruption and resumption of gameplay at any point, this feature is immeasurably creatively disruptive. By allowing the freezing of the game state at any point rather than merely at those points available in the original game design, we are afforded extraordinary opportunities to explore game structure, non-linearity, the operation (and erroneous execution) of code, and to make available ways of playing that would otherwise remain available only to the cognoscenti.


Preserving Playability


Given its apparent centrality in defining the videogame form, it is hardly surprising that extant preservation and exhibition scholarship and practice has largely centred on providing environments that maintain the ability to play. From the earliest days of game studies scholarship, play has been conspicuously privileged as the means to attain understanding of videogames. Whether through the implicit criticisms of observational or textual studies offered in Jenkins (1993) and Kinder’s (1991) early work or the more direct methodological imperatives presented by Aarseth (2003) that place play and the acquisition of repertoires of gaming expertise at the centre of practice, the importance of playing as an interpretative and investigative practice is so engrained as to be almost taken-for-granted. More recently, Consalvo et al (2013: 4) similarly recognise what has been variously called the ‘configurative’ (Moulthrop 2004) or co-creative (Banks 2013) nature of play, in noting that, ‘...although audiences of all types are “active” in the many ways delineated by Fiske (1987), players – or perhaps the play position – is unique in that the player must work to (co-)construct the object of interest – the videogame.’ Indeed, we might even go as far as to argue that the field of game studies has defined itself through its positioning of players, their experiences and the act of play at its heart.


If games must be played to be understood, as the oft-heard mantra has it, then (re)creating the conditions in which games may continue to be accessed and encountered in as close to their original state as possible is, surely, the ultimate goal of preservation. As such, preserving playability has become the watchword of game preservation practice so that future scholars and developers might be able to unlock understanding through interaction and museum visitors might be able to experience at first hand the look and feel of historic gameplay otherwise unavailable to them. However, this is not an insignificant task and not one simply solved by accumulating and maintaining vintage hardware collections. The ability to play in the long-term is irrevocably compromised as technologies and systems are rendered obsolete, unsupported and unusable. The inevitable loss of hardware platforms means that new strategies are required. As Kaltman (2016) of Stanford University’s ‘How they got game’ project observes:


We are producing objects that are getting more technologically complex, more interdependent, and less accessible. And we are producing them at a rate that dwarfs their previous historical outputs, and that will terminally outpace future preservation efforts (Kaltman, 2016).

While collections of hardware and software are held by many of the organisations listed above and, in some cases, are used within the handling collections for exhibition and research access, for most videogame preservation and exhibition practitioners, the use of alternatives to original hardware and software proves essential. Indeed, for many practitioners and theorists, such as Monnens (2009: 6), emulation is ‘the only currently viable alternative strategy’ to provide much-vaunted long-term playability (see also Delve and Anderson, 2014).


At its most basic, an emulator is an application running on one system that mimics the behaviours and capabilities of another. It essentially provides compatibility between two platforms that are otherwise incompatible (see Rosenthal, 2015). Software originally designed and written for the emulated system may be run on the new system in unmodified form without any further need for the original hardware. By way of example, the Stella emulator replicates the functionality of the 1977 Atari VCS and allows game data extracted from a 1978 Space Invaders cartridge to be run on a modern PC or Mac (or even Raspberry Pi).
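
To make the principle concrete, the short sketch below (written in Python and purely hypothetical – it describes neither Stella nor any real console) illustrates the fetch-decode-execute loop that sits at the heart of most emulators: a program on the host machine steps through the original game’s binary data and reproduces, instruction by instruction, the behaviour of the emulated processor and its surrounding hardware entirely in software.

```python
# A deliberately minimal, hypothetical emulator core. It reproduces no real
# console; it simply illustrates the principle that an emulator is software
# which steps through the original program's binary data and mimics, in
# software, the behaviour of otherwise obsolete hardware.

class TinyEmulatedMachine:
    def __init__(self, rom_bytes):
        self.rom = rom_bytes          # game data extracted from a cartridge
        self.pc = 0                   # program counter of the emulated CPU
        self.accumulator = 0          # a single emulated register
        self.screen = [0] * 32        # a stand-in for emulated video memory

    def step(self):
        """Fetch one instruction from the ROM, decode it and execute it."""
        opcode = self.rom[self.pc]
        self.pc += 1
        if opcode == 0x01:            # LOAD: copy the next byte into the accumulator
            self.accumulator = self.rom[self.pc]
            self.pc += 1
        elif opcode == 0x02:          # DRAW: write the accumulator into 'video memory'
            self.screen[self.rom[self.pc] % len(self.screen)] = self.accumulator
            self.pc += 1
        elif opcode == 0x00:          # HALT: stop the emulated machine
            return False
        return True

# Running a made-up five-byte 'game' on the host machine:
rom = bytes([0x01, 0x2A, 0x02, 0x00, 0x00])   # LOAD 42; DRAW to cell 0; HALT
machine = TinyEmulatedMachine(rom)
while machine.step():
    pass
print(machine.screen[0])   # 42 - the program ran without its original hardware
```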


Emulators exist for a great many console, handheld and arcade platforms and work continues apace to emulate even comparatively recent gaming systems such as the Nintendo Wii U (discontinued in 2017) and its successor the Nintendo Switch (see Evangelho, 2018). The power of modern computing platforms and the sophistication of emulation software mean that many early games can even be played in a web browser: the Internet Arcade has nearly 2000 arcade games from the 1970s-1990s that are playable via emulators online. Indeed, such is the scope of videogame emulation endeavour that projects such as MAME (Multiple Arcade Machine Emulator), RetroPie and OpenEmu seek to further ease the user experience by gathering together multiple emulation cores and frameworks under a single unified (graphical) user interface. In addition to their user-friendliness, by helping the user manage multiple emulation applications as well as libraries of playable game files, these tools ably demonstrate the crucial way in which emulation breaks the intimate interdependency between gaming hardware and software (see Aravani, 2016). Where once every aspect of the look, sound and feel of a videogame was the result of a direct interaction between code, data and the affordances of the unique combination and capability of graphics, sound and general processing chips, under emulation the general purpose computer flattens these distinctive qualities. In this way, emulation reconfigures the general purpose PC as the ultimate videogaming metaplatform.


The (Usual) Trouble with Emulation


Of course, if all of this seems too good to be true, to some degree at least, it most definitely is and emulation raises a number of significant and, in some cases, as yet unanswered questions. I have noted elsewhere (e.g. Newman, 2018), for instance, that the provision of emulation is extremely unbalanced with some systems enjoying ample coverage and others none at all. As Conley et al (2004: 5) observe, ‘demand for a suitable emulator for a game system correlates directly with the popularity of a video game console when it was available in the retail marketplace.’ As such, access is available only to those systems with a sufficient level of support to warrant and sustain development. The Atari VCS, like other well-known, commercially successful consoles with substantial game libraries such as the Nintendo NES/Famicom and Sega Mega Drive, is well covered. In fact, many of these systems have multiple emulators actively in development, but what of more obscure or less commercially successful systems like the Japanese-only FM Towns Marty, the self-published, independently-developed Flash games, or the shareware and titles cancelled prior to release that Vowell (2009) notes? While the range of systems available under emulation is impressive in one sense, the necessarily limited nature of coverage and the implicit importance of a substantive base of documentation and user interest must surely remind us of Apperley and Parikka’s (2018) criticism of platform studies and its ‘epistemic threshold’. Just as the constitution of a ‘platform’ is dependent on the existence of an archive which is dependent on a certain critical mass as well as a desire to collect and collate materials, that which is emulated is, to some degree at least, that which is already agreed significant and viable. Indeed, the creation of emulation software must surely be seen to be a key element in the constitutive practices of platform-making.


Moreover, there is an almost unbreakable connection between emulation and software piracy. Space does not permit a detailed discussion of these complex issues yet it is useful to note that, although the creation of emulators is generally considered permissible as an act of research, the acquisition of copy-protected data remains unequivocally illegal (see Zainzinger, 2012). In other words, while creating emulators such as Stella or NESticle to mimic the behaviours of the Atari VCS or Nintendo NES respectively is considered permissible as a technical enquiry, ripping the code from a Space Invaders or Super Mario Bros. cartridge to play on those emulators most certainly is not. As Good (2018) and Maiberg (2018) have noted, Nintendo has taken action against a number of filesharing websites throughout 2018 to limit the distribution of such files and the company publishes extensive support documentation on its corporate website outlining its case, linking emulation and the distribution of illegally obtained software, and the damage this does to the industry, creators and consumers. It is important to note that the same restrictions that affect end-users also presently affect those working in the heritage sector as few exemptions exist (though see the US Copyright Office, Library of Congress (2018) ruling on exemptions granted in the US). Elsewhere, as EFGAMP notes:


Furthermore games are considered to be special subject matter under copyright law. Games are hybrid works, i.e. they consist of software, audio-visual elements and sometimes additional subject matter like databases. Even now it is not clear whether their protection and use is governed by the general copyright rules (in Directive 2001/29 and its corresponding national implementations in particular), the rules about computer programs (in Directive 2009/24) or both. This makes legal assessments tricky, especially those about use under exceptions (like archiving) or when the game is – as is often the case – protected by TPM [Technical Protection Measure] (EFGAMP, 2017).

However, crucial though these issues of coverage and legality are, it remains the case that the majority of debate and criticism of emulation in scholarly circles, as well as among players, is reserved for the matter of authenticity – or rather the detailed analysis of the lack thereof. Quite simply, the creation of videogame emulators is extremely technically demanding. Perhaps not unrelated to its action against sites plundering and sharing its games, over the last couple of years Nintendo has released recreations of its 1980/90s hardware. The modern, miniaturised iterations of the Nintendo Entertainment System/Famicom and Super Nintendo Entertainment System/Super Famicom not only carefully select (and selectively forget) from the back catalogue, thereby continuing to explicitly define the playable canon, but, most importantly, include none of the original componentry of their predecessors. These are essentially NES/SNES shaped cases that contain modern embedded computing systems running NES/SNES emulators and game data. And demonstrating how significant the challenge of accurate emulation truly is, even Nintendo’s efforts to recreate its own systems are subject to considerable criticism, especially over the quality of graphics and sound reproduction (see Richretro, 2017; Ciolek, 2016; List, 2016; Nerdly Pleasures, 2016; Great Hierophant, 2016; Linneman, 2017). In fact, such is the difficulty that the company accompanied some of its earlier efforts at re-releasing games with caveats noting:


This NINTENDO GAMECUBE software is a collection of titles originally developed for other Nintendo Systems. Because of the process of transferring software from Game Paks to a Game Disc, you may experience slight sound irregularities or brief pauses during which the system loads data from the Game Disc. Such instances are normal and do not indicate defective software or hardware (Nintendo, 2003: 3).

With emulator authors often concerned with speed of performance and with providing compatibility with the widest possible range of the most popular titles for the target platform, hacks and tweaks to code are commonplace in order that known problems and inaccuracies can be routed around and compensated for. As byuu, the author of bsnes, an unofficial SNES/Super Famicom emulator, explains:


What typically happens is that the problems are specifically hacked around. Both ZSNES and Snes9X contain internal lists of the most popular fifty or so games. When you load those games, the emulators tweak their timing values and patch out certain areas of code to get these games running (byuu, 2011).

It is easy to overlook the magnitude of the technical demands when faced with current console platforms such as the Nintendo Switch running under emulation on the PC (see Evangelho 2018) or when confronted with the Internet Arcade’s emulators running in HTML5 in a web browser. Yet, it remains the case that emulation is presently far from perfected even for comparatively old systems with well-documented components and subsystems. As Guttenbrunner et al (2010: 86) note:


Even popular systems of the first four generations are not perfectly emulated today. The more recent the system, the lower the degree of accuracy. Of two tested games on two emulators for the Atari Jaguar only one game was playable. The two games for the Sony PlayStation 2 proved entirely unplayable.

The imprecision of emulation whether judged against the ‘original’ hardware reference (see McDonough et al, 2010; Oakvalley, 2015) or between different emulators has also been ably explored by game historians such as Altice who even goes as far as to suggest the need for incorporating reference to the specific emulator in the citation for videogames:


Since emulators vary widely in accuracy, the emulator listing provides the reader with information about how the author viewed the particular file. If NESticle is listed rather than Nintendulator, for instance, the reader will know that the file’s raster effects, sound, or palettes may have been emulated improperly (Altice, 2015: 341).

Setting aside discussions of the often illusory nature of ‘original’ hardware reference points where audio and video signals are mediated by visual and auditory displays of wide and unpredictable variation (see Newman 2018), it is important to note just how much of the discussion, critique and creative endeavour surrounding videogame emulation centres on the assessment and development of authenticity. Notwithstanding Swalwell’s (2017) discussions of the need to move beyond the fetishised ‘original experience’, the far-reaching desire to deliver long term playability for researchers and museum visitors continues to see emulation take centre stage. As Foteini Aravani, curator at the Museum of London, explains:


…we extract the source code of the game, and run it on a small, very simple computer called a Raspberry Pi, but we keep all the original devices. With the games originally released for the ZX Spectrum, you play the game with the Spectrum keyboard (Aravani, 2016).

I have written elsewhere (e.g. Newman, 2012) on the alternatives to preserving playability, on the utility of documentary approaches that seek to reconceive play as part of the object of preservation rather than its outcome, and on practical examples of the implementation of these strategies in the form of the NVM’s ‘Game Inspector’ exhibits (see Newman, 2018). Here, though, I wish to set aside those arguments and return to a consideration of how playability might form the basis of a preservation and exhibition practice and, in particular, how we might reframe our uses of emulation to provide new ways of accessing and interpreting – and perhaps also citing – gameplay.


The discussions of emulation above, whether coming from preservation practitioners, game studies scholars or from players and developers, centre on the ability of the new software system to replicate the look, feel and sound of otherwise inaccessible or obsolete hardware. Indeed, such analyses are often performed in painstakingly minute detail. Altice’s commitment to recognising and foregrounding differences in the rendering of raster graphics and Oakvalley’s (2015a) interrogation of the specifics of the Commodore 64 soundchip’s non-linear distortion clearly elucidate a level of analytical depth that, if nothing else, eloquently speaks to the maturity of game studies as a discipline. And yet, while this level of detail in the emulation of the NES’ graphics or the Commodore 64’s SID chip in contemporary music players and plugins (see Newman 2017) might benefit particular audiences for preservation, the almost single-minded focus on authenticity blinds us to the quite transformative affordances of emulation for interpretation and access.


The Transformativity of the Savestate


One area in which almost all contemporary emulators differ from the original hardware and software they replicate is in the provision of arbitrary saving. At any point during play, the full state of the emulated system may be frozen and stored. The resulting snapshots of every aspect of the game’s hardware and software status may then be reloaded with gameplay picking up from the exact same moment. And ‘exact’ is precisely what it is. The reloaded savestate continues from precisely the same frame, the same processor clock cycle, with data loaded in precisely the way it was when play was suspended. But, surely it has always been possible to save one’s progress in videogames? In fact, that is only partly true and, even then, it is very seldom the case that the precise state of the entire system is saved in this manner. And it is this that makes the emulator savestate so utterly transformational. In order to explain, a very brief survey of saving videogames is required.


In the first instance, a great many videogames offer no ability to save progress whatsoever. Though they are far from the only cases, the overwhelming majority of coin-operated arcade games fall into this category. When all of their lives are expended or they have reached the infamous ‘killscreen’ (see Newman, 2016), players have two options. They may walk away feeling dejected or euphoric depending on their performance, or they may deposit another coin to play again. One thing they have precisely no control over, however, is from where their gameplay will start. That answer is hard coded into the game design and play always starts from level 1. It matters not whether the previous player reached level 10, 100 or 256: the next game begins from the beginning. Some coin-op games, particularly those developed by SNK in their 1990s NEO•GEO series, allowed players to save their progress and transfer data between home and arcade but this was an expensive proposition and a comparatively rarely implemented one at that. The business model of the commercial arcade strongly implies gameplay that is time-limited (three laps around a racing circuit or timed bouts of combat that last no more than 90 seconds each) or that is subject to an exponentially increasing difficulty that ensures short game sessions even for seasoned players so as to maximise the throughput of players (for which, read, the number of coins deposited). By way of example, the Arcade Flyer Archive provides a wealth of examples of cabinets marketed precisely on their earning ability in this manner (Konami’s candour is more than evident in its boast to arcade operators that with its Dancing Stage Fusion game, ‘We’ve got the music – you get the income!’).


In the home, things are somewhat different in that certain game genres are wholly predicated on the ability to save progress and split gameplay across multiple sessions. Role Playing Games like those in the Final Fantasy or Legend of Zelda series are routinely marketed and discussed in terms of the number of hours of gameplay they offer (see the wonderfully-named Game Lengths website for a collaboratively-authored database of completion times). Clearly, there is no expectation that these quests be tackled in single sittings and games such as these are designed to be played over multiple play sessions. That said, we should note the dedication of some players. As McFerran (2015) reports, ‘For one Super Famicom fanatic, the thought of losing his save file for Umihara Kawase was simply too terrifying to comprehend, so he decided to keep his console switched on for 20 years in order to preserve it.’


Even where save facilities are provided in games, their implementation varies considerably. It is very often the case that gameplay can be saved only at designated points in the game’s space and narrative. Capcom’s Resident Evil is a case in point in requiring the player to locate and use the virtual typewriters and ink ribbons strategically, and sparingly, scattered throughout the gameworld. Similarly, Nintendo’s Legend of Zelda series offers the ability to save progress, though it is handled very differently across different titles in the 30-year series. Most often, even where gameplay can be interrupted and saved at any point, the player is warped back to a point in time or space preceding the moment of saving, revealing that the game actually hard codes specific save points into its structure. In this way, while the save may be initiated at any point, gameplay is still portioned in a manner dictated by the game’s designer and it remains rare that gameplay may be restarted from precisely the same point that it was suspended.


Unless, that is, one is using an emulator. Of course, all the usual save points and functionalities that are designed and coded into the game remain operable (notwithstanding compatibility issues) but, on top of this, the ability to suspend the entire operation of the virtual machine is overlaid. This distinction is crucial. The emulator does not simply emulate the saving of the game nor does it allow the game to be saved in ways that override or confound that coding. The emulator provides no more typewriters or ribbons. Rather, the emulator saves the entire state of the console and every aspect of the software it is running. The emulator essentially allows the console, computer or arcade machine to run as a virtual machine and, crucially, it allows this machine to be suspended – and restarted – at any arbitrary point. The emulator effectively pauses time and puts the virtual console and the game running on it into a state of suspended animation. In this way, just as the emulator breaks the otherwise immutable interdependency between platform and game, hardware and software, the provision of the arbitrary save/suspend function breaks the connection between the functionality of the game and the provision made in that game’s code. Saving within the game as intended and designed is overridden and rendered insignificant when the entire state of the virtual machine can be paused.
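
The distinction can be made concrete by extending the earlier hypothetical sketch. Saving within the game means writing a designer-sanctioned checkpoint into the game’s own data; a savestate, by contrast, serialises the entire emulated machine – program counter, registers, memory – so that play resumes from exactly the same instant. The functions below are an illustrative sketch of that principle only, not the snapshot format of any real emulator.

```python
import copy
import pickle

# A sketch of the savestate principle, reusing the hypothetical
# TinyEmulatedMachine from the earlier example. The point is that the
# *entire* machine - program counter, registers, memory - is captured,
# not a checkpoint that the game's designers happened to provide.

def save_state(machine):
    """Freeze every aspect of the emulated machine at this exact instant."""
    return pickle.dumps(copy.deepcopy(machine.__dict__))

def load_state(machine, snapshot):
    """Resume from precisely the same frame and processor state."""
    machine.__dict__.update(pickle.loads(snapshot))

# Usage, continuing the earlier hypothetical example:
machine = TinyEmulatedMachine(rom)   # fresh emulated machine
machine.step()                       # play proceeds...
moment = save_state(machine)         # ...and is suspended at an arbitrary point
machine.step()                       # play continues and the state diverges,
load_state(machine, moment)          # until the snapshot restores the exact moment
```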


Saving gameplay at any point is a neat trick, though we might argue that it lessens the jeopardy of certain gameplay encounters in the way that Crawford (1995) has noted in his analysis of the ‘save-die-retry’ game design of Doom II’s ‘Barrels o’ Fun’ level. The pressure involved in searching for a typewriter when under threat from hordes of the undead is surely a contributory factor in generating the ambience and aura of the early Resident Evil experience. However, the value to the scholar, preservation practitioner and exhibition designer of the emulator savestate affordance is immeasurable, particularly when we consider that we are not limited to just one save per game but can, instead, assemble many waypoints derived from different players and different playings. The creation of arbitrary savestates and the subversion of the logic of game design is already an important new affordance, but it is the ability to collate and curate multiple savestates that points immediately to the emergence of a rich and valuable means of documenting and accessing games. This ability to chart paths through the game and, absolutely crucially, to be able to return to them and reload the entire working state of the system, its code and the content of its data registers, affords the traversal of the game in non-linear ways that allow a disruption of narrative and spatiality. In doing this, the emulator allows us to tackle some of the most difficult challenges faced by those seeking to interrogate, interpret – and, quite frankly, play – videogames.


In the final section of this article, I will explore how I and the team at the National Videogame Museum are using the transformative affordance of the arbitrary savestate to create a variant of our Game Inspector exhibit to deal with three of the key problems faced by exhibition designers and game studies scholars alike:

  1. The Non-linearity of videogames

  2. The linearity of videogames

  3. The (un)predictable malfunctioning of videogames


1. The Non-Linearity of Videogames


The structures of videogames vary widely but titles such as Nintendo’s StarFox series ably demonstrate the prevalence of non-linearity and the impossibility of revealing the entirety of a game’s experiential, narrative or spatial potential in a single playing. Discrete routes through the game’s potentiality exist and are even explicitly traced out with way markers indicating the manner in which the space is available for traversal. As with Sega’s Outrun driving game that features branching pathways in the road, it is immediately apparent that the routes presented are mutually exclusive and that embarking on one necessarily closes down the possibility of accessing the other. As such, certain levels can be reached only via a particular path, which necessitates replaying the game and making a self-consciously different decision in order to explore the path untravelled.



In games such as the Legend of Zelda series, the existence of particular pathways is not made so explicit and the consequences of choices made by players to venture one way or another are not revealed in the game. Indeed, that the player has made a choice through their playful interaction may not be evident to them at all as the decision in the game engine might be based on the amount of time a player took to complete a given task, whether they lost a life while trying, or could even be contingent on something as uncontrollable as random number generation. This latter point is excellently revealed in the deep studies of gameplay presented by players (often making use of code analysis) and notated in playthroughs of games such as Narcissa Wright’s commentated Ocarina of Time speedrun (Wright, 2013; 2014). In fact, as is very often the case with videogames where hidden spaces and features abound (often referred to as ‘Easter Eggs’ in an allusion to the imperative to hunt for them), there is more to the space than is initially revealed. New levels, new characters, new narrative trajectories and detours might be revealed only after certain conditions are met or the player moves through the space in a particular way or accumulates a certain number of points. Even accepting that many games employ a structure known as ‘bottlenecking’ that brings disparate narrative threads or routes through spaces together at particular points to facilitate shared moments of experience before allowing divergence again, it is perfectly possible that two players playing ‘the same game’ see, hear and experience markedly different adventures. Sonic the Hedgehog’s Special Stage, which sends the blue Erinaceinae into a parallel universe of pseudo-3D but wholly psychedelic pinball that says as much about the aspirations of early 1990s graphic designers as it does about game designers or the capabilities of the Mega Drive console, is only accessible if the player passes the finishing point of a level having collected 50 rings. It is possible to enter the first Special Stage after just a couple of minutes of play just as it is entirely possible to complete the entire game without once entering a Special Stage. Super Mario Galaxy’s space literally transforms and is terraformed as play progresses with levels, places and opportunities for exploration disappearing as new ones are revealed.



Challenge 1. In the face of such non-linearity, contingency and structural exclusivity, how might we provide access to specific moments or sequences of a videogame? How might we exhibit or discuss a game whose extent is not clear or where two particular facets that we might wish to foreground require two completely separate and contradictory playings or that might be inadvertently navigated around by a player playing in a particular manner?


2. The Linearity of Videogames


Of course, in studying structure and design, we soon note that the non-linearity of videogames is often vastly overstated (see Ince, 2006; Pearce, 2002) and that there remains a surprising amount of sequentiality evident in even the most apparently multicursal experiences. Indeed, fan practices such as ‘sequence breaking’, which exploit glitches and bugs within the game’s code, exist precisely to creatively subvert the inherent linearity of structures that forbid access to Level 2 before Level 1 is completed or that put certain (often increasingly powerful) techniques or equipment out of reach until specific sequences of gameplay have been negotiated (see Scully-Blaker, 2014).


The Special Stage in Capcom’s Street Fighter II, in which the player is given a respite from pummelling another humanoid player and instead focuses their attentions on destroying a family saloon car with their bare feet and fists, is a wonderfully excessive spectacle, yet it is one available only after a number of victories have been chalked up. The linearity of structure that is evident in games such as Nintendo’s seminal Super Mario Bros. is notable in itself but where it becomes particularly problematic for the exhibition designer or scholar is that access to the game’s experiential potential is necessarily linked with competence and capability. Quite simply, reaching World 1-2 demands a level of proficiency in gameplay execution – obstacles avoided and evaded by skilful running and jumping, and the judicious use of attacking and defensive manoeuvres – that may not be available to every potential player. To reach the game’s denouement at World 8-4 is no mean feat. With no in-built saving feature and no ability to commence a new game from anywhere other than the start of World 1-1, to access the game in full is to be an expert player. The Super Mario Bros. cognoscenti will be aware of the presence of ‘Warp’ zones in the game that allow vast swathes of the game’s space to be skipped, but it goes without saying that such features are hidden and presuppose either a priori knowledge or a particularly exploratory approach to gameplay that disregards the countdown timer and imperative to race to the finishing flag. Moreover, while they allow the player to leap forward, the Warp Zones do nothing to alter the foundational linearity of the game’s structure. Indeed, as the game’s sequencing only allows forward movement, warping from World 1-2 to 4-1 actually renders it impossible to even attempt the remainder of World 1 and the entirety of Worlds 2 and 3. It is no wonder, given the centrality of proficiency and the nontrivial nature of the actions required even to traverse such a ‘cybertext’ (Aarseth, 1997), that Aarseth in 2003 was so exercised about the need for scholars of videogames to hone their gaming chops!



Challenge 2. How then, in the face of such linearity and a contingency on the knowledge and performance capability of the player, might we provide access to a specific moment or sequence in a game without requiring museum visitors to be expert players or to persevere through gameplay for many hours?


3. The (Un)predictable Malfunctioning of Videogames


So far, we have dealt with aspects of games as they are designed and the means by which hidden areas or the requirement for increased proficiency are coded into game designs in order to maximise their replayability and perceived value. Of course, some of the most fascinating features of videogames occur as a consequence of wholly unanticipated interactions between players and code and even between hardware elements. As we noted above, numerous games, including the coin-operated versions of Pac-Man and Donkey Kong and the NES version of Tetris, each have a ‘killscreen’ which, as the name implies, brings the game to a halt in an often glitchy mess of grabbed graphics and misplaced characters. Some games may even be encouraged to glitch through altogether more mechanical means, as with Goldeneye 007’s ‘Get Down’ cartridge tilting technique.


I have written before on the Super Mario Bros. ‘Minus World’ which is a procedurally generated level brought into existence when a player clips through a wall in an unintended way and the game’s engine loads erroneous data that just happens to spawn a functioning (if experientially unremarkable) stage (see Newman, 2016). Examples of glitches and bugs are manifold with some, like the legendary ‘Glitch Pokémon’ MissingNo. and ‘M having given rise to their own subcultures and even necessitating Nintendo issuing statements to quell speculation that they were intentional and advising on how their capture might actually overwrite all game progress.


MissingNO is a programming quirk, and not a real part of the game. When you get this, your game can perform strangely, and the graphics will often become scrambled. The MissingNO Pokémon is most often found after you perform the Fight Safari Zone Pokémon trick. To fix the scrambled graphics, try releasing the MissingNo Pokémon. If the problem persists, the only solution is to re-start your game. This means erasing your current game and starting a brand new one (Nintendo n.d.).

MissingNo. and ‘M are the products of glitches. They manifest themselves as real Pokémon though, in fact, their names and graphical representations reveal their status as errors. MissingNo, or ‘missing number’, refers to a data call in the program that searches for a non-existent Pokémon from the checklist and is represented on screen as a garbled mass of pixels, roughly in the shape of an inverted L and looking not dissimilar to a Tetris block. Both Pokémon are anomalies arising from coding errors under a specific set of repeatable conditions.



The conditions are repeatable but, like the techniques and interactions required to enter the Minus World, they are by no means straightforward or reliable. The community-authored Bulbapedia resource details the most common way of summoning MissingNo in the Pokémon Red and Blue GameBoy games.


If we bear in mind that these instructions outline only the most common way of summoning MissingNo in one version of Pokémon and that countless other iterations occur in different titles, we begin to arrive at a sense of the difficulty of reliably accessing, citing or exhibiting such moments of highly contingent gameplay. And, if we look beyond MissingNo to game glitches that do not rely on such deterministic engines, we find examples that arise from the interactions and manipulation of pseudo-random number generators or that depend on fiendishly complex, frame-accurate precision in execution as is the case with many exploits and glitches in Nintendo’s Legend of Zelda series (see ‘Wrong Warping Explained’, for instance). With so many glitches either relying on or giving access to spaces and parts of the game world that are normally ‘out of bounds’, these coding anomalies give unique access and insight into the construction of spaces and the interactions between code and visual representation, yet they are remarkably difficult to execute with any degree of predictability.


Challenge 3: How then might we provide access to aspects of gameplay, to capabilities, or to game spaces and characters that are the products of glitches, coding anomalies and the unintended and often unpredictable interactions between data structures and player inputs?

Where to (Re)start?


One solution to each of these challenges is to set aside the original hardware and software altogether and to run the games under emulation in order to make use of arbitrary state saving for the suspension and recall of gameplay at the specific moments identified for their interpretative or citation significance. Placing gameplay within the container of emulation renders it malleable and affords the opportunity of building a library of access points that restart the game at any number of points, with different prior conditions, or as the result of different conscious, contingent or random occurrences. Returning to Altice’s point, this approach allows us to conceive of a means of citation that not only recognises the influence of the emulator on the gameplay experience, but that uses the affordance of the emulator savestate to share and make playable the precise moment of cited gameplay. Designing an interface to collect, capture and recall these savestates defines the current work of the NVM team and forms the basis for the next development of Game Inspector exhibits. In this way, this work dovetails with the pioneering research undertaken by Kaltman et al (2017) on the GISST (Game and Interactive Software Scholarship Toolkit) project, which seeks to devise an extensible architecture and online archival storage platform for such save data.
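
A sketch of what such a ‘playable citation’ record might contain is given below. The field names and values are illustrative assumptions rather than the GISST schema or the NVM’s actual Game Inspector data model; the point is simply that a citation can bundle together the game, the specific emulator and version through which it was experienced (Altice’s point), and the savestate that restores the cited moment for the reader to replay.

```python
from dataclasses import dataclass

# A hypothetical sketch of a 'playable citation' record. The field names are
# illustrative assumptions, not the actual GISST schema or the NVM's Game
# Inspector data model.

@dataclass
class PlayableCitation:
    game_title: str          # the cited game
    emulator: str            # which emulator mediated the cited experience
    emulator_version: str    # accuracy varies between versions, so record it
    savestate_file: str      # snapshot that reloads the precise cited moment
    description: str         # what the reader should encounter on resuming play
    contributor: str         # whose playing produced this waypoint

# An example entry in a curated library of access points (all values invented):
minus_world = PlayableCitation(
    game_title="Super Mario Bros.",
    emulator="ExampleNESEmulator",           # placeholder, not a real product
    emulator_version="1.2.3",
    savestate_file="states/minus_world.state",
    description="Entry into the 'Minus World' via the World 1-2 wall clip",
    contributor="NVM curatorial team",
)
```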


My previous work on game exhibition and interpretation has proceeded from the deliberately provocative thought experiment that play is too limited a methodological tool to reveal these multiple facets and complexities of videogames. In this sense, that work has self-consciously swum against the tide of methodological orthodoxy in game preservation, exhibition and game studies. It follows that the starting point of the Game Inspectors, as manifestations of this ‘gameplay preservation’ approach, is that play is not simply important in configuring and constituting the videogame but that it is too important, too constitutive and too configurative. As such, there is a strong argument that play should be part of the object of preservation rather than its outcome. Certainly, one answer to the challenges set above is to capture instances of the acts and performances of play. To have archival recordings of the entry into the Minus World not only documents the process audiovisually but unlocks the interpretative potency of video editing. One of the counterintuitive consequences of capturing gameplay is that, far from freezing it, it renders it malleable in new ways. Freeze framing, fast forwarding, rewinding, zooming in and out, altering playback speed, captioning, annotating and subtitling all add plasticity and interpretative opportunity. To view a recording of the Minus World being entered or an analysis of a world record performance of Super Mario Bros. or Ocarina of Time reveals ways of playing available only to the most knowledgeable and adept, while offering insight into the cultures and practices of play and the operation of code that would be hard to reveal through a first-hand exploration of the game.


However, such an approach undeniably removes the tactility of first-hand performance, and to call the removal of play from the videogame experience ‘provocative’ is perhaps understating things. Yet by harnessing the affordances of emulation and the ability of such tools to transform the way in which games are played, replayed and accessed, we arrive at an intriguing position. It is a position in which play retains its primacy as a means of exploring and experiencing yet can also exist within a curated, interpretative framework. Such a context is one that is in interaction and dialogue with the game as designed but that is not beholden to its structures and regimes. Accordingly, play can be guided and enabled in specific ways that are either hindered or perhaps even rendered impossible by the game in its original state. In essence, what we create is an environment in which the game can be played, encountered and interrogated on new terms. And it is by positively harnessing the transformative power of gameplay under emulation that this new interpretative framework emerges.


Be Kind Rewind


In this article, I hope to have demonstrated that even something as apparently simple as the provision of arbitrary saving of the machine state presents interpretative potentials unavailable to players of a game in its original form. I envisage this curated environment for play as wholly complementing the Game Inspector model, which captures, annotates and analyses gameplay. By guiding players through the complexities of linearities and non-linearities, by removing the contingency on skill and proficiency to access spaces and narratives, and by offering ways of exerting control over the unpredictable elements and interactions of hardware, software and performance, we are offered the opportunity to ‘inspect’ games in wholly new ways and to re-energise and democratise play as a means of exploration. In order to make games more accessible in this manner, however, it is imperative that we rethink our approaches to emulation. It is not necessary to divert attention from questions of authenticity and the search for fidelity in reproduction (notwithstanding the illusory nature of many of these reference points), but it is essential that we fully explore how the transformations that occur when games are operated and played under emulation can be positively utilised to aid interpretation.


And of course, the potency of the interpretative opportunities discussed in this essay arises simply from the arbitrary savestate. If we look further into the transformative functionality of current emulators, we find yet more riches. The ability to rewind gameplay in real time is a feature recently added to the RetroArch framework upon which many videogame emulators are built. Rewinding, along with fast forwarding and fine-grained control over slow motion, brings some of the grammar of video editing to live gameplay as well as altering the jeopardy involved and the skill required. The ability to subvert structures, play with time and move through game spaces at macro and micro levels immediately opens up new ways of playing, exploring and negotiating. It is for this reason that at least part of the challenge of saving games will, doubtless, involve saving (and re-saving) games.
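
One common way of implementing such rewinding – and, to be clear, the sketch below is a general illustration of the technique rather than RetroArch’s actual implementation – is simply a rolling buffer of frequent savestates: record a snapshot every frame, discard the oldest, and pop the snapshots back one by one while the rewind control is held. It reuses the hypothetical save_state and load_state functions introduced earlier.

```python
from collections import deque

# A sketch of rewinding via a rolling buffer of savestates. This illustrates
# the general technique only and is not RetroArch's actual implementation.
# It reuses the hypothetical save_state and load_state functions above.

class RewindBuffer:
    def __init__(self, capacity=600):            # roughly ten seconds at 60 snapshots per second
        self.states = deque(maxlen=capacity)      # the oldest snapshots fall off the end

    def record(self, machine):
        """Called once per emulated frame while playing forwards."""
        self.states.append(save_state(machine))

    def rewind(self, machine):
        """Called once per frame while the rewind control is held."""
        if self.states:
            load_state(machine, self.states.pop())

# Holding 'rewind' therefore replays the stored snapshots in reverse order,
# visibly running gameplay backwards in real time.
```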



References

  • Aarseth, E. (1997) Cybertext: Perspectives on Ergodic Literature, Baltimore and London: Johns Hopkins University Press.

  • Aarseth, E. (2003) ‘Playing Research: Methodological approaches to game analysis’, DAC, Melbourne (RMIT). http://hypertext.rmit.edu.au/dac/papers/Aarseth.pdf

  • Altice, N. (2015) I Am Error: The Nintendo Family Computer / Entertainment System Platform, Cambridge, MA: The MIT Press.

  • Apperley, T. and Parikka, J. (2018) ‘Platform studies’ epistemic threshold’, Games and Culture, 13(4), 349-369.

  • Aravani, F. (2016) ‘Play's the thing: keeping old games alive’, Museum of London [blog]. http://www.museumoflondon.org.uk/discover/plays-thing-keeping-old-games-alive

  • Banks, J. (2013) Co-creating Videogames, London: Bloomsbury Academic.

  • Bulbapedia (2019) ‘MissingNo.’, Bulbapedia: the community-driven Pokémon encyclopedia. https://bulbapedia.bulbagarden.net/wiki/MissingNo.

  • Crawford, C. (1995) ‘Barrels o’ fun’, Journal of Computer Game Design, Volume 8. http://www.erasmatazz.com/library/the-journal-of-computer/jcgd-volume-8/barrels-o-fun.html

  • byuu (2011) ‘Accuracy Takes Power: One Man’s 3GHz Quest to Build a Perfect SNES Emulator’, Ars Technica, 10 August. https://arstechnica.com/gaming/2011/08/accuracy-takes-power-one-mans-3ghz-quest-to-build-a-perfect-snes-emulator/

  • Ciolek, T. (2016) ‘NES Classic Edition: Game Review’, AnimeNewsNetwork, 15 November 2016. https://www.animenewsnetwork.com/review/game/nes-classic-edition/.108810

  • Consalvo, M., Mitgutsch, K., and Stein, A. (Eds) (2013) Sports Videogames, Abingdon: Routledge.

  • Conley, J., Andros, E., Chinai, P., Lipkowitz, E. and Perez, D. (2004) ‘Use of a Game Over: Emulation and the Video Game Industry: A White Paper’, Northwestern Journal of Technology and Intellectual Property, 2(2). http://www.law.northwestern.edu/journals/njtip/v2/n2/3/Conley.pdf

  • Delve, J. and Anderson, D. (Eds.) (2014) Preserving Complex Digital Objects, London: Facet Publishing.

  • EFGAMP (2017) Statement on the “Proposal for a Directive of the European Parliament and of the Council on copyright in the Digital Single Market”, (COM(2016) 593 final, 14.9.2016) from the perspective of the preservation of computer and video games as part of the digital cultural heritage. http://www.vigamus.com/efgamp/wp-content/uploads/2018/10/Statement_EFGAMP_on_Games-Heritage_final_2017_06_29.pdf

  • Evangelho, J. (2018) ‘2 Nintendo Switch emulators are live and running gameplay’, Forbes, 19 April 2018. https://www.forbes.com/sites/jasonevangelho/2018/04/19/2-nintendo-switch-emulators-areoperational-and-running-gameplay/#68d13ad1ef2f

  • Fiske, J. (1987) Television Culture. London: Routledge.

  • Good, O. S. (2018) ‘Nintendo sues to shut down two big ROM sites’, Polygon, 22 July 2018. https://www.polygon.com/2018/7/22/17600008/nintendo-roms-lawsuit-cease-desist

  • Great Hierophant (2016) ‘What is wrong with the NES Classic Edition noise emulation?’, NESDEV forums. http://forums.nesdev.com/viewtopic.php?f=3&t=15073#p182305

  • Guttenbrunner, M., Becker, C. and Rauber, A. (2010) ‘Keeping the Game Alive: Evaluating Strategies for the Preservation of Console Video Games’, The International Journal of Digital Curation, 5(1). https://doi.org/10.2218/ijdc.v5i1.144

  • Ince, S. (2006) Writing for Videogames, London: A&C Black.

  • Kaltman, E. (2016) ‘Current Game Preservation is Not Enough’, How They Got Game. https://web.stanford.edu/group/htgg/cgi-bin/drupal/?q=node/1211

  • Kaltman, E., Osborn, E., Wardrip-Fruin, N. and Mateas, M. (2017) ‘Game and Interactive Software Scholarship Toolkit (GISST)’, Foundations of Digital Games 2017. http://fdg2017.org/papers/FDG2017_demo_GISST.pdf

  • Linneman, J. (2017) ‘Nintendo Classic Mini NES review’, Eurogamer, 5 February 2017. https://www.eurogamer.net/articles/digitalfoundry-2017-nintendo-classic-mini-nes-review

  • List, J. (2016) ‘Linux on Your NES Classic Edition’, Hackaday, 13 November 2016. https://hackaday.com/2016/11/13/linux-on-your-nes-classic-edition/

  • Jenkins, H. (1993) ‘“x logic”: repositioning Nintendo in children’s lives’, Quarterly Review of Film and Video, 14(4): 55–70.

  • Kinder, M. (1991) Playing With Power in Movies, Television and Video Games: From Muppet Babies to Teenage Mutant Ninja Turtles, London: University of California Press.

  • Maiberg, E. (2018) ‘Nintendo's Offensive, Tragic, and Totally Legal Erasure of ROM Sites’, Motherboard, 10 August 2018. https://motherboard.vice.com/en_us/article/bjbped/nintendos-offensive-tragic-and-totally-legal-erasure-of-rom-sites

  • McDonough, J., Olendorf, R., Kirschenbaum, M., Kraus, K., Reside, D., Donahue, R., Phelps, A., Egert, C., Lowood, H. and Rojo, S. (2010) Preserving Virtual Worlds Final Report. http://hdl.handle.net/2142/17097

  • McFerran, D. (2015) ‘Weirdness: Gamer Keeps Super Famicom On For 20 Years To Preserve Umihara Kawase Save Data’, NintendoLife (31 Dec 2015). http://www.nintendolife.com/news/2015/12/weirdness_gamer_keeps_super_famicom_on_for_20_years_to_preserve_umihara_kawase_save_data

  • Monnens, D. (2009) ‘Losing Digital Game History: Bit by Bit’, in H. Lowood (ed.) Before It’s Too Late: A Digital Game Preservation White Paper, American Journal of Play, 2(2), pp. 139-166. http://www.journalofplay.org/sites/www.journalofplay.org/files/pdf-articles/2-2-special-feature-digital-game-preservation-white-paper.pdf

  • Moulthrop, S. (2004) ‘From Work to Play: Molecular Culture in the Time of Deadly Games’, in N. Wardrip-Fruin and P. Harrigan (eds) First Person: New Media as Story, Performance, and Game, Cambridge, MA: MIT Press, pp. 56–70.

  • NerdlyPleasures (2016) ‘A Better Alternative to the NES Classic Edition’, NerdlyPleasures, 11 November 2016. http://nerdlypleasures.blogspot.com/2016/11/a-better-alternative-to-nes-classic.html

  • Newman, J. (2012) Best Before: Videogames, Supersession and Obsolescence, London: Routledge.

  • Newman, J. (2016) ‘Mazes, monsters and multicursality. Mastering Pac-Man 1980–2016’, Cogent Arts & Humanities (2016), 3: 1190439. http://dx.doi.org/10.1080/23311983.2016.1190439

  • Newman, J. (2017) ‘Driving the SID chip: Assembly Language, Composition, and Sound Design for the C64’, GAME: The Italian Journal of Game Studies, 6/2017. https://www.gamejournal.it/driving-the-sid-chip-assembly-language-composition-and-sound-design-for-the-c64/

  • Newman, J. (2017) ‘World -1: Glitching, Codemining and Procedural Level Creation in Super Mario Bros.’, in M. Swalwell, A. Ndalianis and H. Stuckey (eds) Fans and Videogames: Histories, Fandom, Archives, New York: Routledge, pp. 146-162.

  • Newman, J. (2018a) ‘The Music of Microswitches: Preserving Videogame Sound—A Proposal’, The Computer Games Journal, 7: 261. https://doi.org/10.1007/s40869-018-0065-8

  • Newman, J. (2018b) ‘The Game Inspector: a case study in gameplay preservation’, Kinephanos, August 2018. https://www.kinephanos.ca/2018/the-game-inspector-a-case-study-in-gameplay-preservation/

  • Nintendo (2003) The Legend of Zelda: Ocarina of Time: GameCube Collector’s Edition, instruction manual.

  • Nintendo (n.d.) ‘Game Boy Game Pak Troubleshooting - Specific Games’, nintendo.com. https://www.nintendo.com/consumer/systems/gameboy/trouble_specificgame.jsp#missingno

  • Oakvalley, S. (2015a) ‘SID Emulation today is perfect, so why bother?’, Stone Oakvalley's Authentic SID Collection (SOASC=). http://www.6581-8580.com/post.php?id=000227022015013038

  • Oakvalley, S. (2015b) ‘The SID Chips visualized & benched’, Stone Oakvalley's Authentic SID Collection (SOASC=). http://www.6581-8580.com/post.php?id=000001092015181007

  • Pearce, C. (2002) ‘The player with many faces: a conversation with Louis Castle by Celia Pearce’ Game Studies: the International Journal of Computer Game Research, 2(2). http://www.gamestudies.org/0202/pearce/

  • Richretro (2017) ‘Super Nintendo Classic Mini Emulated Audio vs Real Hardware Audio’, Richretro, 4 October 2017. https://richretro.wordpress.com/2017/10/04/super-nintendo-classic-mini-emulated-audio-vs-real-hardware-audio/

  • Rosenthal, D. (2015) ‘Emulation & Virtualization as Preservation Strategies’, Mellon Foundation. https://archive.org/details/Rosenthal-Emulation-2015

  • Scully-Blaker, R. (2014) ‘A Practiced Practice: Speedrunning Through Space With de Certeau and Virilio’, Game Studies: The International Journal of Computer Game Research, 14(1). http://gamestudies.org/1401/articles/scullyblaker

  • U.S. Copyright Office, Library of Congress (2018) ‘Exemption to Prohibition on Circumvention of Copyright Protection Systems for Access Control Technologies’, Federal Register/Vol. 83, No. 208. https://www.govinfo.gov/content/pkg/FR-2018-10-26/pdf/2018-23241.pdf

  • Vowell, Z. (2009) ‘What Constitutes History?’ in H. Lowood (ed.) Before It’s Too Late: A Digital Game Preservation White Paper. American Journal of Play, 2(2) pp. 151-155. http://www.journalofplay.org/sites/www.journalofplay.org/files/pdf-articles/2-2-special-feature-digital-game-preservation-white-paper.pdf

  • Wright, N. (2013) ‘Legend of Zelda: Ocarina of Time Speed Run in 0:26:34 by Cosmo #SGDQ 2013 [iQue]’, SpeedDemosArchive [YouTube channel], 23 October 2013. https://www.youtube.com/watch?v=_0N1lh1csGQ

  • Wright, N. (2014) ‘Zelda Ocarina Of Time Speedrun In 18:10 By Cosmo [WR] [commentated]’, Archive.org (originally uploaded to YouTube). https://archive.org/details/ReuploadZeldaOcarinaOfTimeSpeedrunIn1810ByCosmoWRcommentated

  • Zainzinger, V. (2012) ‘Saving the game: Why preserving video games is illegal’, The Insider, 22 April 2012. http://thenextweb.com/insider/2012/04/22/saving-the-game-why-preserving-video-games-is-illegal/
