Video game designers and players have long been fascinated and inspired by historical stories and figures. As we delve into the history of how video games have used and distorted stories from the past, it becomes apparent that there is a lingering demand for historical education through entertainment, a reminder that people are constantly looking for new ways to immerse themselves in the past, to engage with it and find meaning in it.
Efforts to educate through video games started with computers. In 1973, for example, the Minnesota Educational Computing Consortium, which partnered with the state university system and the Minnesota Department of Education, began using computer technologies to improve student learning. A decade later, Minnesota boasted of having 10,000 computers in its public schools, a ratio of one computer for every 73 students, reportedly the best ratio in the country at the time.
The consortium also led the way in creating computer-based course materials, including its most famous release, “The Oregon Trail.” Originally a text-based game for school use, “The Oregon Trail” was released in 1975 to students and educators throughout Minnesota. The game, which was later released by Apple, Microsoft and others, is a strategy video game in which the player becomes a wagon leader guiding settlers across the frontier in the 1840s. The player must make important decisions along the way, including choosing the best route, when to hunt and how to avoid diseases like dysentery. Designed to encourage skills such as planning, strategy and memory, the game was a success.
Such nostalgia thrived in the 1970s, when many Americans, anticipating the country’s bicentennial amid social, economic and political turmoil, looked to the past in new, compelling ways. As Malgorzata Rymsza-Pawlowska has argued, it was in the 1970s that history “was as much about feeling as thinking, about being in the past rather than looking at it.” Immersive video games helped bring history to life in the 1970s, as did new TV shows from the era, historical reenactments, oral history projects and museum exhibits.
While Atari and its groundbreaking home version of “Pong,” a virtual simulation of a game of table tennis released in 1975, showed the potential market for home video games, commercial sales remained modest over the next decade. Then, in 1985, Japan’s Nintendo released its Nintendo Entertainment System in the United States. That release quickly brought the success of now classic games such as “Super Mario Bros.” (1985) and “The Legend of Zelda” (1987). In 1990, Nintendo accounted for 90 percent of the United States’ $3 billion spending on video games, with one study suggesting that the series’ main character, Mario, had become more recognizable to American children than Mickey Mouse. These commercial successes also meant that educational gaming would take a backseat to entertainment.
As new consoles entered American homes, including 1989’s Sega Genesis, many games hit the market, some of which sought to link historical themes of conquest and empire-building with modern notions of success and work ethic. For example, Nintendo released the military strategy game “Genghis Khan” in 1990. The game allowed up to four players to devise a strategy of conquest on behalf of England, the Byzantine Empire, the Mongol Empire or Japan, taking on challenges along the way. As a 1989 review of the computer version of the game noted, “Conquerors must be calculating, charismatic, and cunning and courageous,” and this video game promised its players such lessons.
Similarly, players of 1991’s “Civilization” (originally on MS-DOS, but subsequently released on various other platforms and consoles) were asked to build and grow an empire over thousands of years, guiding a civilization through military campaigns, urban growth and settlement. Players who became imperialists sometimes faced rival civilizations led by figures they may have read about in their history books, from Alexander the Great to Napoleon Bonaparte. These games rewarded players for their perseverance and determination to conquer and colonize.
By the 1990s, another video game genre was also firmly entrenched: World War II. In the United States and elsewhere, this focus on war further entrenched a nationalistic battlefield memory that emphasized individual struggle and violence. Often this happened through the lens of the first-person shooter, which is typically separated from the broader strategy of warfare. For many of these games, history was more of a backdrop than a source of education. The focus on entertainment was similarly reflected in the original releases of the hit games “Medal of Honor” (1999) and “Call of Duty” (2003) and their many sequels and imitators.
The ability to weave a strong nationalistic narrative into video games is certainly not unique to the United States or limited to the backdrop of World War II. In 2012, the Cuban government released “Gesta Final” to teach Cuban youth about the 1959 Cuban Revolution. This game also uses the first-person shooter format to tell a state-approved story about the origins of the revolution and the successes of its people.
In short, video games reveal a lot about our culture, be it through educational initiatives or political agendas. They’ve also become a way for gamers to engage with important questions about public history and a shared past, albeit one constructed for the game itself. Game designers and programmers often use a generic or fictional museum or heritage site, for example, to give the player an opportunity to learn a particular history that is necessary to advance the character’s storyline. In this way, while primarily a form of escapism, these virtual spaces can function much like a museum in the non-virtual world, where visitors draw on fragments of the past to form ideas about who they are as a people.
Whether fictional or not, incorporating museums and historical and archaeological sites into video games can also tell us something important about changing attitudes toward accessibility and the preservation of history. While museums have long been spaces of reverence and exclusion, their virtual manifestations have offered players a different experience. Numerous video games, including 2019’s “World War Z” about fighting zombies, stage standoffs in museums or cemeteries that can even lead to the complete destruction of those virtual institutions.
Many museums today have taken a page from the successes of video games to better engage their audiences, especially younger people, in interactive and sensory experiences. In 2016, the American Museum of Natural History unveiled “MicroRangers,” an app game that interacts with the museum’s exhibits to help children learn.
Video games may make history more accessible, but there is a downside, as these historical experiences are often presented without a critical or analytical lens or guidance. Still, we may wonder whether the consumption of alternative and imaginary pasts in the virtual world, even accidentally, can serve as a distraction from confronting the horrors and injustices cemented in our history.
However, some data suggests something more optimistic. A 2020 survey found that 93 percent of historical video game players felt inspired to learn more about a particular event or person in history, while 90 percent believed video games had the power to change people’s perspectives on a particular historical event.
As the video game market continues to grow and advance, so does its ability to create and disseminate knowledge of the past. Those who study history would be wise to heed the call.