\section{Related Work}\label{sec:related}
Much of the existing literature on networked video games describes systems with broad applicability that can be implemented in a number of ways. For example, \citet{Das1997NetEffect} describe a general architecture -- NetEffect -- with which multi-user video games or virtual worlds can be created, demonstrated through a particular implementation built for that purpose. The authors describe a number of interesting ideas, such as distributing the virtual world over multiple servers that each handle a subset of that world, peer-to-peer voice chat, load balancing, and other techniques primarily intended to reduce network traffic, especially inter-server traffic. However, it does not appear that NetEffect was used widely, and its emphasis on scalability is not something we found in the earlier networked video games that this paper analyzes. In contrast, we discuss concepts specific to the various games studied and perform a comparative analysis of their implementation, effectiveness, and flaws from a contemporary perspective.
\citet{Blau1992NetworkedEnvironments} follow a similar pattern. They describe the Virtual Environment Realtime Network (VERN) -- an extension of DARPA's SIMNET that enabled physically distant users to interact and communicate in a virtual environment. VERN, like NetEffect, is a framework in which to build virtual worlds, so their work is, once again, different from what we present.
In other literature, the focus is on more specific solutions rather than generalized frameworks. \citet{Singhal1995ExploitingReality} describe an alternative approach to transmitting positional information -- instead of sending real-time position data or using dead-reckoning, timestamped position data is sent periodically, and clients estimate the object's position, velocity, and acceleration from it. They claim this solves a problem exhibited by \textit{SGI Dogfight}, where the frequency of packet transmissions (tied to frame rate) affects the smoothness with which other players are rendered. It would also mitigate the network saturation caused by the game's traffic when enough players were present.
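To illustrate the idea (the notation is ours and simplifies the published scheme), a client holding recent timestamped samples for an object can estimate a velocity $\hat{\mathbf{v}}$ and an acceleration $\hat{\mathbf{a}}$ by finite differences over those samples and render the object at local time $t$ at
\[
\hat{\mathbf{p}}(t) \approx \mathbf{p}_{\mathrm{last}} + \hat{\mathbf{v}}\,(t - t_{\mathrm{last}}) + \frac{1}{2}\,\hat{\mathbf{a}}\,(t - t_{\mathrm{last}})^2,
\]
where $(\mathbf{p}_{\mathrm{last}}, t_{\mathrm{last}})$ is the most recent sample received.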
\citet{YahnW.Bernier2003LatencyOptimization}, a software engineer at Valve Corporation (then Valve Software), presents solutions for latency compensation in the context of an architecture with a single authoritative server and many ``dumb'' clients. In essence, he describes how a user can feel as if they are interacting with other clients in real time, as they would in a peer-to-peer architecture, while retaining the security benefits of a server-orchestrated architecture -- clients cannot deliberately broadcast an invalid state, for instance teleporting their character or changing their score -- and mitigating the downsides of high-latency connections.
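As a rough illustration of the lag-compensation idea in such an architecture (the sketch below is our own; its names and structure are not taken from the cited work), the server can retain a short history of world snapshots and resolve a client's action against the state that client actually saw on screen:
\begin{verbatim}
from bisect import bisect_right

class LagCompensator:
    def __init__(self, history_seconds=1.0, tick_rate=20):
        self.max_snapshots = int(history_seconds * tick_rate)
        self.snapshots = []  # (server_time, {entity_id: position})

    def record(self, server_time, positions):
        # Store a snapshot of every entity's position at this tick.
        self.snapshots.append((server_time, dict(positions)))
        if len(self.snapshots) > self.max_snapshots:
            self.snapshots.pop(0)

    def rewind(self, client_view_time):
        # Latest snapshot at or before the time the client actually saw.
        times = [t for t, _ in self.snapshots]
        i = bisect_right(times, client_view_time) - 1
        return self.snapshots[i][1] if i >= 0 else {}

    def process_shot(self, shooter_id, client_view_time, hit_test):
        # Judge the shot against the rewound world, not the current one,
        # so a high-latency client's aim matches what was on its screen.
        world = self.rewind(client_view_time)
        targets = {e: p for e, p in world.items() if e != shooter_id}
        return hit_test(targets)
\end{verbatim}
Keeping the history short bounds both the server's memory use and how far back in time a high-latency client can be favoured.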
\citet{Das1997NetEffect} and \citet{Blau1992NetworkedEnvironments} both note the use of dead-reckoning, in which clients only broadcast changes in the direction or speed of a controllable game object, which is otherwise assumed to keep moving at the speed and in the direction reported in the last broadcast~\cite{Blau1992NetworkedEnvironments}. This contrasts with earlier video games, which instead transmitted the character's current location and direction every time they changed, an approach that naturally used significantly more bandwidth than dead-reckoning.
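Concretely, under such a scheme (notation ours), a peer that last received an update $(\mathbf{p}_0, \mathbf{v}_0)$ at time $t_0$ simply renders the object at
\[
\hat{\mathbf{p}}(t) = \mathbf{p}_0 + \mathbf{v}_0\,(t - t_0),
\]
and the controlling client broadcasts a new update only when the object's speed or direction changes.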
Overall, the body of literature covers generalizable topics -- either broad systems or specific implementation details that may aid developers when building a networked game.
Other related works discuss the history of video games, either in general or for specific games. For example, \textit{IEEE Annals of the History of Computing} Volume 31, Issue 3 is dedicated to the history of video games and contains articles detailing the history of \textit{Pong} (1972)~\cite{lowood} and \textit{Nimbi} (1963)~\cite{jorgensen}, and how various display technologies were used in video games~\cite{bogost}, among others.
Our work is adjacent to the cross-disciplinary field of archaeogaming, where old video games are treated as archaeological artifacts to be preserved, restored, and studied using archaeological techniques within the context of their time and the people who played them~\cite{aycock}. We lack the archaeological connection, but hope to accurately portray the technical details of the networked video games we discuss.
Works in this field uncover implementation details of retro games as well as the human element behind their creation and enjoyment.\footnote{This description does not encompass all the work involved in archaeogaming. See~\cite{aycock} for a more detailed explanation of archaeogaming.}
For instance, \citet{Aycock2018} reverse-engineer the central maze-generation algorithm behind \textit{Entombed} (1982) for the Atari 2600. By discovering a bug in this code, they were able to show its reuse in other Atari games, and they used this as a vehicle to discuss the programmers behind it.
Perhaps the most important work for our review is \textit{The Lessons of Lucasfilm's Habitat} (2008)~\cite{morningstar}. It is authored by the original creators of \textit{Habitat} (1986) as a retrospective on the game, touching on the technical challenges faced, implementation and architecture details and decision-making, and emergent behaviour observed in the player base. This is an excellent primary source, and it provided much of our information when researching \textit{Habitat}. The paper goes on to advise future developers of virtual worlds, sharing lessons learned, ideas for future improvements, and the authors' philosophy on virtual worlds as a whole.