Audio display concentrates on presenting visually based information in the form of sound. On the topic of audio gaming, Johnny Friberg and Dan Gärdenfors (200-) state: "audio games are computer games that feature complete auditory interfaces, so they can be played without the use of graphics".
A majority of the computer games available are designed solely for sighted users, and they tend to focus on visual rather than auditory cues. They include features such as 3D animation, graphics and sound effects, which make the experience of playing games far more exciting than it was many years ago. However, it is quite challenging to design games that provide the same experiences in audio.
Although there are many different audio games available, they are not as widely available as games for sighted users. It was for this reason that friends Jeremie, Paul and Tim created All inPlay, a website dedicated to providing "the world's first fully accessible online games". All inPlay designs its games specifically for visually impaired users, carefully considering every aspect of each game and ensuring compatibility with the screen readers Window-Eyes and JAWS. They have created audio gaming equivalents of Draw Poker, Texas Hold'em and many more.
AudioGames.net is another website that is dedicated to supplying audio games for visually impaired users. They provide users with a game list which holds a database of games available to download from a variety of other websites, making audio games accessible to all. It has a wide variety of games including Battleship, Blackjack and Sudo-San (Sudoku).
These audio gaming sites were created because it was almost impossible to find a computer game that allows both sighted and visually impaired users to gain an equal experience. This, therefore, formed the grounds for researchers to experiment with ways of providing this service.
There are many different features an audio game must be able to express using only sound. Focusing on puzzle or strategy based games, how would a developer map such games into audio? The initial stage would be to create the interface using speech, i.e. narrations of menus, tasks, results of commands, etc. By doing this, users can begin to establish the game's environment and the primary goal to achieve. It is important that audio games allow some narrations to be interrupted. If there is a need to select options from a list or a menu, the user should be able to quickly and easily interrupt the narrator and select their option. This also helps the user feel more in control of the task at hand.
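As a rough sketch of this interruptible-menu idea, the loop below polls for a key press between spoken phrases, so the narrator can be cut off at any option. The `speak` and `key_pressed` callbacks are placeholders for a real text-to-speech engine and keyboard handler, not an actual audio API:

```python
def narrate_menu(items, speak, key_pressed):
    """Narrate menu items one at a time, checking for interruption
    between phrases. `speak` stands in for a TTS call; `key_pressed`
    returns the user's 1-based selection (or None) when polled."""
    for index, item in enumerate(items, start=1):
        choice = key_pressed()
        if choice is not None:        # user interrupted the narrator
            return choice
        speak(f"Option {index}: {item}")
    return key_pressed()              # wait for a choice after the full list
```

Because the interrupt check sits between phrases rather than at the end of the list, the user never has to wait for the narration to finish before acting.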
A developer must also be able to present non-speech information such as movements, warnings, signals, etc. without confusing the user. This form of presentation is known as sonification, defined as the use of non-speech audio to convey information. It provides a form of interaction that makes users aware of their actions and surroundings within the game. In the case of a crossword puzzle, a buzzer sound may be played to alert the user that they have reached the end of a row or column.
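One minimal way to implement such a cue is a small table mapping game events to non-speech sounds; the event names and file names below are purely illustrative:

```python
# Hypothetical mapping from game events to non-speech sounds.
CUES = {
    "end_of_row": "buzzer.wav",
    "letter_placed": "click.wav",
}

def cue_for(grid_width, column):
    """Choose the cue to play after the cursor lands on `column`
    (0-indexed) in a crossword row that is `grid_width` cells wide."""
    if column == grid_width - 1:
        return CUES["end_of_row"]      # boundary warning, as in the buzzer example
    return CUES["letter_placed"]       # ordinary movement feedback
```

Keeping the mapping in one table makes the cue vocabulary easy to review for consistency, which matters when sound is the only channel of feedback.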
Human Processing: Perception & Understanding
In contrast to earlier times, the amount of information that humans are expected to remember has grown enormously. From telephone numbers and passwords to functions and procedures, the world is progressing every day and, as a result, our basic cognitive processes have become far more complex than before. Regardless of its significance, this makes it very difficult to recall certain pieces of information. For visually impaired users, however, the human processing system has a much greater task at hand, especially in activities where sighted users rely on vision to reduce the memory load. This section will therefore present how the human processing model operates, taking visual perception, and the lack of it, into deep consideration.
Human memory has three main components: the sensory buffers, short-term memory and long-term memory. Briefly, information travels between them as follows: the sensory buffers pass information to short-term memory when we pay attention to it, and this information is then either rehearsed or simply forgotten. When rehearsed, the information passes into long-term memory, where it is stored permanently and can be recalled into short-term memory when required for use.
The sensory memory acts as a buffer for stimuli received through the senses. There is a separate type of memory for each sensory channel: iconic memory for visual stimuli, echoic memory for aural stimuli and haptic memory for touch. The sensory buffer is part of the perceptual processor; it stores information for less than a second to allow the brain to develop a perception.
This information is only passed into short-term memory by paying attention to it, which allows the sensory buffers to filter out the useful data and make space for new information. This is the reason why we hear words or sentences as opposed to a sequence of sounds: our brain makes sense of what we hear by assembling a meaningful sound before passing it to the next stage.
Short-term memory, or working memory, is used to store information that may be needed for further processing. Information can be stored for about 18-20 seconds (Peterson & Peterson, 1959). It is often processed faster verbally than visually, but both channels allow you to rehearse the data, which is then passed on to the final stage to be stored permanently.
Long-term memory has an effectively unlimited capacity and is intended to store information more permanently; it requires information to be rehearsed. Tulving (1985) suggested a useful distinction between three elements of long-term memory: semantic memory, which stores concepts and ideas; episodic memory (sometimes referred to as "autobiographical" or "narrative" memory), which stores memories of events; and procedural memory, which stores skills and "know-how" rather than "know-that" knowledge. Whilst the iconic memory used by the sensory buffer is not available to visually impaired people, these three components are put to full use, helping them develop a better approach to remembering information.
An obvious yet intelligent observation on memory, with respect to visually impaired people, was made by Mr Z (2009), who wrote: "people with congenital sensory defects remember things belonging to those senses. It would seem that people blind from birth do not remember in visions, nor dream in visions, at least not the way the rest of us [referring to sighted people] do. They see only their internal visualization of sensory data, not what we [again, referring to a sighted person] would see".
Interacting Cognitive Subsystems
Philip Barnard (2008), a researcher at the MRC Cognition and Brain Sciences Unit, wrote an article explaining Interacting Cognitive Subsystems (ICS). It describes a macro-theory aimed at "understanding how all the different components of the mental mechanism are configured, what representations those components use and the overall dynamics of their interactions in real time". The diagram below illustrates a 'nine subsystem mental architecture', which shows how the information we take in flows through our system and is output as physical representations.
There are different subsystems within this model:
Three Sensory Subsystems:
- Visual (VIS): subjectively, what we 'see in the world' as patterns of shapes and colour
- Acoustic (AC): subjectively, what we 'hear in the world'
- Body State (BS): subjectively, bodily sensations of pressure, pain and positions of the body, as well as tastes and smells, etc.
Two Effector Subsystems:
- Skeletal Action (Limb): subjectively, 'mental physical movement'
- Speech Output (Articulatory): subjectively, our experience of sub-vocal speech output
Four Central Subsystems:
- Morphonolexical (MPL): subjectively, what we hear in our heads, 'our mental voice'
- Object (OBJ): subjectively, our 'visual imagery'
- Propositional (PROP): subjectively, specific semantic relationships, 'knowing that'
- Implicational (IMP): subjectively, 'senses' of knowing (e.g. 'familiarity' or 'causal relatedness' of ideas), or of affect (e.g. apprehension, desire).
With regard to visual and auditory imagery, examples will be used to make sense of Figure 2 and of how the subsystems correspond with one another.
Visual information enters the ICS via the visual subsystem; it is then passed through OBJ, which converts it into an image in the mind. This image is passed through PROP, which makes sense of it by applying some sort of meaning, for example confirming that the initial input was a chair. The output may then be either speech or movement.
In parallel, sound arrives in the ICS via the acoustic subsystem and is passed to MPL, which transforms it into Morphonolexical code, a more abstract structural description of the underlying speech information. This is then passed through to IMP, which creates the sense of knowing what the information may be and activates the mind to remember. This adds meaning that supports and confirms what you heard, and again it can be output through either of the effector subsystems.
ICS techniques can be used to analyse how we process information when playing games. According to a study by the computer scientist Chris Johnson (1999), "It has been widely advocated as a constructive tool for interface development". Johnson concentrated on the ICS model mainly because it considers both the psychological and cognitive state of users. When playing a game, the user must be kept occupied and engaged. When designing audio games, developers must therefore understand how to exploit the tools used to engage users through sound, focusing on how humans process sound data.
Use of Sound to Represent Information
According to a Sonification Report prepared for the National Science Foundation, "Sonification is defined as the use of non-speech audio to convey information. More specifically, sonification is the transformation of data relations into perceived relations in an acoustic signal for the purposes of facilitating communication or interpretation." The need to represent data through sound has long been pressing, and as technology advances these needs are being met in ways that were never imagined. The best-known success story of sonification is in space physics, regarding the Voyager 2 spacecraft: it was confirmed to be travelling through a micrometeoroid field by using sound to detect micrometeoroids, which could only be heard and not seen (Kramer, et al. 1999). Since sonification has been used for a study as immense as space physics, the possibilities for utilising sonification in gaming are endless.
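As a toy illustration of "transforming data relations into perceived relations", the function below maps a data series linearly onto pitch, so that larger values sound higher. The frequency range is an arbitrary assumption; a real auditory display would synthesise each frequency as a short tone:

```python
def sonify(values, low_hz=220.0, high_hz=880.0):
    """Map data values onto pitches: the smallest value becomes low_hz,
    the largest high_hz, and everything else falls linearly in between."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0                        # avoid division by zero for flat data
    return [low_hz + (v - lo) / span * (high_hz - low_hz) for v in values]
```

The key property is that relations in the data (larger, smaller, equal) survive as perceivable relations in pitch, which is exactly what the NSF definition requires.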
Sonification - Perceiving Visual and Audio Information
Whilst vision is considered the most important of our senses, the one through which we obtain most of our environmental information, hearing is assumed to play a minor role. Using visual information we are able to clearly identify our surroundings from a direct viewpoint. When we look at an object we can only see the face of it (Figure 3), and for that reason audio has an advantage, as it allows us to perceive audible information from objects that are outside our viewing cone. Our hearing cone covers a wider range; hence, combined with visual information, it allows us to draw a complete picture of our local environment and to interact with it accordingly.
The only problem we face with hearing is what is known as the cone of confusion. This is where a listener has difficulty determining where a sound came from. Sound cues may come from above, below, in front or behind, and because the cues along this cone are virtually identical, it is almost impossible to determine where they came from; hence the name.
Sonification Techniques - Auditory Displays
Auditory displays are systems in which the user makes sense of the data they hear, for example data under analysis or data that represents system states. They support the user in understanding any information that is represented as sound output (Kramer, 1994).
The figure above depicts Thomas Hermann's research on auditory displays. He highlights the stages of an auditory display, including the information pre-processing system (A), the task and the techniques for data processing and computation (B), the sonification engine (C), sound signal amplification and the effectors (speakers, headphones) (D), and the anticipated user.
In basic terminology, auditory icons are visual icons rendered in audio: they play sounds that are related to the icon they identify. For example, a visual representation of deleting a file would be a red cross, and an auditory equivalent of this icon would simply be the sound of paper being scrunched up. This technique employs sounds that have a direct connection with the icon and the function or process of the item.
Focusing on visually impaired users and gaming, since the keyboard is the main form of interaction, auditory icons are triggered using keys on the keyboard. In most cases a corresponding key is used to perform the required action, for instance 's' to save the game, 'p' to pause, 'e' to exit and the arrow keys to manoeuvre.
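A simple way to structure this is a key-binding table pairing each action with the auditory icon that confirms it. The bindings and sound file names below are hypothetical, chosen only to illustrate the pattern:

```python
# Hypothetical key-to-action table for an audio game; each action is
# paired with the auditory icon played to confirm it.
KEY_BINDINGS = {
    "s": ("save",  "camera_shutter.wav"),
    "p": ("pause", "clock_tick.wav"),
    "e": ("exit",  "door_close.wav"),
}

def handle_key(key):
    """Return the (action, confirming sound) pair, or None for unbound keys."""
    return KEY_BINDINGS.get(key.lower())
```

Pairing every action with a confirming sound in the same table helps ensure no command is silent, so the player always receives feedback for a key press.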
Earcons depict information in a different form; they are described as a hierarchy of sounds. Lamont Wood refers to an important use of earcons by stating:
"a pilot's headphones will have a 3-D sound system so that the voice of another crew member will seem to come from that person, to avoid confusion. Audio "earcons" will signal certain events, a sucking sound will warn of fuel exhaustion, for instance. Speech input will be used to control non-essential systems" .
Another more contemporary and basic example was mentioned in an article about Bill Buxton's work in the journal of Human-Computer Interaction on the use of non-speech audio in the computer's output. It states that:
"Small but meaningful sounds have become known as 'earcons,' by a punning analogy with 'icons.' To those who doubt their importance, Buxton says: 'all you have to do is play a computer game like PacMan. Turn the sound off and your score will go down. It clearly contains information that is valuable'".
Earcons are sounds within an interface that have a symbolic mapping between the sounds and the data they represent. Walker states that the relationship between the earcon and the action is at most metaphorical. For example, a three-note pattern with a decrease in loudness and pitch might represent a file being deleted, and the reverse pattern a file being opened.
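The deletion earcon described above might be sketched as a falling three-note pattern, with its mirror image for opening a file. The frequencies (Hz) and loudness values (0-1) here are illustrative, not drawn from any real system:

```python
# A three-note earcon: (pitch_hz, loudness) falls for "file deleted".
DELETE_EARCON = [(660, 0.9), (550, 0.6), (440, 0.3)]
# The reverse (rising) pattern signals "file opened".
OPEN_EARCON = list(reversed(DELETE_EARCON))

def is_falling(earcon):
    """True when both pitch and loudness decrease across the pattern."""
    return all(a > b for a, b in zip(earcon, earcon[1:]))
```

Because the mapping is symbolic rather than literal, the two events share one motif and differ only in direction, which is what makes earcon families easy to learn as a set.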
Spearcons are another type of sound cue. They are created by speeding up a spoken phrase until it is no longer recognised as speech. Spearcons have been shown to lead to faster and more accurate navigation; they also lead to better performance and are more flexible (Walker, Nance, & Lindsay, 2006).
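Since a spearcon is simply time-compressed speech, its length is a fixed fraction of the original utterance. The sketch below only estimates that length; a real system compresses actual synthesised audio, and both the speaking rate and the compression ratio here are assumptions for illustration:

```python
def spearcon_duration(phrase, words_per_second=2.5, compression=0.4):
    """Estimate a spearcon's playback length in seconds: the original
    spoken duration of `phrase`, time-compressed to `compression`
    (40% of the original here). Rates are illustrative assumptions."""
    spoken_seconds = len(phrase.split()) / words_per_second
    return spoken_seconds * compression
```

The point of the calculation is that longer menu items still yield distinct, proportionally longer spearcons, which is part of why they remain distinguishable even when no longer intelligible as words.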
Sonification and Gaming
In terms of gaming and sonification, the techniques used need to provide the listener with enough information to allow them to interact and navigate comfortably. Many applications have been developed that substitute visual knowledge with auditory descriptions. Examples include audio books and radio plays, audio-only games, sonification techniques for scientific data sets, and assistive auditory displays for visually impaired people.
According to an International Conference on Auditory Display (ICAD) meeting, there are three basic auditory elements of which every audible sequence in our environment is composed: speech, music, and natural or artificial sounds. Speech is the verbal communication of information, e.g. narration. Music includes tones and harmonic compositions, often used to trigger emotion; it is often played together with speech to present information. Natural and artificial sounds cover a wide range of audio signals; they depict a physical object or process, for example driving a car, a knock on the door, or the wind blowing. These are just a few of the many sounds that people tend to take no notice of; however, when designing applications such as games for visually impaired people, it is essential to capture each sound to such a degree that it allows the user to render the complete picture. This is one of the most important features of sonification, as it supports the user's ability to control the game.
Audio Computer Games - Game Accessibility
"Accessible games must still be games" 
There are many audio versions of computer games that everyone is aware of, e.g. Connect Four, Battleship, even Formula 1 racing games. As time passes, technology advances and allows game developers to make many accessible games that cater to all. The games industry has shown impressive development over the last few years. An article on audio games stated that in 2002, the cost of developing a game could vary between 300,000 Euros (approximately £262,144.78) and 30 million for the biggest productions, involving hundreds of employees. Since 2002, gamers have been waiting for ever more impressive games, and looking at 2009 and the years to come, it seems the possibilities are endless.
Visually impaired players should take full advantage of the continuously progressing gaming facilities available and begin playing audio games as early as possible, not least because playing can benefit visually impaired people considerably in their psycho-motor and cognitive development.
Designing any game for visually impaired players is quite a challenge, especially since the main feedback in games is usually visual. Even where audio is used within mainstream games, it plays only a complementary role in almost every case. It certainly enhances the player's experience, but it rarely conveys any information that the player cannot get visually; hence most games can be played efficiently with the sound switched off.
This may be the very reason why audio computer games are not as accessible; in fact, before 2000, the only accessible games that could be found were developed by the Swedish Library of Talking Books.
During the 1980s, the rise of personal computers made computer access much easier for visually impaired users. With tools such as speech synthesis and Braille displays, visually impaired users were able to render information using alternative senses, mainly audio and speech.
In order for the user to navigate the computer, specific software applications such as screen readers need to be used; these read out the information on screen and represent it in an appropriate way using speech or Braille, or both. The screen reader seemed like one of the best inventions at the time of console operating systems (text-only screens); however, its flaws became apparent with the development of graphical user interfaces. Screen readers were soon updated so that they can now operate with multiple windows, accessing many different applications very efficiently. An article about audio games and visually impaired people highlighted that "the improvements of screen technology during the 90's has allowed the development of enlargement software (screen magnifiers), which can enlarge text or pictures up to 32 times, which is necessary for partially sighted people".
Furthermore, a number of devices that use the tangible or haptic modality can be employed together with specific software such as screen readers. An example of a tangible device is a tactile board, a device that allows overlays to be inserted on top of it; when the overlay is pressed, the location is transmitted from the device. This helps young children adapt to computer games. The flaw of this device is that the tactile overlays are static and not especially user friendly, as it can be difficult to insert them into the device.
Another more recent tangible device, known as the tactile graphical refreshable display, uses pins (similar to those used for Braille displays) to represent graphical surfaces with 16x16 pins or more. This device is quite expensive and is currently only being used in labs, but according to one article it may be introduced to the market within the next decade.
Haptic devices include game pads and vibrotactile joysticks. A specific example is the SensAble Phantom. ... RESEARCH OWN MATERIAL ...
No matter how well access software such as screen readers has been developed, it is useless when it comes to actually accessing the software's content. The information screen readers render is mainly textual; hence, if graphical symbols or actions do not have a text-based alternative, they remain unreadable. Likewise, if information is not exposed by an application, for example when a dialogue box appears unexpectedly, using a screen reader is difficult if not, in some cases, impossible.
Accessibility frameworks have been developed to make applications more accessible and are available in the main environments such as Microsoft, Mac, Linux, etc. As described in research study on audio games, "Microsoft has developed Microsoft Active Accessibility, which is a COM-based technology that provides assistive technology tools with access to the relevant information... Then, to make their applications accessible, application developers have to implement the IAccessible interface" .
Apple has developed a similar framework for the Mac, as have Linux desktop environments, e.g. the GNOME accessibility project.
Why Are Games So Different?
In terms of making things accessible, such solutions work satisfactorily for desktop applications, but not for computer games. Within this context, the notion of "working satisfactorily" is both not good enough and hard to define.
It is not as easy to measure the success of a game as it is for a desktop application. For instance, it is easy to measure the time needed to write a document using word processing software. Likewise, we can observe a person playing a game and check how long it takes them to complete a level, or whatever measure is relevant for that particular game; however, this is simply not enough. Unlike software such as word processors, games need to give the player some sort of good feeling both while and after playing. There are definite emotional factors that need to be taken into account; with desktop applications these factors matter only insofar as they affect productivity, but in games they are central.
The nature of these feelings is not easy to describe. According to the Institute of Integrated Study, University of Linz, feeling "is a large field of research", and studies describe it through two main concepts: the concept of presence and the state of flow. Both describe a state where the player feels totally immersed in the game universe. Presence is the ability of the game to provide the illusion for the player that he or she is in the virtual environment. The sensation of presence can be evaluated according to the goals of the game, and the efficiency of the interaction is also an essential point.
The second concept, the state of flow, is quite similar to the concept of presence, but it is directed more towards a "sensation of intense pleasure rather than illusion to be elsewhere". It is better defined as a state of concentration, deep enjoyment and total absorption in an activity. Flow is the result of the balance between two psychological states, anxiety and boredom, which are produced by the interplay of two aspects of gaming: the challenge of a task versus the abilities of the player (see Figure 6).
Linking back to the statement "Accessible games must still be games", visually impaired people face certain usability constraints when using the same software as their sighted counterparts to do the same work. Taking the example of word processing software, even something as simple as checking whether a word is in bold can become quite complex. Things that seem immediate by sight in any WYSIWYG word processor are taken for granted, and can be overlooked when considering applications for visually impaired people. There are software solutions for problems like these, including LaTeX, a document mark-up language and document preparation system. It allows users, particularly visually impaired users, to stop worrying about the appearance of their documents (e.g. bold words, underlining, indenting, etc.) and to concentrate instead on the content.
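For instance, a LaTeX source file describes structure and emphasis rather than visual appearance, so a screen reader can announce the author's intent directly from the markup:

```latex
\documentclass{article}
\begin{document}

\section{Results}   % a section heading, not a font size or weight
Participants found \emph{audio cues} helpful.  % emphasis, not "italics"

\end{document}
```

The author never checks what the text looks like; the commands themselves carry the meaning that a sighted user would otherwise verify by eye.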
In terms of games, it is not enough simply to find a solution that makes the information within the interface accessible; the game has to be made interesting, engaging and as usable as the original game created for sighted players.
This section will focus mainly on games that are specifically designed for visually impaired people. These games are usually funded by non-profit organisations and foundations. Although they are stimulating and well designed for visually impaired people, they are not as popular with the mainstream audience.
There are three main types of audio games: mainstream video rhythm games such as Guitar Hero, artistic musical experimentation games, and those which depend on audio only, without the need for a visual display. Although games of this third type may still include a visual display, it is mainly there for the developers, whose main sensory channel is vision, to make sense of the game they are designing. Since the main modality is audio, such games are fully accessible to visually impaired players.
Research shows that "in 10 years, over 400 audio games have been developed... and development teams contain generally only between one and four persons". This is just a small fraction of the number of computer games that have been developed, by much larger teams. There is no doubt that the market for audio games is not as large as the market for computer games, but this is mainly because they are funded by foundations, with little support from the developers of games designed for sighted people. With enough financial backing from successful developers such as Capcom or Nintendo, visually impaired people could get the games they deserve: not merely satisfactory games, but games designed to give a true audio equivalent in every possible way. Despite such shocking statistics, there has been a positive rise in the production rate over the past few years. Thanks to sites such as Audiogames.net, the community of players and developers has strengthened, which suggests that this upcoming medium is in "a very promising phase of expansion".
Categorisation of Audio Games
In the same way that computer games can be categorised, audio games have different categories which are very much the same. According to Stéphane Natkin (2006), Professor and Chair in Multimedia Systems, in order to categorise audio games we can use the categorisation of computer games as a theoretical basis. Natkin distinguishes four separate types of games: action games, adventure games, puzzle games and strategy games. Each category has its own dedicated sub-section below, explaining it in more detail.
The categorisation of games proposed by Thomas Gaudy (2006) is similar, containing the same four categories; however, they are not exactly the same, being grouped according to the way they are perceived by visually impaired players. Figure 7 below illustrates Gaudy's view of audio game categorisation, as presented in the Proceedings of the Handicap 2006 Conference.
There is no doubt that audio games have not had the same historical success as video/computer games; they have not had the same evolution. Audio games owe their existence to video games but also to textual games. In the absence of sight, the next best thing to perceiving information visually is through text. Management and simulation based games, which contain complex rules of game play, are far more complex and hence tend to be more textual than audio. Action and exploration games, which have much simpler rules, involve the use of non-verbal sounds; these games can be considered a mixture of levels of interaction, using both action and exploration techniques. The rules of puzzle based games can be quite simple; however, because a 2D board cannot be represented through text or audio in a straightforward way, the game play can come across as much more complex than it really is.
Natkin (2006) describes action games as games whose main focus is the dexterity of the player. Typical action games such as Doom, Quake and Halo, which take place in an abstract universe, are the type of games that emphasise physical challenges, including hand-eye co-ordination and reaction time. However, other games such as Dance Dance Revolution (DDR) are not as obviously recognised as part of the same category, because theirs is a softer physical challenge. DDR involves a visual display and a mat which the player uses as part of the challenge. On the mat are four arrows corresponding to what is shown on the screen, and the player must move their feet on these arrows to copy the dance steps shown. Playing these games requires good synchronisation with the tempo of the in-game music. The visual display allows the player to anticipate the time of interaction, and the tempo can go from slow to fast, making the game quite physically challenging. This is not an accessible game, because the action expected of the player is conveyed visually; the game depends on both audio and visual perception.
Games like DDR could be made accessible by reconsidering some aspects of the original game. Rather than a dance game, it could easily be changed into an ear-hand co-ordination game that stimulates the player's memory. Instead of a mat for moving the feet, the player could use a hand pad with four arrows corresponding to certain noises or tones heard in the game, pressing the arrows to copy the sounds and reproduce the music they hear.
There are also audio action games, where the success of the player depends on interaction based on precise timing. An example is Tampokme, whose principle is much simpler than DDR's. Time is still essential, but there is no possibility of anticipating it. The game tests sound recognition, reflexes and inhibition: the player has to identify various kinds of audio signals and react accordingly within a fixed time. The state of flow is not as intense as in DDR, but the potential for improvement is high, and the game is fully accessible to visually impaired people as well as to people with mobility impairments, as it can be played with a single button.
Adventure Games/Exploration Audio Games
These are the types of game that are mostly plot driven: a story in which the player is the protagonist. They involve three essential features: an interesting scenario, the ability to explore new realms or worlds, and activities such as solving riddles or achieving goals. A famous example is the Zelda series. The game play consists of a mixture of action, adventure, puzzle solving and role playing; it involves an avatar of the player in a virtual environment in which the main goal is to rescue Princess Zelda.
There are also audio adventure games with scenarios similar to those of video/computer games, but their features are not as advanced, especially puzzle solving. The main focus of these games is exploring a new geographic environment; hence, in the case of audio games, they are categorised as "exploration games". Descent into Madness is a prime example of an exploration audio game: the player's avatar wakes to find that they have been kidnapped and are being experimented on by scientists. The player's goal is to solve puzzles and piece together clues presented through voice recordings left behind by the scientists. The game contains over 300 sound files. This is a very clever use of speech audio, as it allows the player to engage in a quest whilst fully utilising the features of a conventional adventure game, making it easy and pleasant to handle.
Strategy Games/Management-Simulated Audio Games
Strategy games are games whose outcome is determined by the player's decision-making skills. The crucial factor that sets them apart is that all players have an equal degree of knowledge of the components of the game: the test is of the player's mental skill, in contrast with games of luck, where the outcome relies on probability. The word "strategy", adopted from military jargon, refers to using tactics to plan at a high level. Building an Army is a purely tactical game in which you must deploy troops and other armaments carefully to maximise the effectiveness of your attack .
There are not many strategic audio games, simply because strategy is very difficult to simulate without vision. 1000 AD offers an interesting approach to a strategy-based game, though a rather detached one. The gameplay involves training troops, constructing defences and buildings, feeding your people and running your country wisely . The aspects this game highlights most are resource management and simulation.
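A resource-management loop of the kind 1000 AD revolves around can be sketched as below. The resource names, rates and costs are invented for this illustration; they are not taken from the actual game.

```python
# Illustrative sketch of a per-turn resource-management update, in the spirit
# of a game like 1000 AD. All quantities and rules here are assumptions.

def advance_turn(state):
    """Advance the country by one turn: harvest, feed the people, then train."""
    s = dict(state)
    s["food"] += s["farmers"] * 2          # each farmer grows 2 food
    s["food"] -= s["people"]               # each citizen eats 1 food
    if s["food"] < 0:                      # famine: population shrinks
        s["people"] = max(0, s["people"] + s["food"])
        s["food"] = 0
    # Training is limited by both treasury and available population.
    recruits = min(s["gold"] // 10, s["people"] // 4)
    s["troops"] += recruits                # training costs 10 gold per troop
    s["gold"] -= recruits * 10
    return s
```

The point of the sketch is that every decision (how many farmers, how much gold to spend) feeds back into the next turn's state, which is what makes such games about management rather than reflexes.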
Puzzle Games/Audio Puzzle Games
The last of the four categories covers games that put the emphasis on solving puzzles. A famous example is brain-training (IQ) games, which test the player's ability to solve problems using logic, pattern recognition, strategy, word completion and sequence solving. These games involve a variety of logical and conceptual challenges, occasionally adding time pressure or other action elements . All puzzle games exercise the brain and help keep it sharp and in shape.
There are many audio puzzle games that are quite similar to their video/computer counterparts. The game Battle Chess is an adaptation of chess into a video game; likewise, K-Chess Advance is an adaptation of chess, but focusing on audio. In such cases the state of flow does not rely on reactivity to audio or visual signals; instead it relies on an entertaining principle that "stood the test of time before the advent of computers" . These games are a particularly interesting transformation from visual to audio, because most of them are entirely two dimensional, which is not as easy to render in sound.
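One part of that visual-to-audio transformation is rendering 2D board positions as speech. The sketch below shows one plausible way to turn a chess move into a sentence for text-to-speech; the phrasing is an assumption for illustration, not K-Chess Advance's actual output.

```python
# Illustrative sketch: turning 2D chess moves into speech-friendly text,
# the kind of conversion an audio chess game must make. The wording here
# is invented for the example.

PIECE_NAMES = {"K": "king", "Q": "queen", "R": "rook",
               "B": "bishop", "N": "knight", "P": "pawn"}

def speak_square(square):
    """Spell out a square so each coordinate is heard distinctly: 'e4' -> 'e 4'."""
    file, rank = square[0], square[1]
    return f"{file} {rank}"

def speak_move(piece, src, dst, capture=False):
    """Render a move as a sentence suitable for a text-to-speech engine."""
    verb = "takes on" if capture else "to"
    return f"{PIECE_NAMES[piece]} {speak_square(src)} {verb} {speak_square(dst)}"
```

For instance, `speak_move("N", "g1", "f3")` produces "knight g 1 to f 3": the whole 2D state never has to be shown, only serialised into short utterances the player can hold in memory.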
To be more specific, puzzle games such as crosswords and word searches are particularly hard to create in audio, yet quite a variety of such games is still available. Two-dimensional board games fall into the same category: they are difficult to reproduce in an audio representation because of the amount of information the player needs to perceive and also to remember. This section therefore focuses on the conversion of 2-dimensional games to audio and how it is achieved.
Three-dimensional games are undoubtedly not simple to create either, yet it is still easier to represent particular information in them, such as sound effects for movements, actions, reactions, etc., beginning with a simple 3D game such as
If you imagine a 2D game, for example a word search, a common technique is to scan the grid from beginning to end over and over, restarting each time a new word is sought. This is because the grid's layout is held only in short-term memory, as it is not necessary to store it for more than a few seconds, which makes it difficult to remember where the last search ended when looking for the next word. The same applies to almost all puzzle-based games.
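The restart-from-the-beginning scan described above can be sketched directly in code. This is an illustration of the search pattern only (restricted to left-to-right horizontal matches to keep it short), not an implementation of any particular word-search game.

```python
# Sketch of the scan pattern described above: for every word sought, the
# whole grid is traversed again from the top-left, mirroring how a player
# holding the grid only in short-term memory must search.

def find_word(grid, word):
    """Return (row, col) of the first left-to-right occurrence, or None."""
    for r, row in enumerate(grid):      # always restart at the first row
        line = "".join(row)
        col = line.find(word)
        if col != -1:
            return (r, col)
    return None

def solve(grid, words):
    """Find each word with an independent full scan of the grid."""
    return {w: find_word(grid, w) for w in words}
```

Note that `solve` never reuses information between searches: each word triggers a complete re-scan, just as the player does.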
Friberg, J. and Gärdenfors, D., 200- (decade certain but not year). Audio Games: New Perspectives on Game Audio. [Accessed 11th November 2009].
Röber, N. and Masuch, M., 2002. Auditory Game Authoring: From Virtual Worlds to Auditory Environments. [On-line] Available at: http://isgwww.cs.uni-magdeburg.de/~nroeber/data/Roeber_2004_AGA.pdf [Accessed 11th November 2009].
Jeremie & Paul., 1996. All inPlay. [On-line] Available at: http://allinplay.com/howitworks.html [Accessed 11th November 2009].
International Community for Auditory Display Editorial Committee, et al., 1997. Sonification Report: Status of the Field and Research Agenda http://www.icad.org/websiteV2.0/References/nsf.html [Section 2, Introduction] [Accessed 11th November 2009].
Figure1: [On-line] Available at: http://www.s-cool.co.uk/alevel/psychology/human-memory/models-of-memory.html [Accessed 11th November 2009]
http://www.cc.gatech.edu/classes/cs6751_97_winter/Topics/human-cap/memory.html [On-line] [Accessed 16th November 2009].
Atherton, J. S., 2009. Learning and Teaching; Memory. [On-line] UK: Available at: http://www.learningandteaching.info/learning/memory.htm [Accessed 16th November 2009].
Tulving, E., 1985. "How many memory systems are there?" American Psychologist, vol. 40, pp 385-398. [Accessed 16th November 2009].
Mr Z., 2009. "Is there a limit to human memory?" [On-line] Available at: http://scienceblogs.com/thescian/2009/02/is_there_a_limit_to_human_memo.php [Accessed 16th November 2009].
Philip Barnard., 2008. "Interacting Cognitive Subsystems" [On-line] Available at: http://www.mrc-cbu.cam.ac.uk/research/emotion/cemhp/phil.barnard/ [Accessed: 21st November 2009].
Figure2: https://intranet.dcs.qmul.ac.uk/courses/coursenotes/DCS224/ics/week_10.pdf [Accessed 21st November 2009].
Stuart Booth, Thore Schmidt-Tjarksen "Psychological Theory in Haptic Interface Design: Initial Steps Towards an Interacting Cognitive Subsystems (ICS) Approach" [On-line] Available at: http://www.eurohaptics.vision.ee.ethz.ch/2001/booth.pdf (page 3 of 6) [Accessed 22nd December 2009].
Chris Johnson., 1999. "Using Cognitive Models to Transfer the Strengths of Computer Games into Human Computer Interfaces" [On-line] Available at: http://www.dcs.gla.ac.uk/~johnson/papers/um99/games.pdf [Accessed 22nd November 2009].
Kramer, G., Walker, B., Bonebright, T., Cook, P., Flowers, J., Miner, N.; Neuhoff, J., Bargar, R., Barrass, S., Berger, J., Evreinov, G., Fitch, W., Gröhn, M., Handel, S., Kaper, H., Levkowitz, H., Lodha, S., Shinn-Cunningham, B., Simoni, M., Tipei, S., 1999. The sonification report: Status of the field and research agenda, Report prepared for the National Science Foundation by members of the International Community for Auditory Display. Santa Fe, NM: ICAD. [On-line] Available at:http://www.icad.org/websiteV2.0/References/nsf.html [Accessed 30th December 2009].
Figure3: http://en.wikipedia.org/wiki/Viewing_cone [Accessed 31st December 2009]
Figure4: Stephen Perrett PhD., 2009. Human Auditory Localisation. [On-line] Available at: http://www.thegong.com.au/auditorylocalization/ [Accessed 31st December 2009].
Kramer, G., 1994. An introduction to auditory display. In G. Kramer (Ed.), Auditory Display: Sonification, Audification, and Auditory Interfaces. Reading, MA: Addison-Wesley. [Accessed 31st December 2009].
Figure5: [On-line] Available at: http://sonification.de/main-ad.shtml [Accessed 31st December 2009].
S. Camille Peres, Virginia Best, Derek Brock, Barbara Shinn-Cunningham, Christopher Frauenberger, Thomas Hermann, John G. Neuhoff, Louise Valgerur Nickerson, Tony Stockman. "Auditory Interfaces", chapter 5, pp 147-195. [Accessed 31st December 2009].
Anon., 2002. Sonification, Music and Emotion. [On-line] Available at: http://www.cs.uiowa.edu/~kearney/22c296Fall02/CrittendonSpecialty.pdf [Accessed 11th November 2009].
L. Wood. "Fashion takes high-tech turn," The Chicago Tribune, March 4, 2002.
T. Durham. "The human touch - Professor Bill Buxton's hopes for improvements in the man-machine interface," The Guardian, August 4, 1988. [Accessed 31st December 2009]
Walker, B. N. and Kramer, G., 2004. "Ecological psychoacoustics and auditory displays: Hearing, grouping and meaning making." [Accessed 31st December 2009].
http://www.audiobooks.com. Website, 2003. [Accessed 31st December 2009].
http://www.audiogames.net. Website, 2003. [Accessed 31st December 2009].
S. D. Speeth. Seismometer Sounds. In Journal of the Acoustical Society of America, volume 33, pages 909-916, 1961. [Accessed 31st December 2009].
Robert W. Massof., 2003. Auditory assistive Devices for the Blind. In ICAD, pages 271-275. [Accessed 31st December 2009].
C. Frauenberger and M. Noisternig., 2003. 3D Audio Interfaces for the Blind. In ICAD. [Accessed 31st December 2009].
N. Röber and M. Masuch., 2004. Interacting with Sound: An Interaction Paradigm for Virtual Auditory Worlds. [On-line] Available at: http://www.icad.org/websiteV2.0/Conferences/ICAD2004/papers/roeber_masuch.pdf [Accessed 31st December 2009]
Dominique Archambault, et al,. 2007. Computer Games and Visually Impaired People. [Accessed 5th January 2010]
Natkin, S., 2006. Video games and Interactive Media: A Glimpse at New Digital Entertainment. AK Peters. [Accessed 5th January 2010].
Hildén A. and Svensson, H., 2002. Can All Young Disabled Children Play at the Computer. In Miesenberger, K., Klaus,J., and Zagler, W., editors, Proc. ICCHP 2002 (International Conference on Computers Helping People with Special Needs), volume 2398 of LNCS, Linz, Austria. Springer. [Accessed 5th January 2010].
Witmer, B. G. and Singer, M. J., 1998. Measuring presence in virtual environments: A presence questionnaire. Presence, 7(3):225-240. [Accessed 5th January 2010].
Gaggioli, A., Bassi, M., and Fave, A. D., 2003. Quality of experience in virtual environments. In Riva, G., Davide, F., and Ijsselsteijn, W., editors, Being There: Concepts, Effects and Measurments of User Presence in Synthetic Environments, chapter 8, pages 121-136. [Accessed 5th January 2010].
Retaux, X., 2003. Presence in the environment: theories, methodologies and applications to video games. PsychNology Journal. [Accessed 5th January 2010].
Johnson, D. and Wiles, J., 2003. Effective affective user interface design in games. Ergonomics, 46(14):1332-1345. [Accessed 5th January 2010].
Jenova, C., 2001. Flow in games. MFA Thesis, University of Southern California. [On-line] Available at: http://www.jenovachen.com/flowingames/ [Accessed 5th January 2010].
Gaudy, T., Natkin, S., and Archambault, D., 2006. Classification des jeux sonores selon leur type de jouabilité [Classification of audio games by type of playability]. In Proceedings of Handicap 2006 Conference, pages 221-226, Paris. [Accessed 6th January 2010].
Apple., 2008. Strategy Games. [On-line] Available at: http://www.apple.com/games/gettingstarted/strategy/ [Accessed 6th January 2010].
1000 AD., 2002. AudioGames.net. [On-line] Available at: http://www.audiogames.net/db.php?action=view&id=1000 AD [Accessed 6th January 2010].
Skyler Miller., The History of Puzzle Games. [On-line] Available at: http://research.microsoft.com/en-us/news/features/memorylens.aspx [Accessed 6th January 2010].
- Earcon definition with examples: http://www.wordspy.com/words/earcon.asp
- See Talboks-och Punkstkrifts Biblioteket at: http://www.tpb.se/english/computer_games
- See for instance Handytech GWP (Graphic Window Professional): http://www.handytech.de/en/normal/products/for-blind/gwp/index.html
- Senseable Technologies Inc.
- See http://msdn.microsoft.com/at
- Tampokme, the audio multi-players one-key mosquito eater, CECIAA/CNAM CEDRIC/UPMC INOVA, http://www.ceciaa.com/?cat=lois&page=action
- Audiogames.net Descent into Madness: http://www.audiogames.net/db.php?action=view&id=descentintomadness
- AudioGames.net 1000 AD: http://www.audiogames.net/db.php?action=view&id=1000 AD
- KChess Advance, ARK Angles, http://www.arkangles.com/kchess/advance.html