OUTCOME

This project was presented at ASSETS 2018. After my internship, my team published the library as a Unity plugin.

OVERVIEW

THE PROBLEM

Most video games today are designed for sighted gamers only. Building accessibility support directly into game design and gameplay tools can take us closer to a world where video games are accessible by default.

A key to incorporating accessibility in video and audio games is designing interactions around spatial audio and 3D sound localization. If objects, scenes, and events in the virtual game world are represented by localized sounds, blind gamers can perceive the video game environment.

How might we design a spatial audio interaction toolkit that makes all video games accessible to people with vision impairment?

MY ROLE

UX Designer
UX Researcher
Game Developer

DURATION

5 Months

TOOLS

Unity
Audacity
HoloLens

Solution

Localize Objects in View
Get a sense of the position of objects relative to you in the game world

With a feature called Bodyscan, the player gets a spatialized audio listing of the objects in their field of view. This enables the player to form a mental model of the virtual environment. Players can filter objects by direction and distance relative to themselves.

Select and Reach Things
Receive guidance to reach any object in the game world

The player can select a game object from the field of view to perform additional actions. The selected object can then be reached with the help of a tone: modulation in its volume and pitch as the player moves acts as a guide to the object.

Summary of Scenes
Get a crisp summary of your field of view at checkpoints

Viewdios (like videos) are zones that convey a crisp audio summary of the scene at various spots in the game. A tone of varying pitch helps the player orient to face the scene before the audio plays. This feature strengthens the player's mental model as they move around the virtual environment.

My Contribution

I was responsible for Interaction Design, UX Research, and Development. When I started my internship, I took the initiative to drive the project with guidance from my mentors.

I was responsible for UX research – I conducted all the feedback sessions and usability tests.

I developed all the prototypes using HoloLens and Unity with guidance from my mentors.

I also contributed actively during ideation, literature review, and documentation. Towards the end of the project, I uncovered issues to be addressed in the next iteration.

Process

RESEARCH

I studied around 30 research articles to understand the latest spatial audio interfaces and gain insights from them. I also read blogs, websites, and news articles throughout my internship to stay up to date with the latest trends.

Persona

DESIGN

Interaction Tools
Bodyscan

The Bodyscan tool enables the gamer to understand what objects are in the game environment and where they are located. When the player activates this feature, they hear a localized audio playlist of game objects in their adjustable field of view.
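
To make the mechanic concrete, here is a minimal Unity (C#) sketch of how a Bodyscan pass could work. It is an illustration rather than the published plugin's code: the "Scannable" tag, the field names, and the nearest-first ordering are all assumptions.

    using System.Collections;
    using System.Linq;
    using UnityEngine;

    // Illustrative sketch (attach to the player): announce objects tagged
    // "Scannable" that fall inside an adjustable field of view, nearest
    // first, by playing a spatialized AudioSource attached to each object.
    public class Bodyscan : MonoBehaviour
    {
        public float fieldOfViewDegrees = 90f; // adjustable scan cone (assumed default)
        public float maxDistance = 15f;        // distance filter (assumed default)

        public IEnumerator ScanCoroutine()
        {
            var inView = GameObject.FindGameObjectsWithTag("Scannable")
                .Where(obj => InFieldOfView(obj.transform.position))
                .OrderBy(obj => Vector3.Distance(transform.position, obj.transform.position));

            foreach (var obj in inView)
            {
                var label = obj.GetComponent<AudioSource>();
                if (label == null) continue;
                label.spatialBlend = 1f; // fully 3D, so the label sounds from the object
                label.Play();
                yield return new WaitWhile(() => label.isPlaying);
            }
        }

        bool InFieldOfView(Vector3 position)
        {
            Vector3 toObject = position - transform.position;
            return toObject.magnitude <= maxDistance &&
                   Vector3.Angle(transform.forward, toObject) <= fieldOfViewDegrees / 2f;
        }
    }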

Select and Reach

With the Select and Reach feature, the player can reach objects of interest guided by an audio cue. The cue's modulation in volume and pitch gives a sense of distance and relative direction to the object.
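
As an illustration, here is a sketch of the guidance tone logic, assuming one looping AudioSource on the selected object; the linear volume falloff and the pitch range are assumed curves, not the plugin's actual values.

    using UnityEngine;

    // Illustrative sketch (attach to the selected object): a looping tone
    // gets louder as the player approaches and higher-pitched as the
    // player turns to face the object.
    public class ReachGuide : MonoBehaviour
    {
        public Transform player;      // the player's camera or body
        public AudioSource guideTone; // looping tone on this object
        public float maxDistance = 20f;

        void Update()
        {
            Vector3 toObject = transform.position - player.position;
            float distance = toObject.magnitude;
            float angle = Vector3.Angle(player.forward, toObject); // 0 = facing the object

            // Volume: linear falloff with distance (assumed curve).
            guideTone.volume = Mathf.Clamp01(1f - distance / maxDistance);

            // Pitch: doubles when aligned, halves when facing away (assumed range).
            guideTone.pitch = Mathf.Lerp(2f, 0.5f, angle / 180f);
        }
    }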

Viewdio

Viewdios are audio hotspots that describe the view when the player enters a new gameplay area. A tone of varying pitch and volume, similar to the Select and Reach tone, guides the player to orient towards the scene.
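
Below is a sketch of how a Viewdio hotspot could be wired up as a Unity trigger volume; the "Player" tag check and the 10-degree alignment threshold are assumptions for illustration.

    using System.Collections;
    using UnityEngine;

    // Illustrative sketch: a trigger volume that, on entry, loops an
    // orientation tone until the player faces the scene, then plays the
    // recorded audio summary.
    public class ViewdioZone : MonoBehaviour
    {
        public Transform sceneCenter;   // what the summary describes
        public AudioSource orientTone;  // looping alignment tone
        public AudioSource summaryClip; // recorded scene description

        void OnTriggerEnter(Collider other)
        {
            if (other.CompareTag("Player"))
                StartCoroutine(OrientThenPlay(other.transform));
        }

        IEnumerator OrientThenPlay(Transform player)
        {
            orientTone.Play();
            float angle;
            do
            {
                Vector3 toScene = sceneCenter.position - player.position;
                angle = Vector3.Angle(player.forward, toScene);
                // Pitch climbs as the player turns toward the scene.
                orientTone.pitch = Mathf.Lerp(0.5f, 2f, 1f - angle / 180f);
                yield return null; // re-check every frame
            } while (angle > 10f); // assumed alignment threshold

            orientTone.Stop();
            summaryClip.Play();
        }
    }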

Iteration 1

After designing the interaction tools, it was time to test them. Ideally, we would have liked to start with a simple prototype. However, my manager needed to present ongoing work at an upcoming internal conference, so we decided to implement the interaction tools in a Skyrim-like video game. An example gameplay video is shown below.

Although the conference was a success and my manager was happy with my work, it took only a few user studies for me to find that the game was too complicated a setting for gamers (blindfolded or blind) to learn and use the interaction tools. A complex game with a lot of flexibility increased participants' cognitive load. We concluded that we needed a very simple video game for initial testing. Moreover, I decided to test only a subset of the tools to avoid overloading users with information.

Iteration 2

I made a simple treasure hunt game to test the interaction tools. The setting was an office room environment where gamers had to look for jewels. I split my participants into 2 groups – a control group and an experimental group.

While the control group was asked to find jewels without the interaction tools, the experimental group used the tools to do the same. Although both groups completed the treasure hunt, the experimental group finished faster and found the game more enjoyable. However, none of the participants could remember the relative placement of objects, i.e. they could not form a cognitive model of the game environment. This was an issue because participants need an accurate mental model to perform complex tasks and enjoy the game.

Iteration 3: Designing for HoloLens

The team hypothesized that the absence of proprioception in the PC-based video game prevented the formation of cognitive models. We decided to test this with the HoloLens, as Mixed Reality would enable participants to physically walk around the room and thus form mental models.

I built a similar treasure hunt game for the HoloLens, superimposing virtual objects on their real-world counterparts. The PC version's treasure hunt room had been modeled after the halls of the Microsoft office building where I would conduct usability testing with HoloLens participants.

EVALUATION

User Testing Process

To test the interaction tools, I conducted within-subjects user studies with 6 participants, each going through a task-based A/B testing session. The treasure hunt game on the PC served as the control while I tested the same game on the HoloLens. Here's a brief overview of the process:

  • Introduction: Explain the project goal and describe the scenario.
  • Pre-test questions: Demographics and past gaming experience.
  • Gameplay: Tutorial walkthrough – help the participant explore and understand the game and interaction tools for about 20 minutes, then let them practice finding game objects using the tools.
  • Testing: Find the same game objects without the interaction tools.
  • Post-test questions: Follow-up questions about perceived object locations, plus general feedback.
  • Thank participants and compensate them for their time.

The task was to reach 4 specific objects (chair, flowerpot, books, and cupboard) in the room. For the PC, users wore stereo headphones and used an Xbox controller to play the game. For the HoloLens study, users simply used the clicker that comes with the device. None of the participants had used a game controller before, and only one participant (P4) had experience playing audio games. Participants' data from the PC studies, including task times, is shown below.

Results

The overall feedback was positive – all participants were able to use the interaction tools successfully on both the PC and the HoloLens.

Results showed that 5 out of 6 participants formed a cognitive model of the environment within 20 minutes when using the HoloLens. Being able to move around the room helped participants gauge the relative positioning of objects. On both the PC and the HoloLens, all participants were able to find and reach objects in the testing phase and were excited about navigating a virtual game room. Participants quickly overcame difficulties with the controller and the HoloLens, and asked for increased game complexity.

“I don’t play (PC/mobile) games because I have a prejudice that games are largely inaccessible. But this game has changed my perception.”

“I could feel myself inside the room. I heard the voice exactly from the objects’ location, so it helped me to know where I should move and take turns. I am imagining, and I am playing.”