Haptic feedback in shared virtual environments can potentially make it easier for a visually impaired person to take part in and contribute to the process of group work. In this paper, a task-driven explorative evaluation is presented of collaboration between visually impaired and sighted persons in three applications that provide haptic and visual feedback. The results show that all pairs could perform all the tasks in these applications, even though a number of difficulties were identified. The conclusions drawn can inform the design of applications for cooperation between visually impaired and sighted users.
Computer usage today is predominantly based on graphical interaction, where the visual presentation of information is essential both for input (hand-eye coordination when using a computer mouse), and output (seeing the information on a computer screen). This can create difficulties for blind computer users, both at an individual level when interacting with a computer, and also when collaborating with other computer users.
The work presented in this thesis has investigated interaction for blind computer users in three stages. The first stage investigated access to information through studies of an interactive audio-only game, drawing conclusions about auditory direct manipulation and auditory interface design. The second stage studied collaboration between blind and sighted computer users in two different contexts, leading to a questioning of commonly expressed design principles regarding access to collaboration. The final stage studied accessibility in a working environment, examining how technology, the assistive device used by the blind person, communication with others, and professional knowledge interplayed to create an accessible work environment.
Based on these empirical studies, the main conclusion from this work is the proposal of a research perspective, Assistive interfaces as cooperative interfaces. Here, the context in which the interface is going to be used is in focus, and cooperative and social dimensions of interaction are acknowledged and highlighted. The design and analysis of assistive devices should be highly sensitive to the socio-interactional environment, and not focus solely on the single individual using an assistive device.
This paper presents a study of cross-modal collaboration, in which blind and sighted persons collaboratively solve two different tasks using a prototype that has one auditory and one graphical interface. The results show the importance of context and of task design for the accessibility of cross-modal collaborative settings, as well as the importance of supporting participation in a working division of labour.
The needs of blind and visually impaired users are seriously under-investigated in CSCW. We review work on assistive interfaces especially concerning how collaboration between sighted and blind users across different modalities might be supported. To examine commonly expressed design principles, we present a study where blind and sighted persons play a game to which the former has an auditory interface, the latter a visual one. Interaction analyses are presented highlighting features of interface design, talk and gesture which are important to the participants’ abilities to collaborate. Informed by these analyses, we reconsider design principles for cooperative interfaces for the blind.
This paper presents the results from a qualitative case study of an auditory version of the game Towers of Hanoi. The goal of this study was to explore qualitative aspects of auditory direct manipulation and the subjective experience of playing the game. The results show that it is important to provide a way of focusing in the auditory space. Articulatory directness was also an important issue, and feedback should support the movement of the objects in the auditory space.
This paper presents a study of an auditory version of the game Towers of Hanoi. The goal of this study was to investigate the nature of continuous presentation and what it could mean when implementing auditory direct manipulation. We also wanted to find out whether it was possible to make an auditory interface that met the requirements of a direct manipulation interface. The results showed that it was indeed possible to implement auditory direct manipulation, but that using Towers of Hanoi as the underlying model restricted the possibilities of scaling the auditory space. The results also showed that, with a limited set of objects, the nature of continuous presentation was not as important as how to interact with the auditory space.