Typhlex: Controlling a Mobile Screen Reader with Deformable Input for Blind Users

Published in IEEE Pervasive Computing, Special Issue on Flexible and Shape-Changing Interfaces, 2017

This paper reports on our work developing deformable user interfaces for visually impaired users, covering the iterative design process of the prototype and the gesture set it supports as an alternative input to a mobile screen reader.

We found that bend gestures were easily understood, performed, and enjoyed, especially when they map well to screen-reading actions (e.g., bend a corner to navigate, squeeze to select). We also found how large a role the material design plays in the prototype's usability: we built more than ten versions, varying groove location, width, and depth.
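To give a concrete sense of this mapping, here is a minimal sketch of how normalized bend-sensor readings could be classified into screen-reading actions. The sensor layout, thresholds, and action names are illustrative assumptions on my part, not the implementation from the paper.

```python
# Hypothetical sketch: map deformation readings to screen-reader actions.
# Sensor layout, thresholds, and action names are illustrative assumptions,
# not the prototype described in the paper.

BEND_THRESHOLD = 0.3     # normalized deflection needed to count as a deliberate bend
SQUEEZE_THRESHOLD = 0.5  # normalized pressure needed to count as a squeeze

def classify_gesture(top_left, top_right, squeeze):
    """Map normalized sensor values (0.0 = flat, 1.0 = fully deformed)
    to a screen-reading action, or None if no gesture is detected."""
    if squeeze > SQUEEZE_THRESHOLD:
        return "select"             # squeeze to select the current item
    if top_left > BEND_THRESHOLD:
        return "navigate_previous"  # bend a corner to move through items
    if top_right > BEND_THRESHOLD:
        return "navigate_next"
    return None

# Example: a firm bend of the right corner triggers forward navigation.
print(classify_gesture(top_left=0.05, top_right=0.6, squeeze=0.1))  # navigate_next
```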

Reflection

I think this project is a really good example of how emerging technologies can help user groups with specific needs. The concept itself is quite simple: detect the degree of deformation (mostly bending here). Yet this input dimension only becomes available with bend sensors and flexible materials.

I feel that as we invent new ways of interacting with computers, we should also think about accessibility and how these interactions can benefit more than just the “typical user”. If we want technology to be truly pervasive, everyone should be able to use it.

Recommended citation: M. Ernst, T. Swan, V. Cheung, and A. Girouard, "Typhlex: Exploring Deformable Input for Blind Users Controlling a Mobile Screen Reader," IEEE Pervasive Computing, vol. 16, no. 4, pp. 28-35, October-December 2017, doi: 10.1109/MPRV.2017.3971123.
Download Paper