Since Christmas is coming up, I will break my series on haptic interaction in this blog post and talk about a specific memory from my research, rather than discussing an application area in general. The memory I’m presenting here comes from evaluations performed in schools several years ago.
The evaluations were performed within the scope of an EU project (MICOLE), which aimed at developing tools for supporting collaboration between sighted and visually impaired pupils in elementary school classes. One of the major problems today is that sighted and visually impaired pupils use different work materials that are not easily shared, sometimes causing them to do “group work” in parallel rather than together. By developing environments based on haptic feedback, accessible to both sighted and visually impaired pupils, the project tried to address this important problem.
One of my contributions to the project was a 2D (or rather 2.5D) interface presenting a virtual whiteboard on which geometrical shapes and angles were drawn (felt as raised lines). Two haptic devices were connected so that one sighted and one visually impaired pupil could be in the virtual environment at the same time. During the evaluations in schools, the tasks were to go through the shapes and the angles together and categorize them. These kinds of tasks were chosen since we knew that the pupils in the schools where we performed the evaluations had just started to learn geometry.
When we came to one of the schools, we soon realized that the pupils had just started to learn geometrical shapes but had never talked about angles – they could not discriminate between right, acute and obtuse angles. We therefore needed to explain how to discriminate between the different types of angles before the evaluation started. When a task presenting 10 different angles (some right, some obtuse and some acute), forming an irregular star-shaped figure, had been loaded, the visually impaired pupil took a few minutes to explore the interface in order to locate the angles in the figure. After this, the pupil moved directly to one of the angles and categorized it, correctly, as acute – a judgment the sighted pupils agreed with. They then browsed through the angles one by one, always focusing on the angle the visually impaired pupil was currently “touching”. They always came up with the right answers!
I really enjoyed watching this group working in the interface. It was obvious that the application could be used as a shared interface for discussing a concept that was new to everyone. It was equally obvious that the visually impaired pupil, after just a few minutes of exploration, knew exactly how to navigate to the different angles in the star-shaped figure (without having to follow all the edges!). This enabled the visually impaired pupil to take and keep the initiative during the discussions! Thus, at least in this particular case, the interface clearly enabled inclusion of the visually impaired pupil in the group work. Watching this group work in the interface, and especially seeing the visually impaired pupil leading the discussion, is definitely one of the greatest memories I have from my research on haptic interaction.
I will get back to this study, and related studies, in a later blog post related to communication in haptic interfaces. Those who are curious about the study I briefly introduced above can also read this article.
Merry Christmas to everyone!