I’m currently writing a job application for an associate professorship, and to make sure I don’t miss anything I recently browsed through my old conference folders to find the articles to append. When I came to EuroHaptics 2010 I was reminded of something I had completely forgotten: I was actually one of the organizers of a workshop at that conference! I spent quite a lot of time preparing for the workshop, which focused on haptic communicative functions, but the weekend before the conference I got very sick and was forced to cancel my participation. I will take this opportunity to briefly introduce the workshop and discuss some of my work leading up to it. My earlier blog posts on haptic feedback as an interaction modality were the following:
- Introduction to haptics as an interaction modality
- Haptic feedback in medical applications
- A great memory
- Haptic feedback in games
It was a shame that I could not attend the workshop/conference in Amsterdam, since the workshop was based on my work on collaboration in multimodal environments up to that point. It would have been the perfect opportunity to discuss the work performed and get input on future work in the area. We described the focus of the workshop in the following way:
In this workshop, concrete examples will be presented and discussed in terms of how the touch modality can support communication and collaboration. The technical challenges of distributed haptic feedback will also be addressed. The target audience of the workshop is researchers and practitioners focusing on haptic feedback supporting people in settings where more than one user is involved. We invite other researchers and practitioners to share their research and experience from their different projects, focusing specifically on the collaborative perspective. It might be that the collaborative aspects of your project have not yet been addressed. In that case, interesting collaborative aspects can be identified during the discussions in this workshop.
Quite a lot of work was performed by me and my “multimodal colleagues” 🙂 prior to the workshop. First of all, my master’s thesis work from 2006 focused on collaboration between visually impaired and sighted pupils in elementary school. Evaluations were performed in schools, where visually impaired and sighted pupils worked together in dynamic collaborative environments in which objects could be moved. During that work, and especially during a re-analysis performed during my first year as a Ph.D. student (2008), I realized that communicative functions based on haptic feedback had real potential, both for supporting collaborative work and for supporting the inclusion of visually impaired pupils in group work with sighted peers. It became especially clear that haptic functions for guiding (holding on to the same object or holding on to the peer’s proxy) can replace verbal guidance to a large extent.
Imagine a situation where you need to guide a visually impaired pupil to a particular place in a virtual environment. If you only have visual feedback to rely on when establishing a common frame of reference, you need to give a constant stream of verbal directions: “go down, more down, …no, too much, go back…, now to the right…no, not down, up again… here it is!”. If you have haptic feedback available, you can simply grab the other person’s proxy, move the visually impaired peer to the right place, and just say “here”. Needless to say, haptic feedback affects the dialogue between collaborators in this case. If you want to learn more about this explorative study, you can read the journal article we finalized a few years after the study.
One problem that was evident from the evaluations with the visually impaired and sighted pupils was that the visually impaired pupil was not aware of what the sighted pupil did when haptic guiding was not utilized. This is why we performed a follow-up study in which we added sound cues to the above-mentioned dynamic interface to provide feedback on actions taken in the interface (e.g. grasping and putting down objects). We compared the new visual/haptic/audio version to the original visual/haptic one. We showed that the dialogue between the collaborators differed depending on which program version they worked in, and it was also clear that the work was more effective (measured as time to complete the task) in the visual/haptic/audio version. Once again, we could clearly see how access to haptic feedback influenced the communication. You can read more about this study in this article.
These two studies resulted in a theoretical conference paper presented at HAID (Haptic and Audio Interaction Design) 2009, in which we tried to develop a conceptual model of how haptic communicative functions affect the dialogue between collaborators. This was my first attempt at a meta-analysis of the insights gained in this research area. The paper summarizes and discusses all the effects on dialogue I had seen in my studies up to that point. It made quite an impression: it is still the most cited of all the papers I have produced to date! At that point we were still quite unique in working on haptic collaborative environments, and I still think I am one of the very few researchers studying the effect of haptic and audio feedback on the dialogue between collaborators.
The HAID conference paper laid the groundwork for the workshop described at the beginning of this post, and during the workshop the idea of studying collaboration between two sighted persons was introduced and discussed. Next time I write about my earlier work, I will introduce my latest study on collaborative multimodal interfaces, which showed that haptic and audio feedback indeed affect the dialogue between sighted persons as well!