
Haptic communicative functions


This is the fifth post in my blog series about haptics as an interaction modality, and this time I start focusing on my main area – collaboration in multimodal interfaces.

Ever since I started working on my master’s thesis project back in 2005, I have focused on collaboration in multimodal interfaces, and specifically on how one can design haptic functions for collaboration and communication between two users working in the same environment. Collaborative haptic interfaces have been around for quite some time – one of the earliest examples is an arm-wrestling system using specialized hardware that enabled two users to arm wrestle over a distance. Other well-known early examples enabling a kind of mediated social touch are HandJive and InTouch. One thing these early systems have in common is that they rely on specialized hardware that is quite limited in scope. More examples of early systems, and in-depth discussions about mediated social touch, can be found in this excellent review.

During the past two decades, more widely applicable haptic functions for collaboration in virtual environments have been developed. Such functions can, for example, enable two users to feel each other’s forces on jointly held objects, or make it possible to “shake hands” by utilizing magnetic attraction forces between the two users’ proxies in the virtual environment. One of the earliest examples of an environment supporting these kinds of collaborative haptic functions – or guiding functions, as I usually call them – is the collaborative text editor developed by Oakley et al. Apart from the obvious functions needed to edit a document, each user could also use a Phantom device to find the positions of and/or communicate with the co-authors. One example was a kind of grabbing function (similar to the shake-hands function mentioned above), making it possible to grab another user’s proxy and move it to another part of the document. Other examples were a “locate” function dragging one’s own proxy to another user’s proxy with a constant force, and a “come here” function dragging another user’s proxy to one’s own position.
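As a rough illustration of how such a guiding function might work, here is a minimal sketch of a constant-force “locate” function of the kind described above. The function name, the force magnitude, and the NumPy-based representation are my own illustrative assumptions, not taken from Oakley et al.’s system:

```python
import numpy as np

def locate_force(own_pos, other_pos, magnitude=1.5):
    """Hypothetical 'locate' guiding function: a constant-magnitude
    force (in newtons) pulling one's own proxy toward another user's
    proxy, regardless of distance."""
    direction = np.asarray(other_pos, dtype=float) - np.asarray(own_pos, dtype=float)
    dist = np.linalg.norm(direction)
    if dist < 1e-6:
        # Already at the target position: render no force.
        return np.zeros(3)
    # Unit direction vector scaled to the constant force magnitude.
    return magnitude * direction / dist
```

A haptic rendering loop would typically evaluate something like this at every update tick (often around 1 kHz) and send the resulting force vector to the device.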

Later examples of virtual environments enabling these kinds of haptic collaborative functions are a collaborative drawing application developed at CERTEC, Lund University and applications for joint handling of objects evaluated by my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander during her years as a Ph.D. student (thesis link). The latter examples are most relevant for me, since much of my work during my period as a Ph.D. student at KTH focused on collaborative interfaces in which two users can work together to move and place virtual objects. I will get back to my own applications later on, in a blog post about haptic interfaces supporting collaboration between visually impaired and sighted persons.

Above, I have provided just a few examples of collaborative haptic functions that can be used to control the forces rendered by one or more haptic devices. The functions I find most interesting to explore are the ones that enable physical interaction between two haptic devices, in that both users can feel each other’s forces on jointly held objects, or feel each other’s forces when holding on to each other’s proxies. These kinds of functions enable interesting means of communicating physically in virtual environments, especially when the users are not able to talk to each other face-to-face or point at the screen. Imagine, for example, a scenario in which two users are exploring different parts of a complex multimodal interface showing distributed data clusters (what those clusters represent is not important here). In such an interface it would be very cumbersome to describe verbally where a certain interesting cluster is located. Instead, the user who wants to show something they found can grab the other user’s proxy and drag them to the relevant cluster. This possibility would probably simplify communication about the explored dataset, since explaining where to find details in a complex interface can be extremely cumbersome. This is, of course, just a made-up example, but it applies to many scenarios, especially ones where important parts of an interface are visually occluded. I will get back to joint handling of objects, and to the potential of using haptic feedback when exploring huge datasets, in upcoming posts in this series.

Lastly, it is interesting to contrast functions enabling physical interaction between two haptic devices with functions enabling only one-way communication (like the “locate” function mentioned above). A one-way function enables some kind of communication, in that one person’s proxy is “dragged” to another one’s, but the haptic feedback is applied to only one of the users – there is, for example, no way for the other user to tell whether the one being dragged actually wants to be dragged. When using a two-way haptic communicative function, both users can feel forces from each other, enabling a richer communication. Apart from enabling joint handling of objects – where coordination of movement is made possible by both users feeling the other’s forces on the jointly held object – two-way haptic communicative functions also make it possible to clearly communicate to the other user that you do not want to be dragged somewhere. The potential these functions have in situations where visually impaired and sighted users collaborate in virtual environments will be the topic of my next post in this series!
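The one-way versus two-way distinction can be sketched in code. Below is a minimal, hypothetical example of a two-way coupling (like the “shake hands” function) modeled as a virtual spring between the two proxies – both users feel equal and opposite forces, so either one can push back to signal resistance. The function name and stiffness value are illustrative assumptions only:

```python
import numpy as np

def handshake_forces(pos_a, pos_b, stiffness=200.0):
    """Hypothetical two-way coupling: a virtual spring between two
    proxies. Returns the force each user's device should render.
    Stiffness is in N/m; positions are 3-D coordinates in meters."""
    displacement = np.asarray(pos_b, dtype=float) - np.asarray(pos_a, dtype=float)
    force_on_a = stiffness * displacement   # pulls user A toward user B
    force_on_b = -force_on_a                # equal and opposite: B feels A pulling back
    return force_on_a, force_on_b
```

The key property is the symmetry: a one-way function would send `force_on_a` to one device and nothing to the other, whereas here a tug from either side is immediately felt by both.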


