
Paper on the effect of auditory feedback on gaze behaviour accepted to SMC 2017!


Earlier this week I wrote about a paper that was accepted to the Frontiers in Education (FIE) 2017 conference, but the fact is that another paper that I co-authored was accepted to another conference, Sound and Music Computing (SMC) 2017, earlier in May! Emma Frid (lead author), Roberto Bresin and Eva-Lotta Sallnäs Pysander from the Department of Media Technology and Interaction Design at the Royal Institute of Technology (KTH) are the other authors of that paper. The title of the SMC paper is “AN EXPLORATORY STUDY ON THE EFFECT OF AUDITORY FEEDBACK ON GAZE BEHAVIOR IN A VIRTUAL THROWING TASK WITH AND WITHOUT HAPTIC FEEDBACK”.

The paper is based on a small part of an extensive study, performed a few years ago, which focused on the effect of haptic and audio feedback on the perception of object qualities and on visual focus. In this particular paper we use eye-tracking metrics to investigate whether auditory feedback in particular affects gaze behaviour in an environment where the task is to pick up a ball and throw it into a target area. We looked both at the effect of sound in general and at the effects of different sound models. As in many other studies we have been involved in, conditions with different modality combinations were compared against each other. I will write more about the results when the paper has been presented and there is a link to the published proceedings. Search for the title given above if you want to find the specific session and listen to Emma’s presentation at the conference!

Here is the abstract, summarizing the main points:

This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal in different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; difference between subjects was greater than difference between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some of the participants’ gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.
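As a side note, the abstract mentions sonification of gaze trajectories. Just to illustrate the general idea (this is not the mapping used in the paper; the function, parameters and values below are invented for illustration), a recorded gaze path could be turned into sound by mapping, say, horizontal gaze position to pitch and vertical position to loudness:

```python
# Hypothetical sketch: sonify a 2D gaze trajectory by mapping horizontal
# position to pitch and vertical position to loudness. NOT the mapping
# used in the SMC paper - purely an illustration of the general idea.
import math
import struct
import wave

def sonify_gaze(gaze_points, sample_rate=44100, step_dur=0.05,
                f_low=220.0, f_high=880.0, out_path="gaze_sonification.wav"):
    """gaze_points: list of (x, y) pairs normalised to [0, 1]."""
    samples = []
    phase = 0.0
    for x, y in gaze_points:
        freq = f_low + x * (f_high - f_low)   # left/right -> low/high pitch
        amp = 0.2 + 0.8 * (1.0 - y)           # upper part of screen -> louder
        for _ in range(int(step_dur * sample_rate)):
            phase += 2 * math.pi * freq / sample_rate
            samples.append(amp * math.sin(phase))
    with wave.open(out_path, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(sample_rate)
        w.writeframes(b"".join(
            struct.pack("<h", int(max(-1.0, min(1.0, s)) * 32767))
            for s in samples))

# Example: a gaze sweep from the lower left to the upper right of the screen.
sonify_gaze([(i / 40.0, 1.0 - i / 40.0) for i in range(41)])
```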

As was the case with the FIE paper mentioned earlier, the SMC paper presents just a small part of a large study, so there is definitely a lot more to tell about the study and the different parameters measured. I will return to the overall study as soon as more papers are out!  🙂



Haptic feedback in games

[Image: the Novint Falcon haptic device]

Now it’s time for the fourth post in my blog series about haptics as an interaction modality. In this post, I will write about games – an area where I think haptic feedback can be used to a much greater extent than it is today. The earlier posts in this blog series were:

Haptic feedback has been used in games for quite some time. I think that everyone has some kind of relation to the joysticks used in e.g. flight or car simulators. Most joysticks not only enable some kind of steering, but also generate haptic feedback, often in the form of vibrations or resistance to motion. If we take an ordinary flight simulator joystick as an example, the player can experience heavy vibrations when the plane is stalling, as a kind of warning that the lift is beginning to decrease.
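To make the stall-warning example a bit more concrete, here is a minimal sketch of how such a vibration cue could be driven. The threshold values are made up and the final device call is a placeholder, not a real joystick API:

```python
# Hypothetical sketch of a stall warning: vibration strength grows as the
# wing approaches its critical angle of attack. Thresholds are invented.

def stall_vibration(angle_of_attack_deg, warn_deg=12.0, stall_deg=15.0):
    """Return a rumble strength in [0, 1] for a force-feedback joystick."""
    if angle_of_attack_deg < warn_deg:
        return 0.0                       # normal flight: no vibration
    if angle_of_attack_deg >= stall_deg:
        return 1.0                       # fully stalled: maximum rumble
    # Ramp the vibration up linearly between the warning and stall angles.
    return (angle_of_attack_deg - warn_deg) / (stall_deg - warn_deg)

for aoa in (5.0, 12.5, 14.0, 16.0):
    print(aoa, "->", round(stall_vibration(aoa), 2))
```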

During recent years new input devices have been developed with the potential to really change the way we experience different kinds of games. I have already introduced the Phantom Omni in earlier posts – a device that makes it possible not only to feel texture, stiffness, friction, etc., but also to lift and move around virtual objects. This clearly opens up new possibilities for game development, especially since the Novint Falcon (picture above) started to spread. As far as I can tell, in the vast majority of games where this kind of feedback is used, haptic feedback is still limited to vibrations and resisting forces, despite the fact that modern devices greatly widen the possibilities. Below, I will add a few thoughts about what can be done to utilize the unique aspects of haptic feedback in games. There are, of course, many more things you can do apart from the ones discussed here.
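For readers curious about what “feeling stiffness and friction” boils down to, here is a minimal, library-agnostic sketch of the kind of force computation a haptic rendering loop typically performs. Real toolkits (OpenHaptics, CHAI3D and the like) wrap this in their own APIs; all names and values below are illustrative only:

```python
# Minimal sketch of penalty-based haptic rendering for a flat surface:
# a spring force pushes the probe out of the surface and a viscous term
# resists tangential motion (a simple stand-in for friction).

def contact_force(probe_pos, probe_vel, surface_height=0.0,
                  stiffness=800.0, friction=2.5):
    """Force (fx, fy, fz) on a probe touching a horizontal plane at z = surface_height."""
    x, y, z = probe_pos
    vx, vy, vz = probe_vel
    penetration = surface_height - z
    if penetration <= 0.0:
        return (0.0, 0.0, 0.0)           # not in contact, no force
    fz = stiffness * penetration         # spring pushes the probe back out
    fx = -friction * vx                  # viscous resistance to lateral motion
    fy = -friction * vy
    return (fx, fy, fz)

# Probe 2 mm below the surface, sliding sideways at 0.1 m/s.
print(contact_force((0.0, 0.0, -0.002), (0.1, 0.0, 0.0)))
```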

Imagine, e.g., a haptic game where the player not only has to worry about navigating to the right place and/or interacting with different objects, but also needs to watch out for deadly magnetic wells “physically” pulling the game avatar towards them. That would certainly add a unique dimension to a game, as would magnetic “guides” pulling the user in a certain direction to make him/her aware that e.g. an object is approaching. Every year, students in the haptics course at KTH create simple games based on magnetic objects which should be avoided. Here is an example video from a simple game where the user needs to navigate through a mine field to find a treasure! It is easy to add more levels and objects, so the game is quite scalable and the idea can be applied to many different scenarios. Another game from another course round used a similar idea – that you should avoid being dragged into objects – but in that case the objects had different widths and were moving from right to left, and the goal was to stay clear of them for as long as possible.
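A “magnetic well” of this kind is easy to prototype: within some capture radius the well pulls the avatar (or haptic cursor) towards its centre, harder the closer you get. A rough sketch, with invented constants:

```python
# Rough sketch of a "magnetic well": inside a capture radius the well pulls
# the avatar towards its centre, and the pull grows as the distance shrinks.
# All names and constants are illustrative only.
import math

def magnetic_well_force(avatar_pos, well_pos, capture_radius=0.1, max_force=3.0):
    """Attractive force (fx, fy, fz) towards the well, zero outside its radius."""
    dx = well_pos[0] - avatar_pos[0]
    dy = well_pos[1] - avatar_pos[1]
    dz = well_pos[2] - avatar_pos[2]
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist >= capture_radius or dist == 0.0:
        return (0.0, 0.0, 0.0)
    # Pull harder the deeper inside the well the avatar is.
    strength = max_force * (1.0 - dist / capture_radius)
    return (strength * dx / dist, strength * dy / dist, strength * dz / dist)

# Avatar grazing the edge of a mine in the minefield game described above.
print(magnetic_well_force((0.05, 0.0, 0.0), (0.0, 0.0, 0.0)))
```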

There are many games out there today which are based on the exploration of large and varied environments; Zelda and the Final Fantasy series are among the best-known examples. In those kinds of games haptic feedback could also add an interesting dimension when it comes to categorizing objects and/or exploring occluded areas hidden behind or within buildings, trees or cliffs. In these kinds of games you still need ordinary input controllers, of course, but a haptic device could be used as a complement. Imagine that you walk around in a large virtual environment and come to a well which you cannot go down into. You could then switch to a haptic mode and send down a probe to feel what is at the bottom. If something is down there you could also pick it up. You could even take this further and have small puzzles in hidden places (like in the well example), where you need to feel differences in e.g. friction, surface texture and/or weight between different objects. If you place the objects in the correct order you could unlock some secret.
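The hidden puzzle could be as simple as assigning each hidden object a haptic material and checking the order the player arranges them in. A toy sketch (the object names and property values are invented for illustration):

```python
# Sketch of the hidden "feel the objects" puzzle: each object at the bottom
# of the well gets a haptic material (friction, roughness, weight), and the
# secret unlocks only when the player arranges them from lightest to heaviest.

HIDDEN_OBJECTS = {
    "stone":  {"friction": 0.8, "roughness": 0.6, "weight": 2.0},
    "coin":   {"friction": 0.2, "roughness": 0.1, "weight": 0.1},
    "bottle": {"friction": 0.4, "roughness": 0.2, "weight": 0.7},
}

def puzzle_solved(player_order):
    """True if the player placed the objects from lightest to heaviest."""
    correct = sorted(HIDDEN_OBJECTS, key=lambda name: HIDDEN_OBJECTS[name]["weight"])
    return list(player_order) == correct

print(puzzle_solved(["coin", "bottle", "stone"]))   # True
print(puzzle_solved(["stone", "coin", "bottle"]))   # False
```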

Haptic feedback could also be used a lot more in puzzle and maze games – there are quite a few of them out there today. If you add a haptic dimension to a puzzle game you can e.g. use the weight and texture of different pieces as additional cues. A haptic-only puzzle would be very interesting to try out! You can also play around with haptic mazes and use friction, texture and maybe even magnetic forces to provide additional information about where you are, provided that you cannot see your own location. Quite a few projects in the haptics course have been based on haptic mazes.
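As a small illustration of the maze idea, each maze cell could get its own friction level, so that a player who cannot see the maze can still tell roughly where they are from how sticky the floor feels. A toy sketch with made-up values:

```python
# Sketch of a haptic maze where each cell has its own friction level,
# used to damp the haptic cursor. The grid and values are made up.

MAZE_FRICTION = [
    [0.1, 0.3, 0.1],   # row 0: smooth corridor with one sticky patch
    [0.6, 0.6, 0.2],   # row 1: high-friction "mud" near the dead end
    [0.1, 0.4, 0.9],   # row 2: goal cell at (2, 2) feels very rough
]

def friction_at(x, y):
    """Friction coefficient for maze cell (x, y)."""
    return MAZE_FRICTION[y][x]

def damping_force(velocity, x, y):
    """Resistive force opposing motion, scaled by the local friction."""
    mu = friction_at(x, y)
    return tuple(-mu * v for v in velocity)

print(damping_force((0.2, 0.0), 2, 2))   # strongest resistance at the goal
```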

Above, I have sketched a few ideas for how one can utilize some unique aspects of haptic feedback in games. Since we already have the technology, I think it is important that we try to move a step beyond games where haptic feedback is limited to vibration, resistance and indications of being shot at, and instead look at more creative ways to use haptic feedback. There are some creative solutions out there today, but I think many games could still benefit from using e.g. the ideas discussed above!