eHealth · games · Summer school

EHealth summer school in Stockholm, day 3!

As I wrote in earlier blog posts, we are spending four days this week at KTH in Stockholm. We spend those days in the visualization studio. Most of the time the room is used for lectures and group work, but this particular day was quite different. Today we got to try out all sorts of demos and equipment in the studio, as well as 3D modelling and game development! At the end there was also a short lecture by interaction designer and professor Jonas Löwgren from Linköping University. He talked about research through design, and about how e.g. physical artefacts can add something that cannot really be mediated through text and still images.

The first part of the day was devoted to the demos, which we tried out after a short walkthrough by Björn Thuresson, the manager of the studio. I will present some of the games briefly below, without going into technical details.

In the image above you can see a VR racing car game which is unusual in that it requires collaboration. The entire track, as well as a moving indicator showing the car’s position, can be seen on the tabletop screen. While the program is running, the person sitting in the chair with VR glasses and a steering wheel drives the car, while at least one other person gives directions and makes sure that the holes in the track are filled with the square puzzle pieces!


In the image above you can see another collaborative game. In this game you control a little penguin, and the task is to walk around and collect objects in the virtual world. The world contains lakes (which you can drown in), fields and mountains. The main problem is that most of the objects cannot be reached unless you collaborate with someone working at the sandbox! There is a direct mapping between the topography in the sandbox and the one in the virtual world: if an object is too high up, the person at the sandbox needs to pile up sand to build a mountain under the object.
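
I do not know the technical details of the setup, but the core idea – an overhead depth camera turns the sand surface into a heightmap, which in turn decides what the penguin can reach – can be sketched roughly like this (all sensor values and function names below are my own invention):

```python
import numpy as np

def depth_to_heightmap(depth_mm, sensor_min=800.0, sensor_max=1200.0, max_height=10.0):
    """Convert raw overhead depth readings (mm) to terrain heights.

    More sand piled up means the surface is closer to the camera,
    which means higher terrain. The sensor range is a made-up calibration.
    """
    clipped = np.clip(depth_mm, sensor_min, sensor_max)
    normalized = (sensor_max - clipped) / (sensor_max - sensor_min)  # 0..1
    return normalized * max_height

def is_reachable(heightmap, x, y, object_height, reach=1.0):
    """An object is reachable if the terrain below it is within the penguin's reach."""
    return heightmap[y, x] + reach >= object_height

# Example: a 4x4 depth frame where sand has been piled up in one corner
depth_frame = np.full((4, 4), 1200.0)
depth_frame[0, 0] = 850.0  # a sand mountain here
terrain = depth_to_heightmap(depth_frame)
print(is_reachable(terrain, 0, 0, object_height=8.0))  # True – mountain built up
print(is_reachable(terrain, 3, 3, object_height=8.0))  # False – flat sand
```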


My last example above shows my colleague from Uppsala University, Ida Löscher, trying to move around in a virtual world (seen on the big screen) and collect objects. The big problem here is that another person can place barriers (in the form of pillars) in the virtual world with the help of a mobile interface, which shows both the world and the player moving around in it! Once again a multi-user interface, but this time it is more of a competition.

These are just a few examples of the many things we got to try out. I really liked the collaborative aspect of the games and I think many interesting research ideas can be born here. All of the games were the result of student projects and I know that some of them have been awarded prizes.

The next part was focused on 3D modelling. Robin Palmberg, a research engineer working in the studio, introduced the concept, after which he guided us through a tutorial where we tried creating a small robot-like character in Blender. At the end we could export the model for 3D printing. The printed robots will be handed out on Friday! I actually understand Blender much better now.
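
As a side note, the export step can even be scripted through Blender’s Python API. A minimal sketch (the file path is just an example):

```python
import bpy  # Blender's bundled Python API; this runs inside Blender

# Export the currently selected objects as an STL file – the format
# most 3D-print slicers expect. "robot.stl" is just an example path.
bpy.ops.export_mesh.stl(filepath="robot.stl", use_selection=True)
```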

The last part of the day focused on game development in Unity! We were divided into five different groups, each focusing on a different part of a game. The parts were:

  1. Physics
  2. Texture and light
  3. Sound
  4. Mechanics
  5. AI

Every group was guided by an expert KTH student! This was a very interesting experience and I really enjoyed seeing the end result, which was a combination of the different groups’ work. It was a very hard game, where you had to jump between different platforms, but the important thing is that the collaborative effort worked!
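
To give a flavour of what the physics and mechanics groups were dealing with, here is a toy version of the jump-and-gravity update a platformer needs every frame. This is plain Python illustrating the idea, not our actual Unity code, and all constants are invented:

```python
GRAVITY = -20.0    # units/s^2 – invented constant
JUMP_SPEED = 8.0   # initial upward velocity when a jump starts

def step(y, vy, dt, on_ground, jump_pressed):
    """Advance the player's vertical position by one frame."""
    if on_ground and jump_pressed:
        vy = JUMP_SPEED          # leave the platform
    vy += GRAVITY * dt           # gravity pulls the player back down
    y += vy * dt
    if y <= 0.0:                 # landed on the ground plane
        y, vy, on_ground = 0.0, 0.0, True
    else:
        on_ground = False
    return y, vy, on_ground

# Simulate one jump at 50 frames per second
y, vy, on_ground = 0.0, 0.0, True
for frame in range(40):
    y, vy, on_ground = step(y, vy, 0.02, on_ground, jump_pressed=(frame == 0))
```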

games · Haptics · Multimodality · sonification

Paper on the effect of auditory feedback on gaze behaviour accepted to SMC 2017!

[Image: SMC 2017 acceptance notification]

Earlier this week I wrote about a paper that was accepted to the Frontiers in Education (FIE) 2017 conference, and as it happens, yet another paper I co-authored was accepted to a second conference, Sound and Music Computing (SMC) 2017, earlier in May! Emma Frid (lead author), Roberto Bresin and Eva-Lotta Sallnäs Pysander from the Department of Media Technology and Interaction Design at the Royal Institute of Technology (KTH) are the other authors of that paper. The title of the SMC paper is “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback”.

The paper is based on a small part of an extensive study, performed a few years ago, on the effect of haptic and audio feedback on the perception of object qualities and on visual focus. In this particular paper we use eye-tracking metrics to investigate whether auditory feedback in particular affects gaze behaviour in an environment where the task is to pick up a ball and throw it into a target area. We looked both at the effect of sound in general and at the effects of different sound models. As in many other studies we have been involved in, conditions with different modality combinations were compared against each other. I will write more about the results once the paper has been presented and there is a link to the published proceedings. Search for the title given above if you want to find the specific session and listen to Emma’s presentation at the conference!

Here is the abstract, summarizing the main points:

This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal in different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; difference between subjects was greater than difference between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some of the participants’ gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.

As was the case with the FIE paper mentioned earlier, the SMC paper presents just a small part of a large study, so there is definitely a lot more to tell about the study and the different parameters measured. I will return to the overall study as soon as more papers are out!  🙂
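
One detail from the abstract that I particularly like is the sonification of gaze trajectories. The paper’s actual method will have to wait for the proceedings, but a toy version of the general idea – turning a gaze trajectory into a sequence of pitches – could look like this (the mapping and frequency range are my own invention):

```python
import numpy as np

def gaze_to_pitches(gaze_y, f_low=220.0, f_high=880.0):
    """Map normalized gaze y-coordinates (0 = bottom, 1 = top) to frequencies in Hz.

    The mapping is exponential so that equal steps in gaze position
    correspond to equal musical intervals.
    """
    gaze_y = np.clip(np.asarray(gaze_y, dtype=float), 0.0, 1.0)
    return f_low * (f_high / f_low) ** gaze_y

print(gaze_to_pitches([0.0, 0.5, 1.0]))  # [220. 440. 880.] – two octaves
```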


games · Haptics

Haptic feedback in games

[Image: the Novint Falcon haptic device]

Now it’s time for the fourth post in my blog series about haptics as an interaction modality. In this post, I will write about games – an area where I think haptic feedback could be used to a much greater extent than it is today.

Haptic feedback has been used in games for quite some time. I think everyone has some kind of relation to the joysticks used in e.g. flight or car simulators. Most joysticks not only enable steering, but also generate haptic feedback, often in the form of vibrations or resistance to motion. If we take an ordinary flight simulator joystick as an example, the player can experience heavy vibrations when the plane is stalling, as a kind of warning that the lift is beginning to decrease.
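
I do not know how any particular simulator implements this, but the basic idea is easy to sketch: map how close the wing is to stalling onto the strength of the vibration motor. All thresholds below are invented for illustration:

```python
STALL_AOA = 15.0   # degrees of angle of attack at which the wing stalls (invented)
WARN_AOA = 12.0    # start warning the pilot here (invented)

def rumble_intensity(angle_of_attack):
    """Return force-feedback motor intensity in [0, 1]."""
    if angle_of_attack < WARN_AOA:
        return 0.0
    # Ramp linearly from 0 at the warning threshold to 1 at the stall angle
    return min(1.0, (angle_of_attack - WARN_AOA) / (STALL_AOA - WARN_AOA))

print(rumble_intensity(10.0))  # 0.0 – normal flight, no vibration
print(rumble_intensity(13.5))  # 0.5 – approaching stall, moderate rumble
print(rumble_intensity(16.0))  # 1.0 – stalled, full vibration
```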

In recent years, new input devices have been developed with the potential to really change the way we experience different kinds of games. I have already introduced the Phantom Omni in earlier posts – a device that makes it possible not only to feel texture, stiffness, friction, etc., but also to lift and move around virtual objects. This clearly opens up new possibilities for game development, especially since the Novint Falcon (pictured above) started to spread. As far as I can tell, haptic feedback in the vast majority of games that use it is still limited to vibrations and resisting forces, despite the fact that modern devices widen the possibilities considerably. Below, I will add a few thoughts on what can be done to utilize the unique aspects of haptic feedback in games. There are, of course, many more things you can do apart from the ones discussed here.

Imagine, for example, a haptic game where the player not only has to worry about navigating to the right place and/or interacting with different objects, but also needs to watch out for deadly magnetic wells “physically” pulling the game avatar towards them. That would certainly add a unique dimension to a game, as would magnetic “guides” pulling the user in a certain direction to make him/her aware that e.g. an object is approaching. Every year, students in the haptics course at KTH create simple games based on magnetic objects that should be avoided. Here is an example video from a simple game where the user needs to navigate through a minefield to find a treasure! It is easy to add more levels and objects, so the game is quite scalable and the idea can be applied to many different scenarios. Another game, from a different course round, used a similar idea – that you should avoid being dragged into objects – but in that case the objects had different widths and moved from right to left. The user had to stay clear of the objects for as long as possible.
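
The student games are presumably written against a real haptics API, but the core force model behind a “magnetic well” can be sketched in a few lines: a spring-like pull that switches on within a radius around each well, clamped so the device is never overdriven. All constants here are invented:

```python
import numpy as np

WELL_RADIUS = 0.05   # m – a well only pulls within this distance (invented)
MAX_FORCE = 3.0      # N – clamp so the device is never overdriven (invented)

def magnetic_force(cursor_pos, wells, stiffness=80.0):
    """Sum the attractive forces from all magnetic wells acting on the haptic cursor."""
    total = np.zeros(3)
    for well in wells:
        offset = well - cursor_pos
        dist = np.linalg.norm(offset)
        if 0.0 < dist < WELL_RADIUS:
            # Spring-like pull that grows as the cursor sinks into the well
            magnitude = min(MAX_FORCE, stiffness * (WELL_RADIUS - dist))
            total += magnitude * offset / dist
    return total

wells = [np.array([0.0, 0.0, 0.0]), np.array([0.1, 0.0, 0.0])]
print(magnetic_force(np.array([0.02, 0.0, 0.0]), wells))  # pulled toward the origin
```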

There are many games out there today that are based on the exploration of large and varied environments. Zelda and the Final Fantasy series are among the best-known examples. In those kinds of games haptic feedback could add an interesting dimension when it comes to categorizing objects and/or exploring occluded areas hidden behind or within buildings, trees or cliffs. You would still need ordinary input controllers, of course, but a haptic device could be used as a complement. Imagine that you are walking around in a large virtual environment and come to a well that you cannot climb down into. You could then switch to a haptic mode and send down a probe to feel what is at the bottom. If something is down there you could also pick it up. You could even take this further and place small puzzles in hidden places (like in the well example), where you need to feel differences in e.g. friction, surface texture and/or weight between different objects. If you place the objects in the correct order you could unlock some secret.
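
The “place the objects in the correct order” puzzle is simple to express in code. Here is a toy version where the order is judged by friction (the object names and values are invented):

```python
# Friction coefficients the player can feel through the haptic device
objects = {"stone": 0.8, "glass": 0.1, "wood": 0.5}

def is_unlocked(placed_order):
    """The secret unlocks if objects are placed from slippery to rough."""
    frictions = [objects[name] for name in placed_order]
    return frictions == sorted(frictions)

print(is_unlocked(["glass", "wood", "stone"]))  # True – secret unlocked
print(is_unlocked(["stone", "glass", "wood"]))  # False – try again
```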

Haptic feedback could also be used a lot more in puzzle and maze games – there are quite a few of them out there today. If you add a haptic dimension to a puzzle game you can e.g. use the weight and texture of different pieces as additional input. A haptic-only puzzle would be very interesting to try out! You can also play around with haptic mazes and use friction, texture and maybe even magnetic forces to provide additional information about where you are, provided that you cannot see your own location. Quite a few projects in the haptics course have been based on haptic mazes.
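
Location-by-friction in a maze boils down to a lookup from cursor position to a friction coefficient, which the device then renders as resistance. The grid and values below are invented:

```python
# Each maze cell has its own friction, so players can tell where they
# are by feel alone, without seeing their location.
FRICTION_GRID = [
    [0.1, 0.1, 0.9],   # slippery corridor ending in a rough dead end
    [0.5, 0.9, 0.9],
    [0.5, 0.5, 0.1],   # medium-friction path leading to the slippery exit
]

def friction_at(x, y, cell_size=0.1):
    """Look up the friction coefficient under the haptic cursor."""
    col, row = int(x / cell_size), int(y / cell_size)
    return FRICTION_GRID[row][col]

print(friction_at(0.05, 0.05))  # 0.1 – feels slippery: we are in the corridor
```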

Above, I have sketched a few ideas on how to utilize some unique aspects of haptic feedback in games. Since we already have the technology, I think it is important that we try to take a step beyond games where haptic feedback is limited to vibration, resistance and indications of being shot at, and instead look at more creative ways to use haptic feedback. There are some creative solutions out there today, but I think many games could still benefit from ideas like the ones discussed above!