games · Haptics · Multimodality · sonification

Paper on the effect of auditory feedback on gaze behaviour accepted to SMC 2017!

[Image: SMC_accept]

Earlier this week I wrote about a paper that was accepted to the Frontiers in Education (FIE) 2017 conference, but in fact yet another paper I co-authored had already been accepted to another conference, Sound and Music Computing (SMC) 2017, earlier in May! Emma Frid (lead author), Roberto Bresin and Eva-Lotta Sallnäs Pysander from the department of Media Technology and Interaction Design at the Royal Institute of Technology (KTH) are the other authors of the paper. The title of the SMC paper is “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task with and without Haptic Feedback”.

The paper is based on a small part of an extensive study, performed a few years ago, on the effect of haptic and audio feedback on the perception of object qualities and on visual focus. In this particular paper we use eye-tracking metrics to investigate whether auditory feedback affects gaze behaviour in an environment where the task is to pick up a ball and throw it into a target area. We looked both at the effect of sound in general and at the effects of different sound models. As in many other studies we have been involved in, conditions with different modality combinations were compared against each other. I will write more about the results when the paper has been presented and there is a link to the published proceedings. Search for the title given above if you want to find the specific session and listen to Emma’s presentation at the conference!

Here is the abstract, summarizing the main points:

This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal in different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; difference between subjects was greater than difference between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some of the participants’ gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.
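
The abstract above mentions sonification of gaze trajectories. As a rough illustration of what such a mapping can look like (this is a minimal sketch, not the mapping used in the paper), here is a small Python example that assumes gaze samples normalized to [0, 1] and maps horizontal position to pitch and vertical position to loudness:

```python
# A minimal, illustrative sketch of gaze-trajectory sonification.
# Assumption (not from the paper): gaze data arrives as (x, y) samples
# normalized to [0, 1]; x controls pitch, y controls amplitude.
import math
import struct
import wave

SAMPLE_RATE = 44100

def sonify_gaze(gaze_points, seconds_per_point=0.1, filename="gaze.wav"):
    """Render each gaze sample as a short sine tone and write a WAV file."""
    frames = []
    for x, y in gaze_points:
        freq = 220.0 + x * 660.0     # x in [0, 1] -> 220-880 Hz
        amp = 0.2 + 0.6 * (1.0 - y)  # higher on screen -> louder
        n = int(SAMPLE_RATE * seconds_per_point)
        for i in range(n):
            sample = amp * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames.append(struct.pack("<h", int(sample * 32767)))
    with wave.open(filename, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(b"".join(frames))

# Example: a sweep from the lower left to the upper right of the screen
sonify_gaze([(i / 20.0, 1.0 - i / 20.0) for i in range(21)])
```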

As was the case with the FIE paper mentioned earlier, the SMC paper presents just a small part of a large study, so there is definitely a lot more to tell about the study and the different parameters measured. I will return to the overall study as soon as more papers are out!  🙂


communication · Group work · Haptics

More about my work with haptic communicative functions in collaborative virtual environments

[Image: img_1120]

I’m currently in the process of writing a job application for an associate professorship, and to make sure I don’t miss anything I recently browsed through my old conference folders to find the articles to append. When I came to EuroHaptics 2010 I was reminded of something I had completely forgotten – I was actually one of the organizers behind a workshop at that conference! I spent quite a lot of time preparing for the workshop, which focused on haptic communicative functions, but the weekend before the conference I got very sick and was forced to cancel my participation. I will take this opportunity to briefly introduce the workshop and discuss some of my work leading up to it. My earlier blog posts on haptic feedback as an interaction modality were the following:

It was a shame that I could not attend the workshop/conference in Amsterdam, since the workshop was based on my work on collaboration in multimodal environments up to that point. It would have been the perfect opportunity to discuss the work performed and get input on future work in the area. We described the focus of the workshop in the following way:

In this workshop, concrete examples will be presented and discussed in terms of how the touch modality can support communication and collaboration. Also, the technical challenges of distributed haptic feedback will be addressed. The target audience of the workshop is researchers and practitioners focusing on haptic feedback supporting people in settings where more than one user is involved. We invite other researchers and practitioners to share their research and experience from their different projects focusing specifically on the collaborative perspective. It might be that the collaborative aspects in your project have not yet been addressed. In that case, interesting collaborative aspects can be identified during the discussions in this workshop.

Quite a lot of work was performed by me and my “multimodal colleagues” 🙂 prior to the workshop. First of all, there was my master’s thesis work back in 2006, which focused on collaboration between visually impaired and sighted pupils in elementary school. Evaluations were performed in schools, where visually impaired and sighted pupils collaborated in dynamic collaborative environments where objects could be moved. During that work, and especially during a re-analysis performed during my first year as a Ph.D. student (2008), I realized that communicative functions based on haptic feedback had real potential, both for supporting collaborative work and for supporting inclusion of visually impaired pupils in group work with sighted peers. It became especially clear that haptic functions for guiding (holding on to the same object or holding on to the peer’s proxy) can replace verbal guidance to a large extent.

Imagine a situation where you need to guide a visually impaired pupil to a particular place in a virtual environment. If you only have visual feedback to rely on when establishing a common frame of reference, you need to give a constant stream of verbal directions: “go down, more down, …no, too much, go back…, now to the right…no, not down, up again… here it is!”. If you have haptic feedback available you can simply grab the other person’s proxy, move the visually impaired peer to the right place and just say “here”. Needless to say, haptic feedback affects the dialogue between collaborators in this case. If you want to learn more about this explorative study you can read the journal article we finalized a few years after the study.
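
To make the guiding idea concrete, here is a minimal sketch of what such a function could look like inside a haptic rendering loop. This is not the implementation from our studies; it simply assumes 3D proxy positions and a device API that accepts a force vector, and renders a clamped spring force pulling the guided user’s proxy toward the guide:

```python
# A minimal sketch of a haptic guiding function: while the guide holds
# the peer's proxy, the peer's device is pulled toward the guide's proxy
# by a spring force. Names, constants and the rendering-loop API are
# hypothetical; real devices (e.g. a Phantom) expose similar calls
# through their own SDKs.
import numpy as np

STIFFNESS = 150.0   # N/m, spring constant (illustrative value)
MAX_FORCE = 3.0     # N, clamped to stay within device limits

def guiding_force(guide_pos, peer_pos):
    """Spring force pulling the guided peer's proxy toward the guide."""
    displacement = np.asarray(guide_pos) - np.asarray(peer_pos)
    force = STIFFNESS * displacement
    magnitude = np.linalg.norm(force)
    if magnitude > MAX_FORCE:
        force = force / magnitude * MAX_FORCE  # clamp for safety
    return force

# One step of a (hypothetical) 1 kHz haptic rendering loop:
force = guiding_force(guide_pos=[0.05, 0.02, 0.0],
                      peer_pos=[0.00, 0.00, 0.0])
print(force)  # force vector (in N) to send to the guided user's device
```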

One problem that was evident from the evaluations with the visually impaired and sighted pupils was that the visually impaired pupil was not aware of what the sighted pupil did when haptic guiding was not utilized. This is why we performed a follow-up study where we added sound cues to the above-mentioned dynamic interface to provide feedback on actions taken in the interface (e.g. grasping and putting down objects). We compared the new visual/haptic/audio version to the original visual/haptic one. We were able to show that the dialogue between the collaborators differed depending on which program version they worked in, and it was also clear that the work was more effective (measured as time to complete the task) in the visual/haptic/audio version. Once again, we could clearly see how access to haptic feedback influenced the communication. You can read more about this study in this article.

These two studies resulted in a theoretical conference paper, presented at HAID (Haptic and Audio Interaction Design) 2009, in which we tried to develop a conceptual model of how haptic communicative functions affect the dialogue between collaborators. This was my first attempt at a meta-analysis of the insights gained within this research area. The paper summarizes and discusses all the effects on the dialogue I had seen in my studies up to that point. The paper made quite an impression – it is still the most cited of all the papers I have produced to date! At that point we were still quite unique when it came to haptic collaborative environments, and I still think I’m one of very few researchers who study the effect of haptic and audio feedback on the dialogue between collaborators.

The HAID conference paper laid the groundwork for the workshop described at the beginning of this post, and during the workshop the idea of studying collaboration between two sighted persons was introduced and discussed. Next time I write about my earlier work I will introduce my latest study on collaborative multimodal interfaces, which showed that haptic and audio feedback indeed have effects on the dialogue between sighted persons as well!


deafblindness · Grant application · Haptics · sonification

Recently submitted my first ever research grant application!

One thing I have not mentioned on this blog before is that I’m one of the researchers behind the newly started network “Nordic Network on ICT and Disability”. This network gathers researchers from universities in Sweden and Denmark, focusing specifically on technology support for people with deafblindness. The reason why I’m a part of the network is primarily that I have developed multimodal interfaces (based on haptic and audio feedback) for collaboration between sighted and visually impaired pupils in primary school (you can read this article and this conference paper for a summary of that work).

I have been thinking about writing a research grant proposal with a group of researchers from the above-mentioned network ever since I joined it. And this year it finally happened! 😀  During a grant club in the middle of March, where several researchers from my division at Uppsala University gathered to write research grant proposals for a day, I ended up with a draft which felt close to complete (read more about the very well organized grant club here). The draft was used as the basis for discussion in a Skype meeting with some other members of the network, after which we finally ended up with a research plan everyone felt comfortable with. It was submitted to the Swedish Research Council. I wrote most of the text, but it would never have worked without all the valuable input I got from my colleagues (most of them also co-applicants), both in the form of comments and additions of text.

The proposed research focuses mainly on haptic feedback and how it can be used to support pupils with deafblindness in collaboration with sighted pupils – thus the focus is quite close to the research on visually impaired pupils which I, and several of the other co-applicants, worked on before.

The co-applicants are:

Apart from the above-mentioned co-applicants, Charlotte Magnusson is also a part of the proposed project, as a resource person from CERTEC.

I really believe in this team, since we complement each other very well and we belong to universities in Sweden that are at the forefront of research on assistive technologies and collaborative haptics. We of course hope the project will be funded, but should it not be, I really hope this team gathers again in search of other possible grants!

communication · Haptics

Haptic communicative functions

[Image: thesis]

This is the fifth post in my blog series about haptics as an interaction modality, and this time I start focusing on my main area – collaboration in multimodal interfaces. In earlier posts I have written about:

Ever since I started working on my master’s thesis project back in 2005, I have been focusing on collaboration in multimodal interfaces, and specifically on how one can design haptic functions for collaboration and communication between two users working in the same environment. Collaborative haptic interfaces have been around for quite some time – one of the first examples is an arm-wrestling system using specialized hardware that enabled two users to arm wrestle over a distance. Other commonly known early examples enabling a kind of mediated social touch are HandJive and InTouch. One thing these early systems have in common is that they use specialized hardware which is quite limited in scope. More examples of early systems, and in-depth discussions about mediated social touch, can be found in this excellent review.

During the past two decades, more widely applicable haptic functions for collaboration in virtual environments have been developed. Such functions can, for example, enable two users to feel each other’s forces on jointly held objects, or make it possible to “shake hands” by means of magnetic forces between the two users’ proxies in the virtual environment. One of the earliest examples of an environment supporting these kinds of collaborative haptic functions, or guiding functions as I usually call them, is the collaborative text editor developed by Oakley et al. Apart from the obvious functions needed to edit a document, each user could also use a Phantom device to find the positions of and/or communicate with the co-authors. One example of a haptic function was a grabbing function (similar to the shake-hands function mentioned above), making it possible to grab another user’s proxy and move it to another part of the document. Other examples were a “locate” function, dragging one’s own proxy to another user’s proxy with a constant force, and a “come here” function, dragging another user’s proxy to one’s own position.
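
As a rough sketch of how such a one-way function can be realized (this is not Oakley et al.’s actual implementation, and all names and values are illustrative), the “locate” behaviour essentially boils down to a constant-magnitude force along the direction to the other user’s proxy:

```python
# A sketch of a one-way "locate"-style function: a constant-magnitude
# force drags one's own proxy toward a co-author's proxy. Hypothetical
# names and constants, for illustration only.
import numpy as np

LOCATE_FORCE = 1.0  # N, constant pull regardless of distance

def locate_force(own_pos, other_pos, deadband=0.005):
    """Constant force toward the other user's proxy; zero when close."""
    direction = np.asarray(other_pos) - np.asarray(own_pos)
    distance = np.linalg.norm(direction)
    if distance < deadband:  # stop pulling once we have arrived
        return np.zeros(3)
    return LOCATE_FORCE * direction / distance
```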

Later examples of virtual environments enabling these kinds of collaborative haptic functions are a collaborative drawing application developed at CERTEC, Lund University, and applications for joint handling of objects evaluated by my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander during her years as a Ph.D. student (thesis link). The latter examples are the most relevant for me, since much of my work as a Ph.D. student at KTH focused on collaborative interfaces in which two users can work together to move and place virtual objects. I will get back to my own applications later on, in a blog post about haptic interfaces supporting collaboration between visually impaired and sighted persons.

Above, I have provided just a few examples of collaborative haptic functions which can be used to control the forces provided by one or more haptic devices. The functions I find most interesting to explore are the ones that enable physical interaction between two haptic devices, in that both users can feel each other’s forces on jointly held objects or when holding on to each other’s proxies. These kinds of functions enable interesting means of communicating physically in virtual environments, especially in cases where the users are not able to talk to each other face-to-face or point at the screen. Imagine, for example, a scenario in which two users are exploring different parts of a complex multimodal interface showing distributed data clusters (what those clusters represent is not important here). In such an interface it would be very cumbersome to try to describe to the other person where a certain interesting cluster is located. With a haptic function, the user who wants to show something they found can instead grab the other user’s proxy and drag them to the relevant cluster. This possibility would probably simplify communication about the explored dataset, since explaining where to find details in a complex interface can be extremely cumbersome. This is, of course, just a made-up example, but it can be applied to many scenarios, especially in cases where important parts of an interface are visually occluded. I will get back to joint handling of objects, and discuss the potential of using haptic feedback when exploring huge datasets, in upcoming posts in this series.

Lastly, it is interesting to contrast functions enabling physical interaction between two haptic devices with functions only enabling one-way communication (like the “locate” function mentioned above). A one-way function enables some kind of communication, in that one person’s proxy is “dragged” to another one’s, but the haptic force is only applied to one of the users – there is, for example, no way for the user doing the dragging to tell whether the one being dragged actually wants to be dragged. When using a two-way haptic communicative function, both users can feel forces from each other, enabling a richer communication. Apart from enabling joint handling of objects, where the coordination of movement is made possible by both users feeling each other’s forces on the jointly held object, two-way haptic communicative functions make it possible to, for example, clearly communicate to the other user that you do not want to be dragged somewhere. The potential these functions have in situations where visually impaired and sighted users collaborate in virtual environments will be the topic of my next post in this series!
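
A two-way function can be sketched in the same spirit as the one-way example above: both devices render the same spring between the two proxies, with equal and opposite forces, so each user feels the other. Again, names and constants are just illustrative:

```python
# A sketch of a two-way communicative function: one spring between the
# two proxies, rendered on both devices with equal and opposite forces.
# This is what distinguishes it from the one-way "locate" function,
# where only one device is actuated. Illustrative values only.
import numpy as np

STIFFNESS = 100.0  # N/m, illustrative

def two_way_forces(pos_a, pos_b):
    """Equal and opposite spring forces for users A and B."""
    spring = STIFFNESS * (np.asarray(pos_b) - np.asarray(pos_a))
    return spring, -spring  # force on A's device, force on B's device

force_a, force_b = two_way_forces([0.0, 0.0, 0.0], [0.03, 0.0, 0.0])
# If B pulls back, both users immediately feel the increased tension,
# a physical way for B to signal "I do not want to be dragged".
```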


games · Haptics

Haptic feedback in games

[Image: Novint Falcon]

Now it’s time for the fourth post in my blog series about haptics as an interaction modality. In this post I will write about games – an area where I think haptic feedback could be used to a much greater extent than it is today. The earlier posts in this blog series were:

Haptic feedback has been used in games for quite some time. I think everyone has some kind of relationship to the joysticks used in e.g. flight or car simulators. Most joysticks do not only enable steering; they also generate haptic feedback, often in the form of vibrations or resistance to motion. If we take an ordinary flight simulator joystick as an example, the player can experience heavy vibrations when the plane is stalling, as a warning that the lift is beginning to decrease.

In recent years, new input devices have been developed with the potential to really change the way we experience different kinds of games. I have already introduced the Phantom Omni in earlier posts – a device that makes it possible not only to feel texture, stiffness, friction, etc., but also to lift and move virtual objects around. This clearly opens up new possibilities for game development, especially since the Novint Falcon (pictured above) started to spread. As far as I can tell, however, haptic feedback in the vast majority of games where it is utilized is still limited to vibrations and resisting forces, despite the fact that modern devices greatly widen the possibilities. Below, I add a few thoughts about what can be done to utilize the unique aspects of haptic feedback in games. There are, of course, many more things you can do apart from the ones discussed here.

Imagine, for example, a haptic game where the player not only has to worry about navigating to the right place and/or interacting with different objects, but also needs to watch out for deadly magnetic wells “physically” pulling the game avatar towards them. That would certainly add a unique dimension to a game, as would magnetic “guides” pulling the user in a certain direction to signal that e.g. an object is approaching. Every year, students in the haptics course at KTH create simple games based on magnetic objects which should be avoided. Here is an example video from a simple game where the user needs to navigate through a minefield to find a treasure! It is easy to add more levels and objects, so the game is quite scalable and the idea can be applied to many different scenarios. Another game, from another course round, used a similar idea – that you should avoid being dragged into objects – but in that case the objects had different widths and moved from right to left. The goal was to stay clear of the objects for as long as possible.
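
As a sketch of how such a magnetic well could be rendered (one simple choice among many; the function name and constants are made up), the pull can grow the deeper the avatar gets inside the well’s radius of influence:

```python
# A minimal sketch of the "deadly magnetic well" idea, assuming a
# spring-like attraction inside a radius of influence. Illustrative
# constants, not from any specific game.
import numpy as np

def magnetic_well_force(avatar_pos, well_pos, radius=0.08, strength=200.0):
    """Pull the avatar toward the well; zero outside the radius of
    influence, growing linearly from the rim toward the center."""
    offset = np.asarray(well_pos) - np.asarray(avatar_pos)
    distance = np.linalg.norm(offset)
    if distance >= radius or distance == 0.0:
        return np.zeros(3)
    # Grows from 0 at the rim to strength * radius at the center
    return strength * (radius - distance) * offset / distance
```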

There are many games out there today which are based on the exploration of large and varied environments; Zelda and the Final Fantasy series are among the best-known examples. In those kinds of games haptic feedback could also add an interesting dimension when it comes to categorizing objects and/or exploring occluded areas hidden behind or within buildings, trees or cliffs. You would of course still need ordinary input controllers, but a haptic device could be used as a complement. Imagine that you walk around in a large virtual environment and come to a well which you cannot climb down into. You could then switch to a haptic mode and send down a probe to feel what is at the bottom. If something is down there, you could also pick it up. You could even take this further and place small puzzles in hidden places (like in the well example), where you need to feel differences in e.g. friction, surface texture and/or weight of different objects. If you place the objects in the correct order you could unlock some secret.
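
The puzzle idea can be sketched in a few lines of code. Everything here is hypothetical – the object names, their properties and the unlock rule – but it shows how haptic material properties could double as puzzle input:

```python
# A sketch of the "haptic puzzle" idea: each hidden object has haptic
# material properties, and a secret unlocks when the objects are placed
# in order of increasing weight. All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class HapticObject:
    name: str
    weight: float     # kg, rendered as a downward force when lifted
    friction: float   # 0..1, rendered as resistance when sliding
    stiffness: float  # N/m, how hard the surface feels

def puzzle_solved(placed_objects):
    """True if the player placed the objects in order of increasing weight."""
    weights = [obj.weight for obj in placed_objects]
    return weights == sorted(weights)

objects = [HapticObject("stone", 0.8, 0.9, 900.0),
           HapticObject("coin", 0.05, 0.3, 1200.0),
           HapticObject("bone", 0.3, 0.6, 700.0)]

print(puzzle_solved(sorted(objects, key=lambda o: o.weight)))  # True
```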

Haptic feedback could also be used a lot more in puzzle and maze games – there are quite a few of them out there today. If you add a haptic dimension to a puzzle game you can e.g. use the weight and texture of different pieces as additional input. A haptic-only puzzle would be very interesting to try out! You can also play around with haptic mazes and use friction, texture and maybe even magnetic forces to provide additional information about where you are, provided that you cannot see your own location. Quite a few projects in the haptics course have been based on haptic mazes.
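
A haptic maze is also straightforward to sketch. A common simplification is to render walls as stiff penalty forces that push the proxy back in proportion to how far it has penetrated a wall; the one-wall example below assumes that simplification and is not taken from any specific course project:

```python
# A sketch of one wall in a haptic maze, rendered as a stiff penalty
# force. Illustrative constant; real maze code would check many walls.
import numpy as np

WALL_STIFFNESS = 800.0  # N/m

def wall_force(proxy_pos, wall_x):
    """1-D example: a wall at x = wall_x blocking movement to the right."""
    penetration = proxy_pos[0] - wall_x
    if penetration <= 0.0:
        return np.zeros(3)  # proxy is on the free side of the wall
    return np.array([-WALL_STIFFNESS * penetration, 0.0, 0.0])
```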

Above, I have sketched a few ideas on how one can utilize some unique aspects of haptic feedback in games. Since we already have the technology, I think it is important that we try to move a step beyond games where haptic feedback is limited to vibration, resistance and indications of being shot at, and instead look at more creative ways to use haptic feedback. There are some creative solutions out there today, but I think many games could still benefit from using e.g. the ideas discussed above!


DOME · eHealth · Haptics

Looking back at 2016

[Image: farjan]

I will soon continue my blog series about haptics as an interaction modality, but first I will take this opportunity to look back at 2016 before the new year begins. A lot of things have happened this year, and quite a few of them have been unexpected. Before I continue I just want to point out that the image above, which is not in any way connected to my teaching or research, was taken by me in spring 2016. It’s there simply because we are entering a new year and the image represents a kind of natural firework.

The year started with some teaching and research at KTH within the scope of my newly started consulting company, Jonas Moll Consulting. The teaching mostly concerned work within a course about haptics. I assisted in developing parts of that course back in 2011 and have been a part of it ever since. I usually grade individual literature reports, supervise labs, assess the technical level of projects and hold a few lectures. I have really enjoyed working with this course and I hope I will be able to at least hold a few guest lectures in it next year as well. The research performed during spring mostly concerned thorough analysis of results and writing manuscripts to be submitted to journals. I will write more about the studies when the manuscripts have gone through the review process and been transformed into published articles. A small selection of preliminary results can, however, be found here (about interaction between vision, touch and hearing in multimodal virtual environments) and here (about using Twitter in a university course).

In May I got an email from a colleague about a new postdoc position focusing on eHealth at Uppsala University. It sounded very interesting, and since I had worked on a medical project before (a collaboration between KTH and Karolinska Institutet) I of course applied. A few weeks later I was called to an interview, and quite soon afterwards it was confirmed that I had got the postdoc job!  🙂

I started my postdoc work on 1 September (at 70%, since I had already accepted a consulting assignment in a course on communication before I knew about the postdoc position), and a lot has happened since then.

One of the first things I did was to start up two studies – one interview and survey study with physicians and nurses, and one national survey study (the data had already been gathered, but analysis and reporting remained) – and invite researchers to join in. It turned out that quite a few wanted to be a part of the studies, so now I’m leading two studies with 10 researchers in each. To complicate things further, these researchers are distributed among no fewer than six universities (one in Germany)! Almost all material is prepared for the interview/survey study, so we should be able to start quite early in spring, after the ethical review. An initial visual inspection of the data from the national patient survey has already brought forward some interesting results, and the next step is to bring everything into SPSS for detailed analysis. I will write more about the results and progress of these studies later on.

Before I started my work at Uppsala University I had not been to that many conferences – my focus had been on writing journal articles. That situation changed quite abruptly this autumn: I attended three conferences in a period of two months! Two of them were held in Gothenburg and one in Oslo. The first was the SweCog 2016 conference in Gothenburg, about which I wrote three blog posts (summary, Day 1, Day 2); I did not present anything there. The second was the NordiCHI conference in Gothenburg. This time I took part in a workshop, and I did so from a patient’s perspective! This was the first time I could use, in my research, the fact that I have a chronic rheumatic disease. The research contribution can be found here for those who are interested, and I also wrote a blog post about it. Here you can also find my overall summary of the conference. The last conference I attended was the EHiN (EHealth in Norway) conference in Oslo. This time I was one of the organizers behind a workshop, in which I again participated as a patient! The role-play activity I took part in is summarized here, and you can also find a summary of the whole conference here.

Other new things that happened during the autumn were that I got the chance to be an invited speaker and that I was interviewed for a podcast. I will write another blog post about these related events when the podcast has been made available online. In December I also joined a group that will work on a new EU proposal. Exciting, to say the least! During the autumn I also, for the first time, took part in reviewing applications and interviewing candidates for a new Ph.D. position.

So, quite a lot has happened since I woke up on 1 January 2016, to say the least! I’m quite sure 2017 will also be an exciting year. I know that I will be doing some teaching in courses at Uppsala University, that we will conduct large studies which should result in numerous articles, and that I will write applications and organize some workshops; but looking back at this autumn, I’m sure quite a few surprises will pop up as well!  🙂


Happy New Year everyone!


Group work · Haptics

Haptic interaction research: one great memory!

Since Christmas is coming up, I will in this blog post take a break from my series on haptic interaction and share a specific memory from my research (rather than discussing an application area in general). The memory I’m presenting here comes from evaluations performed in schools several years ago.

The evaluations were performed within the scope of an EU project (MICOLE), which aimed at developing tools for supporting collaboration between sighted and visually impaired pupils in elementary school classes. One major problem today is that sighted and visually impaired pupils use different work materials which are not easily shared, sometimes causing them to do “group work” in parallel rather than together. By developing environments based on haptic feedback, accessible to both sighted and visually impaired pupils, the project tried to address this important problem.

One of my contributions to the project was a 2D (or rather 2.5D) interface presenting a virtual whiteboard on which geometrical shapes and angles were drawn (felt as raised lines). Two haptic devices were connected, so that one sighted and one visually impaired pupil could be in the virtual environment at the same time. The tasks during the evaluations in schools were to go through the shapes and the angles together and categorize them. These tasks were chosen because we knew that the pupils in the schools where we performed the evaluations had just started to learn geometry.
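
The categorization logic behind the angle tasks is simple enough to sketch. The actual application of course did much more – rendering the raised lines and handling two haptic devices – so this snippet is only an illustration of the classification the pupils performed:

```python
# Classifying angles the way the pupils did during the evaluations.
def classify_angle(degrees):
    """Categorize an angle (0 < degrees < 180) as acute, right or obtuse."""
    if degrees < 90:
        return "acute"
    if degrees == 90:
        return "right"
    return "obtuse"

print([classify_angle(a) for a in (45, 90, 135)])
# ['acute', 'right', 'obtuse']
```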

When we came to one of the schools we soon realized that the pupils had just started to learn geometrical shapes but had never talked about angles – they could not discriminate between right, acute and obtuse angles. Therefore we needed to explain how to tell the different types of angles apart before the evaluation started. When a task presenting 10 different angles (some right, some obtuse and some acute), forming an irregular star-shaped figure, had been loaded, the visually impaired pupil took a few minutes to explore the interface in order to locate the angles in the figure. After this, the pupil moved directly to one of the angles and categorized it, correctly, as acute – a categorization the sighted pupils also agreed with. They then browsed through the angles one by one, always focusing on the angle the visually impaired pupil was currently “touching”. They always came up with the right answers!

I really enjoyed watching this group working in the interface. It was so obvious that the application could be used as a shared interface for discussing a concept which was new to everyone. It was equally obvious that the visually impaired pupil, after just a few minutes of exploration, knew exactly how to navigate to the different angles in the star-shaped figure (without having to follow all edges!). This enabled the visually impaired pupil to take and keep the initiative during the discussions! Thus, the interface clearly, at least in this particular case, enabled inclusion of the visually impaired pupil in the group work. Watching this group work in the interface, and especially seeing the visually impaired pupil leading the discussion, is definitely one of the greatest memories I have from my research on haptic interaction.

I will get back to this study, and related studies, in a later blog post related to communication in haptic interfaces. Those who are curious about the study I briefly introduced above can also read this article.


Merry Christmas to everyone!