Haptics · Human-Computer Interaction · Multimodality · sonification · Thesis defense

Recently attended Emma Frid’s thesis defense at KTH!


On Friday, January 10, I attended Emma Frid’s thesis defense at KTH. Emma and I collaborated on a research project a few years ago, and one of the major outcomes was this open access article presenting the results of an experiment with a multimodal interface including both haptic feedback and two different sonification models. Emma’s thesis work relates closely to the research field of Sound and Music Computing (also the name of a sub-group at the Department of Media Technology and Interaction Design at KTH, where I worked for more than a decade) and focuses specifically on (accessible) digital musical instruments and interfaces. The main research question is “How can music interfaces be designed for inclusion?”. The thesis, “Diverse Sounds – Enabling Inclusive Sonic Interaction”, can be found here.

The main supervisor was professor Roberto Bresin and the co-supervisor was professor Eva-Lotta Sallnäs Pysander, both at the Department of Media Technology and Interaction Design at KTH. The opponent was Reader Andrew McPherson from the School of Electronic Engineering and Computer Science at Queen Mary University of London. The examination committee consisted of senior researcher Elaine Chew from the French National Centre for Scientific Research and the Music Representations Team at the Institute for Research and Coordination in Acoustics/Music, professor Rolf Inge Godøy from the Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo, associate professor Dan Overholt from the Department of Architecture, Design and Media Technology at Aalborg University, and professor Henrik Frisk from the Department of Composition, Conducting and Music Theory at the Royal College of Music in Stockholm.

After the introduction by associate professor Madeline Balaam, who chaired the event, the opponent gave a presentation about Emma’s thesis for about 45 minutes. It was a very good presentation, and it was interesting to listen to his interpretation of the work. He concluded his presentation by discussing the nine properties that, according to Emma, should be considered when designing accessible digital musical instruments: expressiveness, playability, longevity, customizability, pleasure, sonic quality, robustness, multimodality and causality (see the thesis for a thorough coverage of these properties and the work that gave rise to them). After a short break the opponent, and later on the members of the committee, asked questions that formed a good foundation for interesting discussions about the thesis work.

I think Emma did a really good job answering the questions and discussing her work. She elaborated a lot on the themes that were brought up for discussion, and it was very clear that she knows this research field very well. She also stayed calm during the entire process and even helped out when, for example, the opponent and committee members needed headsets or microphones. One thing that was special about this defense is that the opponent, as well as all members of the grading committee, began their round of questions by congratulating Emma on the excellent job she had done! I have not seen that at other defenses I have attended. The defense was rounded off by a very long round of applause – it was almost as if the audience expected some kind of extra performance on stage.  🙂

communication · design · Group work · Haptics · Human-Computer Interaction · Multimodality · sonification

Overview of my research within multimodal interaction


In my last blog post I presented an overview of my research within the eHealth domain. In this blog post I will do the same thing, but for my other main research field – multimodal interaction in virtual environments.

 

What have I done related to multimodal interaction?

Even though I have spent the last couple of years focusing mainly on eHealth, I have done a lot of research related to multimodal interaction, especially as a Ph.D. student at the Royal Institute of Technology. Most of this research has focused on multimodal learning environments for collaborative task solving between sighted and visually impaired persons. Haptic feedback has played a major part in the collaborative virtual environments that I have designed and evaluated, both in lab settings and in the field, e.g. in primary schools. Quite a while ago, I wrote a blog series on haptic feedback focusing on the work I performed within the scope of my doctoral studies. Here are the links to those posts:

During my time as a postdoc at Uppsala University, I also carried out some activities related to multimodal interaction. I devoted most of this time to research grant applications, but I also wrote a few conference papers. You can read a short summary of these activities here.

In total, my research on multimodal interaction has, up until today, resulted in the following five journal publications (some links lead to open access publications or pre-prints):

and the following 11 conference papers (some links lead to open access publications or pre-prints):

 

My ongoing research within multimodal interaction

Currently, there is not much going on in this research field (at least not in my own research). The only ongoing activity I’m engaged in is an extensive literature review on communication in collaborative virtual environments. It will lead to a theoretical research article in which I discuss different technical solutions for haptic communication in the light of the research I have performed in the area up until today. I’m collaborating with my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander on this activity. I hope that this work will help me in my continued research on collaboration between visually impaired and sighted pupils based on different types of tasks and learning material.

Upcoming research on multimodal interaction

As I wrote in a recent blog post, multimodal interaction with a focus on haptic feedback seems to be a new research area at the Centre for Empirical Research on Information Systems (CERIS), where I just started my assistant professorship. Thus, this is a research area in which I can contribute something new to the department. An area that is already represented at the department, however, is “Information Technology and Learning”, which seems to be a perfect fit in this case!

Last year, I also submitted a research grant application focusing on continued work with collaborative multimodal learning environments. Unfortunately, that one was rejected, but no one is giving up. I will work on revising the application during the autumn and submit it as soon as a suitable call pops up. Maybe I will also have additional co-applicants from the CERIS department by then.

conference · Haptics · Multimodality · sonification

Got a new paper published, on the effects of auditory and haptic feedback on gaze behaviour!


About a month ago I wrote a blog post about a conference paper with the title “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback” that had just been accepted for the Sound and Music Computing 2017 conference. Now, that paper has been formally published! You can find our paper here and the full conference proceedings here. The study leader, Emma Frid, presented the paper last Thursday afternoon (6 July 2017) in Espoo, Finland. The other authors are Roberto Bresin, Eva-Lotta Sallnäs Pysander and I.

As I wrote in the earlier blog post, this particular paper is based on a small part of an extensive experiment. The experiment, in which 20 participants took part, was based on a simple task: picking up a ball and throwing it into a goal area on the opposite side of a virtual room. The task was completed after 15 hits. The same task was solved in several different conditions, some of which included haptic rendering and some of which included movement sonification of the throwing gesture (two different sound models were compared). During all interaction with the interface, different parameters, including gaze data collected through an eye tracker, were continuously logged. In the part of the experiment on which the published paper is based, we wanted to find out if the participants’ visual focus in the interface changed depending on the experiment condition (e.g. if participants looked more at the goal when haptic and/or auditory feedback was presented). Due to poor quality of the sampled gaze data for some of the participants (less than 80% of the gaze points had been registered), only gaze data from 13 participants could be used in the analysis.
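As a side note, for anyone who wants to apply a similar exclusion criterion to their own gaze logs, the sketch below shows one way it could look in Python with pandas. The file name and column names are hypothetical (they are not taken from our study); the snippet simply keeps participants for whom at least 80% of the gaze points were registered.

```python
import pandas as pd

# Hypothetical gaze log: one row per gaze sample, with a participant id and a
# boolean flag telling whether the eye tracker registered a valid gaze point.
gaze = pd.read_csv("gaze_log.csv")  # assumed columns: participant, valid

# Share of registered (valid) gaze points per participant.
validity = gaze.groupby("participant")["valid"].mean()

# Keep only participants with at least 80% of the gaze points registered,
# mirroring the kind of exclusion criterion described above.
kept = validity[validity >= 0.80].index
gaze_clean = gaze[gaze["participant"].isin(kept)]

print(f"Participants kept in the analysis: {len(kept)} of {validity.size}")
```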

Much due to large inter-subject variability, we did not get any significant results this time around, but some interesting patterns arose. Results indicated, for example, that participants fixated on the screen fewer times when solving the task in the visual/audio conditions than in the visual-only condition, and fewer times in the visual/haptic/audio conditions than in the visual/haptic condition. The differences between haptic conditions were, however, small, especially for one of the sound models, which presented a swishing sonification of the throwing gesture. When considering total fixation duration (how long the participants focused on the screen), the tendency was that participants focused less on the screen when this sound model was used (the indications were stronger when haptic feedback was not provided). Even though these results were not significant, they suggest that movement sonification has an effect on gaze behaviour. When looking at gaze behaviour for each participant individually, we could also see that the participants could be divided into a few clusters showing similar behaviour. Although the large inter-subject variability made it impossible to find any general patterns, we could find indications of effects of auditory feedback within the clusters. See the article linked above for a more detailed analysis, illustrations and discussion.
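In the same spirit, here is a minimal Python/pandas sketch of how fixation count and total fixation duration per participant and condition could be computed from a fixation table. Again, the file name and column names are hypothetical and do not reflect the actual analysis pipeline used in the paper.

```python
import pandas as pd

# Hypothetical fixation table: one row per detected fixation, with the
# participant id, the feedback condition (e.g. "visual", "visual/audio",
# "visual/haptic", "visual/haptic/audio") and the fixation duration in ms.
fixations = pd.read_csv("fixations.csv")  # assumed columns: participant, condition, duration_ms

# Fixation count and total fixation duration per participant and condition.
metrics = (
    fixations
    .groupby(["participant", "condition"])["duration_ms"]
    .agg(n_fixations="count", total_duration_ms="sum")
    .reset_index()
)

# Compare conditions at the group level, keeping the large
# inter-subject variability reported above in mind.
print(metrics.groupby("condition")[["n_fixations", "total_duration_ms"]].mean())
```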

Even though we did not get any significant results, the indications that movement sonification can affect visual focus are still interesting. If it is true that you look at the screen more when you do not have access to movement sonification, this could mean that movement sonification lets you focus on other parts of an interface, maybe solving different tasks in parallel, in this kind of environment. It is definitely worth conducting similar studies with many more participants to see if the indications we got would become significant. Experiments with more users could also show whether participants focus more on the goal when having access to movement sonification and/or haptic feedback; if so, this would indicate that the information provided by haptic and audio feedback, respectively, is enough to understand that you are performing an accurate throwing gesture (you don’t need to look at the ball to confirm it). Results from interviews held at the end of the test sessions already indicate this!

This is the very first paper Eva-Lotta and I have had accepted to the Sound and Music Computing conference. Emma and Roberto, however, have had papers accepted to that conference numerous times. Check out their ResearchGate profiles for their earlier contributions to this conference and much more related to, for example, sound design.

games · Haptics · Multimodality · sonification

Paper on the effect of auditory feedback on gaze behaviour accepted to SMC 2017!


Earlier this week I wrote about a paper that was accepted to the Frontiers in Education (FIE) 2017 conference, but the fact is that yet another paper I co-authored was accepted to another conference, Sound and Music Computing (SMC) 2017, earlier in May! Emma Frid (lead author), Roberto Bresin and Eva-Lotta Sallnäs Pysander from the Department of Media Technology and Interaction Design at the Royal Institute of Technology (KTH) are the other authors of that paper. The title of the SMC paper is “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback”.

The paper is based on a small part of an extensive study performed a few years ago, focusing on the effect of haptic and audio feedback on the perception of object qualities and on visual focus. In this particular paper we use eye-tracking metrics to investigate whether auditory feedback in particular affects gaze behaviour in an environment where the task is to pick up a ball and throw it into a target area. We looked both at the effect of sound in general and at the effects of different sound models. As in many other studies we have been involved in, conditions with different modality combinations were compared against each other. I will write more about the results when the paper has been presented and there is a link to the published proceedings. Search for the title given above if you want to find the specific session and listen to Emma’s presentation at the conference!

Here is the abstract, summarizing the main points:

This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal in different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; difference between subjects was greater than difference between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some of the participants’ gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.

As was the case with the FIE paper mentioned earlier, the SMC paper presents just a small part of a large study, so there is definitely a lot more to tell about the study and the different parameters measured. I will return to the overall study as soon as more papers are out!  🙂

 

deafblindness · Grant application · Haptics · sonification

Recently submitted my first ever research grant application!

One thing I have not mentioned in this blog before is that I’m one of the researchers behind the newly started network “Nordic Network on ICT and Disability”. The network gathers researchers from universities in Sweden and Denmark, focusing specifically on technology support for people with deafblindness. The reason why I’m a part of the network is primarily that I have developed multimodal interfaces (based on haptic and audio feedback) for collaboration between sighted and visually impaired pupils in primary school (you can read this article and this conference paper for a summary of that work).

I have been thinking about writing a research grant proposal with a group of researchers from the above-mentioned network ever since I joined it. And this year it finally happened! 😀  During a grant club in the middle of March, where several researchers from my division at Uppsala University gathered to write research grant proposals for a day, I ended up with a draft that felt close enough (read more about the very well organized grant club here). The draft was used as a basis for discussion in a Skype meeting with some other members of the network, after which we finally ended up with a research plan everyone felt comfortable with. It was submitted to the Swedish Research Council. I wrote most of the text, but it would never have worked without all the valuable input I got from my colleagues (most of them also co-applicants), both in the form of comments and in the form of added text.

The proposed research focuses mainly on haptic feedback and how it can be used to support pupils with deafblindness in collaboration with sighted pupils – thus the focus is quite close to the research with visually impaired pupils that I, and several of the other co-applicants, worked on before.

The co-applicants are:

Apart from the above-mentioned co-applicants, Charlotte Magnusson ([ResearchGate]) is also a part of the proposed project, as a resource person from CERTEC.

I really believe in this team, since we complement each other very well and we belong to Swedish universities that are at the forefront of research on assistive technologies and collaborative haptics. We of course hope the project will be funded, but if it is not, I really hope this team gathers again in search of other possible grants!