conference · Haptics · Multimodality

Got two posters accepted to SweCog 2017!

In an earlier blog post I wrote about my preparations for the Swedish Cognitive Science Society (SweCog) 2017 conference. My plan was to submit at least two papers to that conference, and that is exactly what I did. One of the papers, “Using Eye-Tracking to Study the Effect of Haptic Feedback on Visual Focus During Collaborative Object Managing in a Multimodal Virtual Interface”, I wrote together with Emma Frid, and the other, “Haptic communicative functions and their effects on communication in collaborative multimodal virtual environments”, I wrote together with Eva-Lotta Sallnäs Pysander. I was first author on both, since I led the work and did most of the writing. Earlier this week I got two emails from the conference organizers confirming that both papers had been accepted as posters!

When Eva-Lotta and I submitted the papers (you could only submit one per person) we indicated that we were aiming for oral presentations, but both were “downgraded” to posters after the reviews. When it comes to the first paper, written with Emma, I can understand the decision, since we were reporting on a pilot study and quite a few papers submitted by other researchers reported on full-scale experiments and evaluations. The other paper, on haptic communicative functions, was more theoretical in nature, and in that case I think the main problem was the 500-word limit – we could not really elaborate on our main findings when most of the space had to be used to define and explain haptic communicative functions. Anyhow, I’m very happy that the papers were accepted and that we will be able to discuss our work with others during the conference.

The second confirmation email, about the paper on haptic communicative functions, actually included an interesting twist – one of the reviewers recommended that the paper be presented by means of a live demo during the poster session! That came as a (positive) surprise, and the organizers were very willing to work with us to make the live demo happen. Unfortunately, one problem is that the studies referenced in the paper (an evaluation and an experiment, respectively, during which pairs of users collaborated by means of haptic and audio communicative functions – see this and this preprint article) used virtual environments based on outdated APIs that no longer work. I’m not sure that I can reimplement the environments using the newer haptics API Chai3D in time for the conference. But no matter what, we will still have the poster and the possibility to discuss and explain our findings.

So, the only thing remaining now (apart from trying to get a demo working) is to create two informative posters. After the conference I will get back to this topic and elaborate some more on the work presented on the two posters, so expect more posts about SweCog 2017 and my contributions to it!

DOME · eHealth · Haptics · Medical applications · Medical Records Online · National patient survey · Summer school

Today I celebrate my blog’s first anniversary!

[Photo: hiking in Abisko, northern Sweden]

Exactly one year ago I wrote my very first blog post! You can read that short post here. From the beginning my intention was to write two posts a week, but for various reasons my average during this first year has been 1.4 posts per week. During the year the blog has had 2782 views by 1466 visitors. During the first couple of months the number of views was under 100, but I’m glad to see that the numbers have kept increasing – the number of views in September was 491, and 53 views have already been accumulated during the first days of October.

Since this is a special blog post, the picture I chose is not in any way related to my work. Instead, I chose one of the pictures I took while hiking in northern Sweden (Abisko) about two years ago – I just love the scenery up there!

As a kind of celebration, here is a top 5 list of my most read posts:

  1. My colleague, Thomas Lind, successfully defended his thesis today!

This is one of my latest posts, uploaded in the middle of September. Despite the short time frame it is, by far, the most read post! The post is about the defense at which my colleague Thomas Lind earned his Ph.D. degree.

  2. eHealth summer school in Dublin, day 5

Those who have followed my blog during the last few months know that I have been writing quite extensively about a summer school I attended – one week in Dublin and one week in Stockholm. I’m very happy to see that one of those posts is on this list, because it took quite a lot of time to write them. This particular post is also a kind of summary which includes links to the other posts about the week in Dublin. The summer school was a nice experience in so many ways, and I really encourage you to read those posts if you are interested in eHealth/mHealth design (the Stockholm posts, although not on the top 5 list, can be found here).

  3. A very successful session about patient accessible electronic health records at Vitalis 2017!

This post is not only on the top 5 list regarding views; it is also, by far, the most shared post on social media. The post summarizes a 1.5-hour session hosted by the DOME consortium at Vitalis last spring. I really hope we will get the opportunity to do something similar at Vitalis 2018! Read this post if you want to know about the state of the art regarding patient accessible electronic health records in Sweden.

  4. The team behind a new large patient survey on electronic health records in Sweden!

I’m also happy to see this one on the list, since it represents one of the big studies I’m currently leading. This particular study is based on a large national survey focusing on patients’ experiences with, and attitudes towards, the patient accessible electronic health record system Journalen. In this post I introduce all the researchers working on the study.

  5. Haptic feedback in medical applications

The fifth most read post belongs to the blog series on haptic feedback as an interaction modality, which I started last autumn. This particular one concerns how haptic feedback can be utilized in medical applications. In this post I also introduce my own work within this area, which I carried out as a Ph.D. student at KTH.

So, these were my five most read posts, and I’m glad to see that they relate to different areas. The only area not covered in this list is pedagogical development. This might change during the next year, however, since an extensive study on Twitter as a communication medium in higher education courses will most probably be published, and I will also write a series of posts about a basic course in human-computer interaction that I will be responsible for at Uppsala University (starting October 30).

I have really enjoyed blogging and will definitely continue to update this blog regularly, so stay tuned for more!  🙂

Cognition · conference · Haptics · Multimodality

Preparing submissions for the SweCog 2017 conference, held at Uppsala University!

This week, I’m preparing submissions for this year’s edition of the SweCog (Swedish Cognitive Science Society) conference. The conference covers a broad range of topics related to cognitive science. When I participated last year, when the conference was held at Chalmers in Gothenburg, I did not present anything (actually, none of the participants from Uppsala University did), but the situation this year is quite different since Uppsala University is hosting the event!

I really enjoyed last year’s conference, largely due to the large variety of topics covered and the very interesting keynote lectures. It was also (and still is, I assume) a single-track conference, meaning that you do not have to choose which paper session to attend. As I remember it, there were ten paper presentations in total, three keynote lectures and one poster session during the two-day conference. You can read more about my experiences from SweCog 2016 in this blog post, summing up that event. I also wrote summaries from day 1 and day 2.

Since the only thing required is an extended abstract of 1-3 pages (at most 500 words), I’m working on several submissions. A topic that was not covered during last year’s conference was collaboration in multimodal environments, and specifically how different combinations of modalities can affect communication between two users solving a task together. Since that is one of my main research interests, I now see my chance to contribute! The deadline for extended abstract submissions to SweCog 2017 is September 4, so there is still a lot of time to write. The conference will be held October 26-27 at Uppsala University. Since registration for the conference is free for SweCog members (membership is also free), I expect to see many of my old KTH colleagues at Uppsala University during the conference days! 😉  You can find more information about the conference here.

Before I started planning my contributions to SweCog 2017, I invited some of my “multimodal colleagues” from KTH to join the writing process. As a result, Emma Frid and I will collaborate on an extended abstract about a follow-up study to the study I present here. Our contribution will focus on how multimodal feedback can affect visual focus when two users are solving a task together in a collaborative virtual environment. Since I have not yet heard from any other colleague, I plan to write another extended abstract on my own, about how multimodal feedback (or rather, combinations of visual, haptic and auditory feedback) can affect the means by which users talk to each other while working in collaborative virtual environments. Maybe I will also throw in a third one, about the potential of haptic guiding functions (see this blog post for an explanation of the concept) in situations where sighted and visually impaired users collaborate.

conference · Haptics · Multimodality · sonification

Got a new paper published on the effects of auditory and haptic feedback on gaze behaviour!

About a month ago I wrote a blog post about a conference paper with the title “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback” that had just been accepted for the Sound and Music Computing 2017 conference. Now, that paper has been formally published! You can find our paper here and the full conference proceedings here. The study leader, Emma Frid, presented the paper last Thursday afternoon (July 6, 2017) in Espoo, Finland. The other authors are Roberto Bresin, Eva-Lotta Sallnäs Pysander and I.

As I wrote in the earlier blog post, this particular paper is based on a small part of an extensive experiment. The experiment, in which 20 participants took part, was based on a simple task – picking up a ball and throwing it into a goal area at the opposite side of a virtual room. The task was solved after 15 hits. The same task was solved in several different conditions, of which some included haptic rendering and some included movement sonification of the throwing gesture (two different sound models were compared). During all interaction with the interface, different parameters, including gaze data collected through an eye-tracker, were continuously logged. In the part of the experiment on which the published paper is based, we wanted to find out if the participants’ visual focus in the interface changed depending on experiment condition (e.g. if participants looked more at the goal when haptic and/or auditory feedback was presented). Due to poor quality of the sampled gaze data for some of the participants (less than 80% of the gaze points had been registered), only gaze data from 13 participants could be used in the analysis.
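
As a rough illustration of that screening step, here is a minimal Python sketch. The data layout (per-participant gaze samples with a validity flag) is hypothetical – only the 80% threshold comes from the study:

```python
# Minimal sketch of the participant screening step described above.
# The data layout (per-participant gaze samples with a validity flag)
# is hypothetical; only the 80% threshold comes from the study.

VALIDITY_THRESHOLD = 0.80  # minimum share of registered gaze points

def valid_share(samples):
    """Fraction of gaze samples that the eye-tracker actually registered."""
    if not samples:
        return 0.0
    return sum(1 for s in samples if s["valid"]) / len(samples)

def screen_participants(gaze_data):
    """Keep only participants whose gaze data meets the quality threshold."""
    return {
        pid: samples
        for pid, samples in gaze_data.items()
        if valid_share(samples) >= VALIDITY_THRESHOLD
    }

# Example: "P02" is excluded, since only 50% of its gaze points were registered.
gaze_data = {
    "P01": [{"valid": True}] * 95 + [{"valid": False}] * 5,
    "P02": [{"valid": True}] * 50 + [{"valid": False}] * 50,
}
print(sorted(screen_participants(gaze_data)))  # ['P01']
```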

Largely due to large inter-subject variability, we did not get any significant results this time around, but some interesting patterns emerged. Results indicated, for example, that participants fixated fewer times on the screen when solving the task in the visual/audio conditions than in the visual-only condition, and fewer times in the visual/haptic/audio conditions than in the visual/haptic condition. The differences between haptic conditions were, however, small, especially for one of the sound models, which presented a swishing sonification of the throwing gesture. When considering total fixation duration (how long the participants focused on the screen), the tendency was that participants focused less on the screen when this sound model was used (the indications were stronger when haptic feedback was not provided). Even though these results were not significant, they indicate that movement sonification has an effect on gaze behaviour. When looking at gaze behaviour for each participant individually, we could also see that the participants fell into a few clusters within which they showed similar behaviour. Although the large inter-subject variability made it impossible to find general patterns, we could find indications of effects of auditory feedback within the clusters. See the article linked above for a more detailed analysis, illustrations and discussion.
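
To make the two metrics concrete – number of fixations on the screen and total fixation duration – here is a small sketch that aggregates hypothetical fixation records per condition (the actual analysis pipeline in the paper is more involved):

```python
# Sketch of the two gaze metrics discussed above: number of fixations on
# the screen and total fixation duration, aggregated per condition.
# The fixation records (condition label, duration in ms) are hypothetical.
from collections import defaultdict

def aggregate_fixations(fixations):
    """fixations: iterable of (condition, duration_ms) tuples."""
    counts = defaultdict(int)
    durations = defaultdict(float)
    for condition, duration_ms in fixations:
        counts[condition] += 1
        durations[condition] += duration_ms
    return counts, durations

fixations = [
    ("visual only", 310), ("visual only", 250),
    ("visual/audio", 280),
    ("visual/haptic/audio", 300), ("visual/haptic/audio", 220),
]
counts, durations = aggregate_fixations(fixations)
for condition in counts:
    print(f"{condition}: {counts[condition]} fixations, {durations[condition]:.0f} ms")
```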

Even though we did not get any significant results, the indications that movement sonification can affect visual focus are still interesting. If it is true that you look at the screen more when you do not have access to movement sonification, it could mean that you can focus on different parts of an interface, maybe solving different tasks in parallel, when movement sonification is available in this kind of environment. It is definitely worth conducting similar studies with many more participants to see whether the tendencies we observed would become significant. Experiments with more users could also show whether participants focus more on the goal when having access to movement sonification and/or haptic feedback – if so, this would indicate that the information provided by haptic and audio feedback, respectively, is enough to understand that you are performing an accurate throwing gesture (you do not need to look at the ball to confirm it). Results from interviews held at the end of the test sessions already point in this direction!

This is the very first paper Eva-Lotta and I have gotten accepted to the Sound and Music Computing conference. Emma and Roberto, however, have had papers accepted to that conference numerous times. Check out their ResearchGate profiles for their earlier contributions to the conference and much more related to e.g. sound design.

games · Haptics · Multimodality · sonification

Paper on the effect of auditory feedback on gaze behaviour accepted to SMC 2017!

Earlier this week I wrote about a paper that was accepted to the Frontiers in Education (FIE) 2017 conference, but the fact is that yet another paper I co-authored was accepted to another conference, Sound and Music Computing (SMC) 2017, earlier in May! Emma Frid (lead author), Roberto Bresin and Eva-Lotta Sallnäs Pysander from the department of Media Technology and Interaction Design at the Royal Institute of Technology (KTH) are the other authors. The title of the SMC paper is “An Exploratory Study on the Effect of Auditory Feedback on Gaze Behavior in a Virtual Throwing Task With and Without Haptic Feedback”.

The paper is based on a small part of an extensive study, performed a few years ago, focusing on the effect of haptic and audio feedback on the perception of object qualities and on visual focus. In this particular paper we use eye-tracking metrics to investigate whether auditory feedback in particular affects gaze behaviour in an environment where the task is to pick up a ball and throw it into a target area. We looked both at the effect of sound in general and at the effects of different sound models. As in many other studies we have been involved in, conditions with different modality combinations were compared against each other. I will write more about the results when the paper has been presented and there is a link to the published proceedings. Search for the title given above if you want to find the specific session and listen to Emma’s presentation at the conference!

Here is the abstract, summarizing the main points:

This paper presents findings from an exploratory study on the effect of auditory feedback on gaze behavior. A total of 20 participants took part in an experiment where the task was to throw a virtual ball into a goal in different conditions: visual only, audiovisual, visuohaptic and audiovisuohaptic. Two different sound models were compared in the audio conditions. Analysis of eye tracking metrics indicated large inter-subject variability; difference between subjects was greater than difference between feedback conditions. No significant effect of condition could be observed, but clusters of similar behaviors were identified. Some of the participants’ gaze behaviors appeared to have been affected by the presence of auditory feedback, but the effect of sound model was not consistent across subjects. We discuss individual behaviors and illustrate gaze behavior through sonification of gaze trajectories. Findings from this study raise intriguing questions that motivate future large-scale studies on the effect of auditory feedback on gaze behavior.

As was the case with the FIE paper mentioned earlier, the SMC paper presents just a small part of a large study, so there is definitely a lot more to tell about the study and the different parameters measured. I will return to the overall study as soon as more papers are out!  🙂

communication · Group work · Haptics

More about my work with haptic communicative functions in collaborative virtual environments

I’m currently in the process of writing a job application for an associate professorship, and to make sure I don’t miss anything I recently browsed through my old conference folders to find the articles to append. When I came to EuroHaptics 2010 I was reminded of something I had completely forgotten – I was actually one of the organizers behind a workshop at that conference! I spent quite a lot of time preparing for the workshop, which focused on haptic communicative functions, but the weekend before the conference I got very sick and was forced to cancel my participation. I will take this opportunity to briefly introduce the workshop and discuss some of my work prior to it. My earlier blog posts on haptic feedback as an interaction modality were the following:

It was a shame that I could not attend the workshop/conference in Amsterdam, since the workshop was based on my work on collaboration in multimodal environments up to that point. It would have been the perfect opportunity to discuss the work performed and get input regarding future work in the area. We described the focus of the workshop in the following way:

In this workshop, concrete examples will be presented and discussed in terms of how the touch modality can support communication and collaboration. Also, the technical challenges of distributed haptic feedback will be addressed. The target audience of the workshop is researchers and practitioners focusing on haptic feedback supporting people in settings where more than one user is involved. We invite other researchers and practitioners to share their research and experience from their different projects, focusing specifically on the collaborative perspective. It might be that the collaborative aspects in your project have not yet been addressed. In that case, interesting collaborative aspects can be identified during the discussions in this workshop.

Quite a lot of work was performed by me and my “multimodal colleagues” 🙂 prior to the workshop. First of all, I had performed my master’s thesis work back in 2006, which focused on collaboration between visually impaired and sighted pupils in elementary school. Evaluations were performed in schools, where visually impaired and sighted pupils collaborated in dynamic virtual environments where objects could be moved. During that work, and especially during a re-analysis performed during my first year as a Ph.D. student (2008), I realized that communicative functions based on haptic feedback had real potential, both for supporting collaborative work and for supporting inclusion of the visually impaired pupils in group work with sighted peers. It became especially clear that haptic functions for guiding (holding on to the same object or holding on to the peer’s proxy) can replace verbal guidance to a large extent.

Imagine a situation where you need to guide a visually impaired pupil to a particular place in a virtual environment. If you only have the visual feedback to rely on when establishing a common frame of reference, you need to talk a lot, like “go down, more down, …no, too much, go back…, now to the right…no, not down, up again… here it is!”. If you have haptic feedback available, you can simply grab the other person’s proxy, move the visually impaired peer to the right place and just say “here”. Needless to say, haptic feedback affects the dialogue between collaborators in this case. If you want to learn more about this explorative study, you can read the journal article we finalized a few years after the study.
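
As a rough sketch of how such a guiding function can be rendered (not the actual implementation from our environments): while guiding is active, the guided user’s proxy is pulled toward the guide’s proxy by a virtual spring-damper. The stiffness and damping values below are made up for illustration:

```python
# Minimal sketch of a spring-damper "guiding" force: while guiding is
# active, the guided user's haptic proxy is pulled toward the guide's
# proxy. Stiffness and damping values are made up for illustration; a
# real haptic loop would run at ~1 kHz on actual device positions.

STIFFNESS = 200.0  # N/m (illustrative)
DAMPING = 2.0      # Ns/m (illustrative)

def guiding_force(guided_pos, guided_vel, guide_pos):
    """3D force on the guided user's device, pulling it toward the guide."""
    return tuple(
        STIFFNESS * (g - p) - DAMPING * v
        for p, v, g in zip(guided_pos, guided_vel, guide_pos)
    )

# Example: guided proxy at the origin, guide 5 cm away along the x-axis.
print(guiding_force((0.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.05, 0.0, 0.0)))
# -> (10.0, 0.0, 0.0): a pull toward the guide along x
```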

One problem that was evident from the evaluations with the visually impaired and sighted pupils was that the visually impaired pupil was not aware of what the sighted pupil did when haptic guiding was not utilized. This is why we performed a follow-up study where we added sound cues to the above-mentioned dynamic interface, to provide feedback on actions taken in the interface (e.g. grasping and putting down objects). We compared the new visual/haptic/audio version to the original visual/haptic one. We managed to show that the dialogue between the collaborators differed depending on which program version they worked in, and it was also clear that the work was more effective (measured as time to complete the task) in the visual/haptic/audio version. Once again, we could clearly see how access to haptic feedback influenced the communication. You can read more about this study in this article.
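
Conceptually, the added sound cues amount to a simple mapping from interface events to short audio cues, along these lines (the event names and sound files here are hypothetical; the actual implementation is described in the linked article):

```python
# Conceptual sketch of event-based sound cues: interface actions are
# mapped to short audio cues so that a visually impaired user can follow
# what the partner is doing. Event names and sound files are made up.

EVENT_SOUNDS = {
    "object_grasped": "grasp.wav",
    "object_put_down": "drop.wav",
    "object_moved_to_target": "success.wav",
}

def on_interface_event(event, play=print):
    """Trigger the cue mapped to an interface event (here: just print it)."""
    sound = EVENT_SOUNDS.get(event)
    if sound is not None:
        play(f"playing {sound}")

on_interface_event("object_grasped")  # -> playing grasp.wav
```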

These two studies resulted in a theoretical conference paper presented at HAID (Haptic and Audio Interaction Design) 2009, where we tried to develop a kind of conceptual model of haptic communicative functions’ effects on the dialogue between collaborators. This was my first attempt at a meta-analysis of the insights gained within this research area. The paper summarizes and discusses all the effects on the dialogue I had seen in my studies up to that point. The paper made quite an impression – it is still the most cited of all the papers I have produced to date! At that point we were still quite unique when it came to haptic collaborative environments, and I still think I’m one of the very few researchers who study the effect of haptic and audio feedback on the dialogue between collaborators.

The HAID conference paper laid the groundwork for the workshop described at the beginning of this post, and during the workshop the idea to study collaboration between two sighted persons was introduced and discussed. Next time I write about my earlier work I will introduce my latest study on collaborative multimodal interfaces, which showed that haptic and audio feedback indeed have effects on the dialogue between sighted persons as well!

deafblindness · Grant application · Haptics · sonification

Recently submitted my first ever research grant application!

One thing I have not mentioned in this blog before is that I’m one of the researchers behind the newly started network “Nordic Network on ICT and Disability”. The network gathers researchers from universities in Sweden and Denmark, focusing specifically on technology support for people with deafblindness. The reason why I’m a part of the network is primarily that I have developed multimodal interfaces (based on haptic and audio feedback) for collaboration between sighted and visually impaired pupils in primary school (you can read this article and this conference paper for a summary of that work).

I have been thinking about writing a research grant proposal with a group of researchers from the above-mentioned network ever since I joined it. And this year it finally happened! 😀  During a grant club in the middle of March, where several researchers from my division at Uppsala University gathered to write research grant proposals for a day, I ended up with a draft that felt close to complete (read more about the very well organized grant club here). The draft was used as a basis for discussion in a Skype meeting with some other members of the network, after which we finally ended up with a research plan everyone felt comfortable with. It was submitted to the Swedish Research Council. I wrote most of the text, but it would never have worked without all the valuable input I got from my colleagues (most of them also co-applicants), both in the form of comments and added text.

The proposed research focuses mainly on haptic feedback and how it can be used to support pupils with deafblindness in collaboration with sighted pupils – thus the focus is quite close to the research with visually impaired pupils that I, and several of the other co-applicants, worked on before.

The co-applicants are:

Apart from the above-mentioned co-applicants, Charlotte Magnusson (ResearchGate) is also part of the proposed project, as a resource person from CERTEC.

I really believe in this team, since we complement each other very well and we belong to universities in Sweden that are at the forefront of research on assistive technologies and collaborative haptics. We of course hope the project will be funded, but if it is not, I really hope this team gathers again in search of other possible grants!