Grant application · Group work · Haptics

Recently submitted a project proposal to this year’s Forte junior research grant!

[Image: Tordmule – a razorbill perched on a cliff]

A few weeks ago I wrote a blog post about the spring of opportunities, which basically listed research grant and job opportunities that were open, or expected to open, during the spring term. Just before lunch today, I submitted the first of my planned project proposals! This one went to Forte and their grants for junior researchers.

The proposed project relates to work that I performed while I was still at KTH as a Ph.D. student, and the content was also inspired by the application that I sent to VR last year (see this blog post). The project is based on the fact that the assistive technologies that visually impaired pupils use in the classroom today are not really designed for collaborative situations – the technology can sometimes be a hindrance when doing group work with sighted peers. Our hope is that the planned activities will shed light on the problems this causes and show how modern technology, based on haptic and audio feedback, can be used to find ways ahead. I have already done a few studies in this area, which you can read about, for example, in this article (note that this is a pre-print version). I will write more about this when I hear back from Forte about the draft later in April.

My co-applicant is Eva-Lotta Sallnäs Pysander from KTH, who was also my main supervisor there. If we get the grant, we will hopefully be able to add a Ph.D. student to the research team as well.

I must say that it’s nice to have this proposal submitted. I really believe in the ideas in it and I also think that the project could make a real difference. Since I’m generally very interested in multimodal interaction and learning, this is also one of the areas I really want to focus on in my research in the future. The other area is eHealth, and as I have written before, there are quite a few funding opportunities in that area this spring as well. But right now I’m just enjoying that I have one application out there, just as the razorbill in the image above probably enjoys sitting on a cliff looking out over the ocean…


communication · conference · Group work · Haptics · Multimodality

Paper and poster about haptic communicative functions and their effects on communication in collaborative virtual environments

[Image: HaptiCom – the poster on haptic communicative functions]

Yesterday I blogged about a poster and a conference paper that Emma Frid and I developed for the SweCog conference in Uppsala. In this post I will focus on the second poster and paper that Eva-Lotta Sallnäs Pysander and I developed for the same conference.

The poster shown in the picture above, and even more so the paper, summarizes some of the main points from my doctoral studies. My main focus during those years was collaboration in multimodal virtual environments, with special emphasis on how haptic feedback can be used for communicative purposes. Mediated haptic communication has been studied for quite some time, but my specific contribution has been to develop and test new functions for two-way haptic communication (see the short descriptions of the functions on the poster), and to adapt some already existing ones so that they work better when a sighted person collaborates with a severely visually impaired one in a collaborative virtual environment. There is real potential in these kinds of functions when it comes to collaboration between sighted and visually impaired users – the haptic feedback not only enables the establishment of a common ground about the interface, but also effective two-way communication (see examples of results on the poster above). This is very important for the inclusion of visually impaired persons in group work. The example study is reported in much more depth in this article.

Even though the poster and paper summarize work already performed and reported, we are in this case more explicit about the connection to other kinds of haptic communicative functions. The following conclusion also takes the work to the next level:

We argue that for effective collaboration and communication to take place in virtual environments by means of haptic feedback the haptic functions need to be designed as to allow for reciprocal exchange of information. That is, both users need continuous feedback from each other during e.g. a guiding process or joint object handling.
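To make the idea of reciprocal exchange a bit more concrete, here is a minimal sketch of a two-way spring-damper coupling between the two users’ proxies – my own illustration of the principle rather than code from the paper or the studies behind it. The stiffness and damping constants, the function name and the example positions are all assumptions made for the sake of the example.

```python
# Minimal sketch (not the actual functions from the paper) of a reciprocal
# haptic coupling: each user feels a spring-damper force towards the other
# user's proxy, so information flows in both directions at the same time.
import numpy as np

STIFFNESS = 200.0  # N/m  - placeholder value; tuned per haptic device in practice
DAMPING = 2.0      # Ns/m - placeholder value

def coupling_forces(pos_a, vel_a, pos_b, vel_b):
    """Return the forces rendered to user A and user B (equal and opposite)."""
    displacement = np.asarray(pos_b) - np.asarray(pos_a)
    relative_vel = np.asarray(vel_b) - np.asarray(vel_a)
    force_on_a = STIFFNESS * displacement + DAMPING * relative_vel
    return force_on_a, -force_on_a  # A is pulled towards B, B towards A

# Example: user B has moved 5 cm ahead of user A during a guiding task.
f_a, f_b = coupling_forces(pos_a=[0.0, 0.0, 0.0], vel_a=[0.0, 0.0, 0.0],
                           pos_b=[0.05, 0.0, 0.0], vel_b=[0.0, 0.0, 0.0])
print(f_a, f_b)  # A feels a pull towards B, B feels a symmetric pull back
```

Whether the coupling is symmetric, as above, or gives one user more authority during e.g. a guiding process is precisely the kind of design decision the functions on the poster are concerned with.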

The conference paper, on which the above poster is based, can be found here.

conference · Haptics · Multimodality

Paper and poster about gaze behaviour during collaborative object managing

[Image: Eyetracking – the poster on gaze behaviour during collaborative object managing]

As I wrote in an earlier blog post, I got two posters accepted to the SweCog 2017 conference in Uppsala, October 26-27. Unfortunately, I got sick right before the conference, so I couldn’t attend myself. The posters were, however, shown during the poster session.

The image above shows one of the posters – the one I created together with my KTH colleague Emma Frid. The study presented in the poster builds on the study I wrote about here, in which Eva-Lotta Sallnäs Pysander and Roberto Bresin also participated. In the original study we found indications that gaze behaviour could be affected by haptic and audio feedback in a single-user setting. In the new collaborative study presented in the poster, where we used a similar interface, we wanted to investigate whether gaze behaviour can be affected by haptic feedback during collaborative object managing.

We have not performed the real experiments yet, but results from a pilot study with a few pairs of users (some worked in a non-haptic version of the interface and some in a haptic version) indicated that haptic feedback could have an effect on gaze behaviour (see e.g. the figures presented on the poster above). The results are not statistically significant, but still interesting enough to make it worth running similar experiments with many more participants. A future step could also be to investigate how audio feedback (and/or combinations of haptic and audio feedback) affects gaze behaviour during collaborative task solving.
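To give a rough idea of what the analysis in such a full-scale follow-up could look like, here is a small sketch comparing a gaze metric between the two conditions with an independent-samples t-test. The metric (fixation count on the shared workspace), the numbers and the use of SciPy are all placeholders chosen for illustration – they are not the pilot data from the poster.

```python
# Sketch of a possible analysis for a full-scale follow-up: compare a gaze
# metric (here, fixation count on the shared workspace) between pairs who used
# the haptic interface and pairs who used the non-haptic one.
# The values below are hypothetical placeholders, NOT the pilot data.
from scipy import stats

fixations_haptic = [112, 98, 105, 121, 109]       # hypothetical values
fixations_non_haptic = [134, 141, 126, 138, 130]  # hypothetical values

t_stat, p_value = stats.ttest_ind(fixations_haptic, fixations_non_haptic)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# With only a handful of pairs a test like this is underpowered, which is
# exactly why the pilot results call for repeating the experiment with
# many more participants.
```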

The poster above summarizes the work done. More information can be found in the published conference abstract, which is available here.

conference · Haptics · Multimodality

Got two posters accepted to SweCog 2017!

[Image: SweCog_accept]

In an earlier blog post I wrote about my preparations for the Swedish Cognitive Science Society (SweCog) 2017 conference. My plan was to submit at least two papers to that conference, and that is exactly what I did. One of the papers, “Using Eye-Tracking to Study the Effect of Haptic Feedback on Visual Focus During Collaborative Object Managing in a Multimodal Virtual Interface”, I wrote together with Emma Frid, and the other, “Haptic communicative functions and their effects on communication in collaborative multimodal virtual environments”, I wrote together with Eva-Lotta Sallnäs Pysander. I was first author on both since I led the work and did most of the writing. Earlier this week I got two emails from the conference organizers confirming that both papers had been accepted as posters!

When Eva-Lotta and I submitted the papers (you could only submit one per person) we indicated that we were aiming for oral presentations, but they were both “downgraded” to posters after the reviews. When it comes to the first paper, written with Emma, I can understand it, since we were reporting on a pilot study and quite a few papers submitted by other researchers reported on full-scale experiments and evaluations. The other one, on haptic communicative functions, was more theoretical in nature, and in that case I think the main problem was the 500-word limit – we couldn’t really elaborate enough on our main findings when most of the space had to be used to define and explain haptic communicative functions. Anyhow, I’m very happy that the papers were accepted and that we will be able to discuss our work with others during the conference.

The second confirmation email, about the paper on haptic communicative functions, actually included an interesting twist – one of the reviewers recommended that the paper be presented as a live demo during the poster session! That really came as a surprise (a positive one), and the organizers were very willing to work with us to make the live demo happen. Unfortunately, one problem is that the studies referenced in the paper (an evaluation and an experiment, respectively, during which pairs of users collaborated by means of haptic and audio communicative functions – see this and this preprint article) used virtual environments based on outdated APIs that no longer work. I’m not sure that I can reimplement the environments using the newer haptics API Chai3D in time for the conference. But, no matter what, we will still have the poster and the possibility to discuss and explain our findings.

So, the only thing remaining now (apart from trying to get a demo working) is to create two informative posters. After the conference I will get back to this topic and elaborate some more on the work presented on the two posters, so expect more posts about SweCog 2017 and my contributions to it!


DOME · eHealth · Haptics · Medical applications · Medical Records Online · National patient survey · Summer school

Today I celebrate my blog’s first anniversary!

[Image: Ren1 – photo from a hike in Abisko, northern Sweden]

Exactly one year ago I wrote my very first blog post! You can read that short post here. From the beginning my intention was to write two posts a week, but for various reasons my average during this first year ended up at 1.4 per week. During this first year the blog has had 2782 views by 1466 visitors. During the first couple of months the number of views per month was under 100, but I’m glad to see that the numbers have kept increasing – the number of views in September was 491, and 53 views have been accumulated during the first days of October this year.

Since this is a special blog post, the picture I chose is not by any means related to my work. Instead, I chose one of the pictures I took while hiking in northern Sweden (Abisko) about two years ago – I just love the nature up there!

As a kind of celebration, I will here present a top 5 list of my most read posts:

  1. My colleague, Thomas Lind, successfully defended his thesis today!

This is one of my latest posts, which I uploaded in the middle of September. Despite the short time frame, this is by far the most read post! The post is about the defense that resulted in my colleague, Thomas Lind, receiving his Ph.D. degree.

  2. EHealth summer school in Dublin, day 5

Those who have followed my blog over the past few months know that I have been writing quite extensively about a summer school I attended – one week in Dublin and one week in Stockholm. I’m very happy to see that one of those posts is on this list, because they took quite a lot of time to write. This particular post is also a kind of summary post, which includes links to the other posts about the week in Dublin. The summer school was a nice experience in so many ways, and I really encourage you to read those posts if you are interested in eHealth/mHealth design (the Stockholm posts, although not on the top 5 list, can be found here).

  3. A very successful session about patient accessible electronic health records at Vitalis 2017!

This post is not only on the top 5 list regarding views, but it is also, by far, the most shared post on social media. The post summarizes a 1.5-hour session hosted by the DOME consortium at Vitalis last spring. I really hope we will get the opportunity to do something similar at Vitalis 2018! Read this post if you want to know about the state of the art regarding patient accessible electronic health records in Sweden.

  4. The team behind a new large patient survey on electronic health records in Sweden!

I’m also happy to see this one on the list, since it represents one of the big studies I’m currently leading. This particular study is based on a large national survey focusing on patients’ experiences with and attitudes towards the patient accessible electronic health record system Journalen. In this post I introduce all the researchers working on the study.

  5. Haptic feedback in medical applications

The fifth most read post belongs to the blog series on haptic feedback as an interaction modality, which I started last autumn. This particular one concerns how haptic feedback can be utilized in medical applications. In this post I also introduce my own work within this area, which I carried out as a Ph.D. student at KTH.

So, these were my five most read posts, and I’m glad to see that they relate to different areas. The only area not covered in this list is pedagogical development. This might change during the next year, however, since an extensive study on Twitter as a communication medium in higher education courses will most probably be published, and I will also write a series of posts about a basic course in human-computer interaction that I will be responsible for at Uppsala University (starting October 30).

I have really enjoyed the blogging activity and will definitely continue to update this blog regularly, so stay tuned for more!  🙂

Cognition · conference · Haptics · Multimodality

Preparing submissions for the SweCog 2017 conference, held at Uppsala University!

[Image: SweCog2017_Uppsala]

This week, I’m preparing submissions for this year’s edition of the SweCog (Swedish Cognitive Science Society) conference, which covers a broad range of topics related to cognitive science. When I participated last year, when the conference was held at Chalmers in Gothenburg, I did not present anything (actually, none of the participants from Uppsala University did), but the situation this year is quite different since Uppsala University is hosting the event!

I really enjoyed last year’s conference, much due to the large variety of topics covered and the very interesting keynote lectures. It was also (and still is, I assume) a single-track conference, meaning that you do not have to choose which paper session to attend. As I remember, there were ten paper presentations in total, three keynote lectures and one poster session during the two conference days. You can read more about my experiences from SweCog 2016 in this blog post, summing up that event. I also wrote summaries from day 1 and day 2.

Since the only thing required is an extended abstract of 1-3 pages (and max 500 words), I’m working on several submissions. A topic that was not covered during last year’s conference was collaboration in multimodal environments, and specifically how different combinations of modalities can affect communication between two users solving a task together. Since that is one of my main research interests, I now see my chance to contribute! The deadline for extended abstract submissions to SweCog 2017 is September 4, so there is still plenty of time to write. The conference will be held October 26-27 at Uppsala University. Since registration is free for SweCog members (membership is also free), I expect to see many of my old KTH colleagues at Uppsala University during the conference days! 😉  You can find more information about the conference here.

Before I started planning my contributions to SweCog 2017, I invited some of my “multimodal colleagues” from KTH to join the writing process. As a result, Emma Frid and I will collaborate on an extended abstract about a follow-up to the study I presented here. Our contribution will thus focus on how multimodal feedback can affect visual focus when two users are solving a task together in a collaborative virtual environment. Since I have not yet heard from any other colleague, I plan to write another extended abstract on my own, about how multimodal feedback (or rather combinations of visual, haptic and auditory feedback) can affect the way users talk to each other while working in collaborative virtual environments. Maybe I will also throw in a third one, about the potential of using haptic guiding functions (see this blog post for an explanation of this concept) in situations where sighted and visually impaired users collaborate.