DOME · eHealth · Haptics

Looking back at 2016


I will soon continue my blog series about haptics as an interaction modality, but before I do I will take this opportunity to look back at 2016 before the new year begins. A lot of things have happened this year, and quite a few of them have been unexpected. Before I continue I just want to point out that the image above, which is not in any way connected to my teaching or research, was taken by me in spring 2016. It’s there simply because we are entering a new year and the image represents a kind of natural firework.

The year started with some teaching and research at KTH within the scope of my newly started consulting company, Jonas Moll Consulting. The teaching mostly concerned a course about haptics. I assisted in developing parts of that course back in 2011 and have been a part of it ever since. I usually grade individual literature reports, supervise labs, assess the technical level of projects and hold a few lectures. I have really enjoyed working with this course and I hope I will be able to at least hold a few guest lectures in it next year as well. The research performed during spring mostly concerned thorough analysis of results and writing manuscripts to be submitted to journals. I will write more about the studies when the manuscripts have gone through the review process and been transformed into published articles. A small selection of preliminary results can, however, be found here (about the interaction between vision, touch and hearing in multimodal virtual environments) and here (about using Twitter in a university course).

In May I got an email from a colleague about a new postdoc position, focusing on eHealth, at Uppsala University. It sounded very interesting, since I had worked in a medical project before (a collaboration between KTH and Karolinska Institutet), so of course I applied. A few weeks later I was called to an interview, and quite soon afterwards it was confirmed that I had got the postdoc job!  🙂

I started my work as a postdoc on 1 September (at 70%, since I had already accepted to work as a consultant in a course on communication before I knew about the postdoc position), and a lot has happened since then.

One of the first things I did was to start up two studies – one interview and survey study with physicians and nurses, and one national survey study (the results had already been gathered, but analysis and reporting remained) – and invite researchers to join in. It turned out that quite a few wanted to be a part of the studies, so I’m now leading two studies with 10 researchers in each. To complicate things further, these researchers are distributed among no fewer than six universities (one of them in Germany)! Almost all material is prepared for the interview/survey study, so we should be able to start quite early in spring, after the ethical review. An initial visual inspection of the data from the national patient survey has already brought forward some interesting results, and the next step is to bring everything into SPSS for detailed analysis. I will write more about the results from, and progress of, these studies later on.

Before I started my work at Uppsala University I had not been to that many conferences – my focus had been on writing journal articles. That situation changed quite abruptly this autumn: I attended three conferences during a period of two months! Two of them were held in Gothenburg and one in Oslo. The first conference I went to was the SweCog 2016 conference in Gothenburg. I wrote three blog posts about it (summary, Day 1, Day 2). I did not present anything during that conference. The second one was the NordiCHI conference in Gothenburg. This time I took part in a workshop, and I did it from a patient’s perspective! This was the first time I could make use, in my research, of the fact that I have a chronic rheumatic disease. The research contribution can be found here for those who are interested, and I also wrote a blog post about it. Here you can also find my overall summary of the conference. The last conference I attended was the EHiN (EHealth in Norway) conference in Oslo. This time I was one of the organizers behind a workshop, in which I again participated as a patient! The role play activity I took part in is summarized here, and you can also find a summary of the whole conference here.

Other new things that happened during the autumn were that I got the chance to be an invited speaker and that I was interviewed for a podcast. I will write another blog post about these related events when the podcast has been made available online. In December I also joined a group that will work on a new EU proposal. Exciting, to say the least! During the autumn I also, for the first time, took part in reviewing applications and interviewing candidates for a new Ph.D. position.

So, quite a lot has happened since I woke up on 1 January 2016, to say the least! I’m quite sure 2017 will also be an exciting year. I know that I will be doing some teaching in courses at Uppsala University, that we will conduct large studies and publish numerous articles as a result, and that I will write applications and organize some workshops, but looking back at the last autumn I’m sure that quite a few surprises will pop up as well!  🙂


Happy New Year everyone!


Group work · Haptics

Haptic interaction research: one great memory!

Since Christmas is coming up, I will break my series on haptic interaction in this blog post and talk about a specific memory from my research (rather than discussing an application area in general). The memory I’m presenting here comes from evaluations performed in schools several years ago.

The evaluations were performed within the scope of an EU project (MICOLE), which aimed at developing tools for supporting collaboration between sighted and visually impaired pupils in elementary school classes. One of the major problems today is that sighted and visually impaired pupils use different work materials which are not easily shared, sometimes causing them to do “group work” in parallel rather than together. By developing environments based on haptic feedback, which could be accessible to both sighted and visually impaired pupils, the project tried to address this important problem.

One of my contributions to the project was a 2D (or rather 2.5D) interface presenting a virtual whiteboard on which geometrical shapes and angles were drawn (felt as raised lines). Two haptic devices were connected so that one sighted and one visually impaired pupil could be in the virtual environment at the same time. During the evaluations in schools, the tasks were to go through the shapes and the angles together and categorize them. These kinds of tasks were chosen since we knew that the pupils in the schools where we performed the evaluations had just started to learn geometry.
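As a rough illustration of how an edge can be “felt as a raised line”, here is a minimal sketch in Python (not the actual MICOLE implementation – the ridge profile, parameters and function name are all made up for illustration). The stylus position is projected onto a line segment, and the device is sent a lateral force that pushes the stylus back down the slope of a ridge centred on the segment:

```python
import math

def raised_line_force(pos, a, b, height=1.0, width=5.0, k=0.4):
    """Illustrative sketch: render the segment a-b as a raised ridge.
    Returns a 2D force pushing the stylus away from the ridge centre,
    like the resistance felt when tracing over a bump with a pen."""
    (px, py), (ax, ay), (bx, by) = pos, a, b
    # Project the stylus position onto the segment (clamped to its ends).
    dx, dy = bx - ax, by - ay
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    cx, cy = ax + t * dx, ay + t * dy        # closest point on the segment
    rx, ry = px - cx, py - cy                # offset from the ridge centre
    d = math.hypot(rx, ry)
    if d >= width or d == 0.0:
        return (0.0, 0.0)                    # outside the ridge, or exactly on it
    # Slope of a Gaussian-like ridge profile; the force magnitude follows
    # the slope, so it is zero on the crest and at the edges of the ridge.
    slope = height * d / (width * width) * math.exp(-(d * d) / (2 * width * width))
    f = k * slope
    return (f * rx / d, f * ry / d)
```

With two connected devices, the same force computation would simply be run for each pupil’s stylus position, which is what lets both pupils feel the same drawing.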

When we came to one of the schools we soon realized that the pupils had just started to learn geometrical shapes, but had never talked about angles – they could not discriminate between right, acute and obtuse angles. We therefore needed to explain how to discriminate between the different types of angles before the evaluation started. When a task presenting 10 different angles (some right, some obtuse and some acute), forming an irregular star-shaped figure, had been loaded, the visually impaired pupil took a few minutes to explore the interface in order to locate the angles in the figure. After this, the pupil moved directly to one of the angles and correctly categorized it as acute – a judgment the sighted pupils agreed with. They then browsed through the angles, one by one, always focusing on the angle the visually impaired pupil was currently “touching”. They always came up with the right answers!

I really enjoyed watching this group working in the interface. It was so obvious that the application could be used as a shared interface for discussing a concept which was new to everyone. It was equally obvious that the visually impaired pupil, after just a few minutes of exploration, knew exactly how to navigate to the different angles in the star-shaped figure (without having to follow all edges!). This enabled the visually impaired pupil to take and keep the initiative during the discussions! Thus, the interface clearly, at least in this particular case, enabled inclusion of the visually impaired pupil in the group work. Watching this group work in the interface, and especially seeing the visually impaired pupil leading the discussion, is definitely one of the greatest memories I have from my research on haptic interaction.

I will get back to this study, and related studies, in a later blog post related to communication in haptic interfaces. Those who are curious about the study I briefly introduced above can also read this article.


Merry Christmas to everyone!



Haptics · Medical applications

Haptic feedback in medical applications


This second blog post about haptics as an interaction modality will be devoted to the area of medicine. In recent years haptics has become more and more important in medical applications, for training, simulation and image analysis alike. An important rationale behind using the haptic modality in medical training applications is, of course, that you can get “the real feeling” and practice motor skills without actually having to practice on a patient.

There are several examples of cases in which the addition of haptic feedback to a visual interface has proven to make a positive difference in medical simulators. I will just bring up two examples here to illustrate the potential of haptic feedback.

A group of Chinese researchers developed a simulator for training Chinese acupuncture (the reference is, sadly, behind a paywall). Places where a simulated needle (controlled by the haptic device I described here) could be inserted were highlighted on a virtual patient’s back, and when inserting the needle it was also possible to feel the difference between different tissues (the stiffness varied). By using this simulator, students could learn not only where needles should be inserted but also how it should feel when the needle was in the correct place.
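To give a feel for how such varying stiffness can be achieved, here is a minimal sketch (my own illustration, not the published simulator’s model, and the layer values are invented): the resisting force grows with penetration depth, with a slope equal to the stiffness of the tissue layer the needle tip is currently in, so the user feels a distinct change at each layer boundary.

```python
# Hypothetical tissue layers as (thickness in mm, stiffness in N/mm),
# listed from the skin inwards.  The values are invented for illustration.
LAYERS = [(2.0, 0.8),   # skin: stiff
          (8.0, 0.2),   # fat: soft
          (15.0, 0.5)]  # muscle: firmer again

def needle_force(depth):
    """Resisting force (N) at penetration depth (mm).  The force is
    piecewise linear: within each layer it grows with that layer's
    stiffness, so its slope jumps at every layer boundary - which the
    user perceives as entering a different tissue."""
    force, top = 0.0, 0.0
    for thickness, k in LAYERS:
        if depth <= top:
            break
        # Add this layer's contribution, up to the depth reached so far.
        force += k * (min(depth, top + thickness) - top)
        top += thickness
    return force
```

A haptics loop would call a function like this with the current penetration depth (haptic rendering typically runs at around 1 kHz) and send the result to the device along the needle axis.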

A haptic simulator for the extraction of wisdom teeth, named the Kobra, was actually developed at KTH by Dr. Jonas Forsslund. This simulator is shown in the image above (source: Forsslund Systems). In this simulator the user can perform an entire wisdom tooth extraction procedure using a virtual drill, operated by pedals. During the procedure, a semitransparent mirror is used for hand-eye coordination and a pair of 3D glasses provides depth perception. In the image above you can also see the haptic device (shown as the drill in the virtual environment) and the mannequin placed under the semitransparent mirror. The positions of the mannequin and of the virtual head are synchronized, so that the hand can rest on the mannequin while using the virtual drill on one of the wisdom teeth! The Kobra project actually started off as a master’s thesis project in 2008 and has now become a commercial product! You can read a lot more about this simulator at Forsslund Systems.

Both of the above examples illustrate one of the important roles haptic feedback can play in medical simulators – you can practice motor skills and certain procedures without having to involve patients. This is especially important in the dental simulator case, where it is possible to practice procedures that are not that easy to describe in theory. You can practice the same procedure over and over again without wearing out material or risking harm to a patient. Initial evaluations showed that it was indeed possible to simulate the procedure of bone drilling, but effects on performance could not be shown. I think there is a good chance that future evaluations could show a positive effect on learning outcomes.

Another area, not brought up yet, where haptic feedback can be an aid within the medical domain is surgical planning. I performed research within that area myself during my Ph.D. studies at KTH, when I evaluated an application for surgical planning, developed by Jonas Forsslund, together with doctors at the Karolinska Hospital. I will, however, get back to that particular study in a later blog post, when results from the study have been published. Those who are curious can read about some pre-studies and the haptic application here (only the abstract is shown, but if you have access to IEEE publications you can follow the provided DOI to find the paper).


eHealth · Medical Records Online

Some remarks on a recent eHealth conference


A few weeks ago I participated in the EHiN (EHealth in Norway) 2016 conference in Oslo. The conference was held in parallel with the European Telemedicine Conference at Oslo Spektrum. This was the first time I had participated in an eHealth-related conference and it was a very interesting experience, both regarding the activities and the conference venue. I have already published a blog post about the one-day workshop I helped organize, so I will focus on the rest of the conference here.

There were several tracks during this conference – often seven rooms were used in parallel, and there were most probably over 1,000 participants. The “heart” of the conference venue was the big expo area shown in the picture above. From there you could easily reach all of the rooms where presentations were held, which were spread around the big open area. The opening keynotes were also held in the expo area, on the big stage in the middle of the room. I must say that this was the most spectacular conference venue I have ever seen!

Apart from tracks relating to eHealth and telemedicine in practice, there was also a so-called “scientific track”. During two of the sessions in the scientific track, Kristina Groth, a research colleague from KTH during my time as a Ph.D. student, presented findings from telemedicine projects she has been working on. I will write more about our joint work within an earlier medical project later on. Overall, I found the content of the sessions I followed very interesting, and some themes really stood out.

One interesting theme was solutions for transmitting health data from the ambulance to the hospital, so that medical staff can be prepared to take the necessary actions as soon as a patient arrives. The current systems in ambulances (in many countries) are not very compatible with the systems at the hospitals, which means that a thorough examination must often take place when a patient arrives at the hospital. This can be very problematic for e.g. stroke patients. One of the presentations related to this theme showed that a solution currently being tested, which transmits vital signs from an ambulance to a hospital, could shorten the time before treatment can begin by 30 minutes!

Another interesting theme was gamification in health care. Several examples were given showing that games can, for example, help in rehabilitation and help children with special needs. Especially when it comes to rehabilitation after neurological injuries, it was made very clear that games have real potential. Among other things, injured persons did not think as much about e.g. pain when playing games which required the movements necessary to train the injured parts. This, in turn, shortened the time needed for rehabilitation. Exercising by throwing darts or playing virtual tennis was also considered more fun than just performing “regular” exercises. No surprises there!  🙂

Other interesting themes were telemedicine solutions for communicating with patients in their homes (Kristina Groth’s presentations) and eHealth solutions that enable self-care at home. Overall, it became clear from this conference that the eHealth field is very much evolving, coming up with solutions that will definitely make life easier for many people. I’m very glad that I can be a part of this research field!