DOME · eHealth · Medical Records Online

Video explaining eHealth research within the DOME consortium


I have not been blogging for a while now, mostly due to a bad cold and time pressure from finalizing an application for ethical review. I will get back to the application in a later blog post this week, and I will of course also continue my blog series on haptic interaction.

I have mentioned several times that I do research within the DOME consortium in Sweden, and since I still have limited time I will “cheat” somewhat and instead let my colleagues Åsa Cajander (coordinator of the consortium) and Isabella Scandurra explain the idea behind the consortium and some earlier eHealth-related research: DOME research video (the first half of the video concerns DOME)!

(The video, which I can really recommend, is from their presentation at the EHiN (EHealth in Norway) conference in Oslo last November).

DOME · eHealth · Medical Records Online

Interviewed on a podcast!

I hinted in earlier blog posts that I had been interviewed during December 2016, and today I got confirmation that the podcast has been published! Åsa Cajander and I were invited to speak at an internal seminar at Södertörns högskola on 14/12 last year. The focus of our presentation was on patient access to medical records online, and especially on the differences between the physicians’ and the patients’ views of the system.

We used the role play format yet again (as we did in this workshop) – Åsa played the physician who was concerned about the effects that online medical records could have, and I played the patient who was positive towards most aspects of the online medical records. Presenting with Åsa at the seminar was overall a nice experience, and the people in the audience (mostly associate professors in media technology) were very active, asked many interesting questions and offered ideas on what to look for in our upcoming studies.

After the seminar Åsa and I were interviewed on a podcast, which you can find here (in Swedish)! It was quite an experience for me, since I had never been interviewed before. The interview was based on the topics and results brought up for discussion during the seminar. If you listen to the short podcast you will, among other things, learn more about what Åsa and I work with at Uppsala University, and hear about some preliminary results from the large national patient survey study I’m currently leading.


A special reminder of my time as a Ph.D. student!


In this post I will take a short break from my blog series on haptic feedback as an interaction modality, since I was reminded of my time as a Ph.D. student today by a very special sign – the orchid I got from my main supervisor, Eva-Lotta Sallnäs Pysander, the day I defended my thesis! This orchid was, of course, in full bloom when I got it, but the flowers naturally withered after a few months. I didn’t give up, and after several months caring for a plant that only had leaves, buds started popping up and the first one bloomed exactly one year after I graduated, 28/11 2014!

Last year small buds started showing at the three-year anniversary of my graduation, but the first one was not in full bloom until today. So, the orchid kind of missed the anniversary, but I was still reminded of a very interesting and joyful period in my life. During my time as a Ph.D. student at the Media Technology and Interaction Design department, KTH, I taught quite a lot, ran several studies both at KTH and Karolinska Institutet, and even had to redo an entire experiment due to results that had been affected by a design flaw – but I still never had a boring moment and I always enjoyed my work. The excellent supervision of Eva-Lotta Sallnäs Pysander and Kerstin Severinson Eklundh was very important in this regard, as was the fact that I was employed by the institution and not by a specific project.

The best part about being a Ph.D. student was that I continuously had to challenge myself. I had, of course, regular contact and a very good collaboration with my supervisors, but I was still the one in charge of several studies and of my own situation as a whole. Planning large studies and carrying them out, especially when they concern situations where the results could really make a difference (like collaboration between visually impaired and sighted pupils in schools), is something I find very rewarding. I often managed to stick to the plans I had set, but during a period when the medication for my rheumatic disease was still being adjusted and I was forced into sick leave from time to time, it was tough. Still, my five years as a Ph.D. student were one of the best periods I have experienced so far!



communication · Haptics

Haptic communicative functions


This is the fifth post in my blog series about haptics as an interaction modality and this time I start focusing on my main area – collaboration in multimodal interfaces. In earlier posts I have written about:

Ever since I started working on my master’s thesis project back in 2005 I have been focusing on collaboration in multimodal interfaces, and specifically on how one can design haptic functions for collaboration and communication between two users working in the same environment. Collaborative haptic interfaces have been around for quite some time – one of the first examples is an arm-wrestling system using specialized hardware that enabled two users to arm wrestle over a distance. Other commonly known early examples enabling a kind of mediated social touch are HandJive and InTouch. One thing these early systems have in common is that they use specialized hardware which is quite limited in scope. More examples of early systems and in-depth discussions about mediated social touch can be found in this excellent review.

During the past two decades, more widely applicable haptic functions for collaboration in virtual environments have been developed. Such functions can e.g. enable two users to feel each other’s forces on jointly held objects, or make it possible to “shake hands” by utilizing magnetic forces between the two users’ proxies in the virtual environment. One of the earliest examples of an environment supporting these kinds of collaborative haptic functions, or guiding functions as I usually call them, is the collaborative text editor developed by Oakley et al. Apart from the obvious functions needed to edit a document, each user could also use a Phantom device to find the positions of and/or communicate with the co-authors. One example was a kind of grabbing function (similar to the shake-hands function mentioned above), making it possible to grab another user’s proxy and move it to another part of the document. Other examples were a “locate” function dragging one’s own proxy to another user’s proxy by a constant force, and a “come here” function dragging another user’s proxy to one’s own position.
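To make the “locate” and “come here” functions above a bit more concrete, here is a minimal sketch of how such a constant-magnitude guiding force could be computed in a haptic rendering loop. The function name and constants are my own illustration (not taken from the Phantom API or from Oakley et al.’s system), and I assume positions in metres and forces in newtons.

```python
import math

def guiding_force(from_pos, to_pos, magnitude=1.5, deadband=0.005):
    """Constant-magnitude force pulling a proxy at from_pos toward to_pos.

    For a "locate" function the force is applied to one's own device;
    for "come here" it is applied to the other user's device. The small
    deadband avoids jitter when the proxies are nearly on top of each other.
    """
    direction = [t - f for t, f in zip(to_pos, from_pos)]
    dist = math.sqrt(sum(c * c for c in direction))
    if dist < deadband:
        return [0.0, 0.0, 0.0]
    # Normalize and scale: the same pull strength regardless of distance.
    return [magnitude * c / dist for c in direction]
```

A real implementation would ramp the force up and down smoothly and run at the roughly 1 kHz update rate haptic devices require, but the core idea is just a normalized direction vector times a fixed magnitude.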

Later examples of virtual environments enabling these kinds of haptic collaborative functions are a collaborative drawing application developed at CERTEC, Lund University and applications for joint handling of objects evaluated by my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander during her years as a Ph.D. student (thesis link). The latter examples are most relevant for me, since much of my work during my period as a Ph.D. student at KTH focused on collaborative interfaces in which two users can work together to move and place virtual objects. I will get back to my own applications later on, in a blog post about haptic interfaces supporting collaboration between visually impaired and sighted persons.

Above, I have provided just a few examples of collaborative haptic functions which can be used to control the forces provided by one or more haptic devices. The functions I find most interesting to explore are the ones that enable physical interaction between two haptic devices, in that both users can feel each other’s forces on jointly held objects or when holding on to each other’s proxies. These kinds of functions enable interesting means of communicating physically in virtual environments, especially in cases where the users are not able to talk to each other face-to-face or point at the screen. Imagine, e.g., a scenario in which two users are exploring different parts of a complex multimodal interface showing distributed data clusters (what those clusters represent is not important here). In such an interface it would be very cumbersome to try to describe to the other person where a certain interesting cluster is located. Instead, the user who wants to show something s(he) found can grab the other user’s proxy and drag him/her to the relevant cluster. This possibility would probably simplify communication about the explored dataset, since explaining where to find details in a complex interface can be extremely cumbersome. This is, of course, just a made-up example, but it can be applied to many scenarios, especially in cases where important parts of an interface are visually occluded. I will get back to joint handling of objects later in this series, and I will discuss the potential of using haptic feedback when exploring huge datasets in an upcoming post.

Last, it is interesting to contrast functions enabling physical interaction between two haptic devices with functions only enabling a one-way communication (like the “goto”-function mentioned above). Using a one-way function enable some kind of communication in that one person’s proxy is “dragged” to another one’s, but the haptic function is only applied to one of the users in this case – there is e.g. no way for the other user to tell if the one being dragged actually wants to be dragged. When using a two-way haptic communicative function both users can feel forces from each other enabling a richer communication. Apart from enabling joint handling of objects, where the coordination of movement is made possible by both users feeling the other one’s forces on the jointly held object, two-way haptic communicative functions make it possible to e.g. clearly communicate to the other user that you do not want to be dragged somewhere. The potential these functions can have in situations where visually impaired and sighted users collaborate in virtual environments will be the topic of my next post in this series!
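A two-way function of this kind can be sketched as a virtual spring-damper coupling between the two proxies, so that each user feels an equal and opposite force. This is a minimal illustration under assumed units (metres, m/s, newtons) with made-up stiffness and damping constants, not code from any actual system.

```python
def coupling_forces(pos_a, pos_b, vel_a, vel_b, k=200.0, b=5.0):
    """Virtual spring-damper between two users' proxies.

    Returns (force_on_a, force_on_b). The forces are equal and opposite,
    so each user directly feels the other's movements - for example,
    resistance when the other user does not want to be dragged along.
    """
    force_on_a = []
    for i in range(3):
        spring = k * (pos_b[i] - pos_a[i])   # pulls a toward b
        damper = b * (vel_b[i] - vel_a[i])   # damps relative motion
        force_on_a.append(spring + damper)
    force_on_b = [-f for f in force_on_a]    # Newton's third law
    return force_on_a, force_on_b
```

The same coupling applied between each proxy and a jointly held object is one common way to let both users feel each other’s forces on that object.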


games · Haptics

Haptic feedback in games


Now it’s time for the fourth post in my blog series about haptics as an interaction modality. In this post, I will write about games – an area where I think haptic feedback can be used to a much greater extent than it is today. The earlier posts in this blog series were:

Haptic feedback has been used in games for quite some time. I think everyone has some kind of relationship with the joysticks used in e.g. flight or car simulators. Most joysticks do not only enable steering, but also generate haptic feedback, often in the form of vibrations or resistance to motion. If we take an ordinary flight simulator joystick as an example, the player can experience heavy vibrations when the plane is stalling, as a warning that the lift is beginning to decrease.

During recent years, new input devices have been developed with the potential to really change the way we experience different kinds of games. I have already introduced the Phantom Omni in earlier posts – a device that makes it possible not only to feel texture, stiffness, friction, etc., but also to lift and move around virtual objects. This clearly opens up new possibilities for game development, especially since the Novint Falcon (pictured above) started to spread. As far as I can tell, haptic feedback in the vast majority of games where it is utilized is still limited to vibrations and resisting forces, despite the fact that modern devices greatly widen the possibilities. Below, I will add a few thoughts about what can be done to utilize the unique aspects of haptic feedback in games. There are, of course, many more things you can do apart from the ones discussed here.

Imagine, e.g., a haptic game where the player not only has to worry about navigating to the right place and/or interacting with different objects, but also needs to watch out for deadly magnetic wells “physically” pulling the game avatar towards them. That would certainly add a unique dimension to a game, as would magnetic “guides” pulling the user in a certain direction to make him/her aware that e.g. an object is approaching. Every year, students in the haptics course at KTH create simple games based on magnetic objects that should be avoided. Here is an example video from a simple game where the user needs to navigate through a mine field to find a treasure! It is easy to add more levels and objects, so the game is quite scalable and the idea can be applied to many different scenarios. A game from another round of the course used a similar idea – that you should avoid being dragged into objects – but in that case the objects had different widths and moved from right to left, and the user had to stay clear of them for as long as possible.
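As a sketch of how such a magnetic well could be rendered: a pull that grows as the avatar gets closer to the well centre and vanishes outside a radius of influence. The function name and constants are hypothetical, and I again assume positions in metres and forces in newtons; this is one simple model among many, not how the student games were necessarily implemented.

```python
import math

def magnetic_well_force(avatar_pos, well_pos, radius=0.1, max_force=3.0):
    """Attraction toward a "deadly" magnetic well.

    Zero outside the radius of influence; inside it, the pull grows
    linearly toward max_force as the avatar approaches the centre.
    """
    direction = [w - a for w, a in zip(well_pos, avatar_pos)]
    dist = math.sqrt(sum(c * c for c in direction))
    if dist >= radius or dist == 0.0:
        return [0.0, 0.0, 0.0]
    strength = max_force * (1.0 - dist / radius)
    return [strength * c / dist for c in direction]
```

Summing this over all wells in a level gives the total pull on the avatar; a “guide” toward an approaching object can reuse the same function with a gentler max_force.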

There are many games out there today which are based on the exploration of large and varied environments. Zelda and the Final Fantasy series are among the best-known examples. In those kinds of games haptic feedback could also add an interesting dimension when it comes to categorizing objects and/or exploring occluded areas hidden behind or within buildings, trees or cliffs. In these kinds of games you still need ordinary input controllers, of course, but a haptic device could be used as a complement. Imagine that you walk around in a large virtual environment and come to a well which you cannot go down into. You could then switch to a haptic mode and send down a probe to feel what is at the bottom. If something is down there you could also pick it up. You could even take this further and have small puzzles in hidden places (like in the well example), where you need to feel differences between e.g. friction, surface texture and/or weight of different objects. If you place the objects in the correct order you could unlock some secret.

Haptic feedback could also be used a lot more in puzzle and maze games – there are quite a few of them out there today. If you add a haptic dimension to a puzzle game you can e.g. use the weight and texture of different pieces as additional cues. A haptic-only puzzle would be very interesting to try out! You can also play around with haptic mazes and use friction, texture and maybe even magnetic forces to provide additional information about where you are, provided that you cannot see your own location. Quite a few projects in the haptics course have been based on haptic mazes.
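One way to sketch such a haptic maze cue: give each maze zone its own friction coefficient and render a resistive force opposing the proxy’s motion, so the “feel” of the floor hints at where you are. The zone names and coefficients below are made up for illustration (a simplified Coulomb-style friction model), not code from the student projects.

```python
import math

# Hypothetical maze zones mapped to friction coefficients:
# stone corridors feel "sticky", the goal room feels slippery.
ZONE_FRICTION = {"stone": 0.8, "sand": 0.5, "ice": 0.05}

def maze_friction_force(velocity, zone, normal_force=1.0):
    """Resistive force opposing the proxy's motion, scaled by the
    friction coefficient of the current maze zone."""
    speed = math.sqrt(sum(v * v for v in velocity))
    if speed == 0.0:
        return [0.0, 0.0, 0.0]
    magnitude = ZONE_FRICTION[zone] * normal_force
    return [-magnitude * v / speed for v in velocity]
```

A player who cannot see the maze could then tell the corridor types apart purely by how hard it is to drag the proxy across them.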

Above, I have sketched a few ideas on how one can utilize some unique aspects of haptic feedback in games. Since we already have the technology, I think it is important that we try to move beyond games where haptic feedback is limited to vibration, resistance and indications of getting shot at, and instead look at more creative ways to use it. There are some creative solutions out there today, but I think many games could still benefit from e.g. the ideas discussed above!