During the last couple of weeks I have worked a lot on a research grant application to Vinnova (call: Digital tools) and today it was finally submitted! The application is based on a collaboration between the following partners, who all apply for funding in the application:
The proposed project is based on earlier proof-of-concept work that I performed at KTH regarding multimodal learning environments supporting collaboration between sighted and visually impaired pupils. I will of course be able to say a lot more about the involved partners and the content of the project after Vinnova has reviewed the application (in the middle of November). I really hope that the project will be funded, since I miss working within this research area and I also think that the research is important for society.
One confirmation that both our earlier and our planned research in this area is considered important is that representatives from both the Swedish Association of the Visually Impaired and the National Agency for Special Needs Education and Schools decided to take an active part in the project (in case it gets funded). I’m sure they would not do that if they didn’t see a high importance for society, and especially for the main target group, as well as an importance on a national level. Even if the project doesn’t get funded we should definitely make sure to continue the collaboration and discussions with the involved partners – there are always new opportunities for research up ahead, and as long as clear mutual benefits can be identified the collaboration should move on.
Stay tuned for more! 🙂
In my last blog post I presented an overview of my research within the eHealth domain. In this blog post I will do the same thing, but for my other main research field – multimodal interaction in virtual environments.
What have I done related to multimodal interaction?
Even though I have spent the last couple of years focusing mainly on eHealth, I have done a lot of research – especially as a Ph.D. student at the Royal Institute of Technology – related to multimodal interaction. Most of this research has focused on multimodal learning environments for collaborative task solving between sighted and visually impaired persons. Haptic feedback has played a major part in the collaborative virtual environments that I have designed and evaluated, both in lab settings and in the field, e.g. in primary schools. Quite a while ago, I wrote a blog series on haptic feedback focusing on the work I performed within the scope of my doctoral studies. Here are the links to those posts:
During my time as a postdoc at Uppsala University, I also performed some activities related to multimodal interaction. Most of this time I devoted to research grant applications and I also wrote a few conference papers. You can read a short summary of these activities here.
In total, my research on multimodal interaction has, up until today, resulted in the following five journal publications (some links lead to open access publications or pre-prints):
and the following 11 conference papers (some links lead to open access publications or pre-prints):
- Haptic communicative functions and their effects on communication in collaborative multimodal virtual environments. Swedish Cognitive Science Society Conference 2017 (Authors: Jonas Moll and Eva-Lotta Sallnäs Pysander)
- Using eye-tracking to study the effect of haptic feedback on visual focus during collaborative object managing in a multimodal virtual interface. Swedish Cognitive Science Society Conference 2017 (Authors: Jonas Moll and Emma Frid)
- An exploratory study on the effect of auditory feedback on gaze behavior in a virtual throwing task with and without haptic feedback. Sound and Music Computing 2017 (Authors: Emma Frid, Roberto Bresin, Eva-Lotta Sallnäs Pysander and Jonas Moll)
- Sonification of haptic interaction in a virtual scene. Sound and Music Computing Sweden 2014 (Authors: Emma Frid, Roberto Bresin, Jonas Moll and Eva-Lotta Sallnäs Pysander)
- Design and evaluation of interaction technology for medical team meetings. INTERACT 2011 (Authors: Alex Olwal, Oscar Frykholm, Kristina Groth and Jonas Moll)
- Pointing in multi-disciplinary medical meetings. Computer-Based Medical Systems 2011 (Authors: Eva-Lotta Sallnäs Pysander, Jonas Moll, Oscar Frykholm, Kristina Groth and Jonas Forsslund)
- Communicative Functions of Haptic Feedback. Haptic and Audio Interaction Design 2009 (Authors: Jonas Moll and Eva-Lotta Sallnäs Pysander)
- Designing a collaborative learning tool for collaboration between visually impaired and sighted pupils. NordiCHI 2008 (Author: Jonas Moll)
- Integrating Audio and Haptic Feedback in a Collaborative Virtual Environment. International Conference on Human-Computer Interaction 2007 (Authors: Yingying Huang, Jonas Moll, Eva-Lotta Sallnäs Pysander and Yngve Sundblad)
- Group work about geometrical concepts among blind and sighted pupils using haptic interfaces. World Haptics 2007 (Authors: Eva-Lotta Sallnäs Pysander, Jonas Moll and Kerstin Severinson Eklundh)
- Haptic interface for collaborative learning among sighted and visually impaired pupils in primary school. Enactive 2006 (Authors: Eva-Lotta Sallnäs Pysander, Jonas Moll and Kerstin Severinson Eklundh)
My ongoing research within multimodal interaction
Currently, there is not much going on in this research field (at least not in my own research). The only ongoing activity I’m engaged in is an extensive literature review on communication in collaborative virtual environments. The review will lead to a theoretical research article in which I discuss different technical solutions for haptic communication in the light of the research I have performed within the area up until today. I’m collaborating with my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander on this activity. I hope that it will help me in my continued research on collaboration between visually impaired and sighted pupils based on different types of tasks and learning material.
Upcoming research on multimodal interaction
As I wrote in a recent blog post, multimodal interaction, with a focus on haptic feedback, seems to be a new research area at the Centre for Empirical Research on Information Systems (CERIS) where I just started my assistant professorship. Thus, this is the research area in which I can contribute something new to the department. An area that is already represented at the department, however, is “Information Technology and Learning”, which seems to be a perfect fit in this case!
Last year, I also submitted a research grant application focusing on continued work with collaborative multimodal learning environments. Unfortunately, that one was rejected, but I’m not giving up. I will work on revising the application during the autumn and submit it as soon as a suitable call pops up. Maybe I will also have additional co-applicants from the CERIS department by then.
Today I can finally take the next step on the academic ladder, since I’m starting my new job as assistant professor in Informatics at the Centre for Empirical Research on Information Systems at Örebro University! In November 2018 I applied for the position, and in the middle of spring 2019 I was called to an interview. A few months later I was offered the position. I’m very excited about this great opportunity and I of course intend to make the most of it. After a very long blog break (mostly due to health issues and the fact that my research efforts during the spring have been rather minor), this also seems like a good opportunity to start posting again.
The assistant professorship is the first step on the so-called tenure track. It is an academic position limited to four years, but the intention is often (as in my case) to promote the assistant professor to an associate professor towards the end – a position which is not limited in time. My job includes 70% research and 30% teaching, which is quite common for assistant professorships. I’m not sure yet where e.g. service and communication (like administration, blogging and interviews) fit in.
The job as assistant professor in Informatics is a very good fit for me, since I will be able to continue to work with all my main research interests (the main theme is computer supported communication):
- eHealth
– I will continue looking at how patient accessible electronic health records (PAEHR) affect the communication between patients and care professionals. One thing I’m particularly curious about, and that is actually the focus of a project grant application currently in review, is how one can incorporate the PAEHR as a communication mediator during doctor-patient meetings. Another application in review is about the effects and implementation of psychiatry records online.
- Multimodal interaction
– I will also continue looking at how multimodal feedback (especially haptics and sound) can be used to promote collaboration between sighted and visually impaired pupils/students in group work. Most of today’s assistive technologies used in school settings are not adapted for collaboration, and this is highly problematic when it comes to the inclusion of visually impaired pupils/students in group work settings.
- Social media in higher education
– My intention is also to continue investigating how social media like Twitter and Facebook can be used as supplementary communication channels in higher education courses.
When it comes to the areas of eHealth and social media in higher education, research is already being conducted by my new colleagues at Örebro University. Multimodal interaction would however be a new research theme for the department. I will elaborate on the different themes listed above in later blog posts as work is progressing. Other research themes from the department (like computer security and ICT for development) could also be added.
I have not heard anything yet regarding the teaching, but given the department’s focus I guess I could be involved in master’s thesis supervision, human-computer interaction project courses and programming courses. I will write more about the teaching part when I know more.
The blog image that I used for this post is one of my own – I took it a few weeks ago during a week I spent in Abisko in northern Sweden.
This will be a very special year for me, since my postdoc period at Uppsala University will end in September and I currently have no idea what will happen after that. I may be able to find a way to continue my work in Uppsala, but that is far from given. But, as the heading of this post suggests (as well as my picture), there are a bunch of possibilities up ahead. I now have several project ideas, and the same goes for my current colleagues at Uppsala University and my former colleagues at KTH. On top of this, there are two interesting assistant professorships and three associate professorships to apply for!
When it comes to open positions, all the ones I have found are in the area of healthcare/eHealth (two of these, from Uppsala University, are technically about information systems in general, but can easily be angled towards health systems). One of the open positions belongs to KTH, two to Uppsala University and two to Örebro University. I will write more about these when I have applied for the respective positions. The downside here is that I will probably not be able to work with multimodal interaction if given one of these positions, but on the other hand eHealth is an area that I am really interested in.
Even though it would be great to get a semi-secure assistant professorship, getting research projects would be even more interesting, since these often run for four years (the same duration as a regular assistant professorship). Research projects that you define yourself, in collaboration with colleagues, are probably also even more in line with your main interests. The current plan is that I will be the main applicant on one project application to Forte (the Swedish Research Council for Health, Working Life and Welfare) and a co-applicant on one project application to VR (the Swedish Research Council), one project application to Vinnova (the challenge driven innovation call) and two project applications to AFA Försäkringar (AFA Insurance). On top of that, I will also be a co-applicant on a research program application to Riksbankens Jubileumsfond and one grant application for Interdisciplinary Research Environments to VR. Four of the applications concern collaboration in multimodal environments and three concern eHealth systems. The ideal result here would be to get one project/program in each research area, since that would make it possible for me to continue working with my two favorite topics. One can always hope…
If given the choice I would pick the research program and the interdisciplinary research environment. By the way, fantastic things happen in the world of research applications from time to time – during autumn 2016 the researcher in charge of our HTO group at Uppsala University, Åsa Cajander, got three project grants during one week! One side effect was that our research group moved to a larger office area to make room for new collaborators. If we get that lucky this time around I guess they might have to build a new house for us… 😉
Yesterday I blogged about a poster and a conference paper that Emma Frid and I developed for the SweCog conference in Uppsala. In this post I will focus on the second poster and paper that Eva-Lotta Sallnäs Pysander and I developed for the same conference.
The poster shown in the picture above, and even more so the paper, summarizes some of the main points from my doctoral studies. My main focus during those years was collaboration in multimodal virtual environments, with special emphasis on how haptic feedback can be used for communicative purposes. Mediated haptic communication has been studied for quite some time, but my specific contribution has been to develop and test new functions for two-way haptic communication (see the short descriptions of the functions on the poster), and also to adapt some already developed ones to make them work better when a sighted person collaborates with a severely visually impaired one in a collaborative virtual environment. There is real potential in these kinds of functions when it comes to collaboration between sighted and visually impaired users – the haptic feedback not only enables the establishment of a common ground about the interface but also effective two-way communication (see examples of results on the poster above). This is very important for the inclusion of visually impaired persons in group work. The example study is reported in much more depth in this article.
Even though the poster and paper include summaries of work already performed and reported, we are in this case even more explicit about the connection to other kinds of haptic communicative functions. This conclusion also takes the work to the next level:
We argue that for effective collaboration and communication to take place in virtual environments by means of haptic feedback the haptic functions need to be designed as to allow for reciprocal exchange of information. That is, both users need continuous feedback from each other during e.g. a guiding process or joint object handling.
The conference paper, on which the above poster is based, can be found here.
As I wrote in an earlier blog post, I got two posters accepted to the SweCog 2017 conference in Uppsala, October 26-27. Unfortunately, I got sick right before the conference, so I couldn’t attend myself. The posters were, however, shown during the poster session.
The image above shows one of the posters – the one I created together with my KTH colleague Emma Frid. The study presented in the poster builds on the study I wrote about here, in which Eva-Lotta Sallnäs Pysander and Roberto Bresin also participated. In the original study we found indications that gaze behaviour could be affected by haptic and audio feedback in a single-user setting. In this new collaborative study presented in the poster, where we used a similar interface, we wanted to investigate whether gaze behaviour can be affected by haptic feedback during collaborative object managing.
We have not performed the real experiments yet, but results from a pilot study with a few pairs of users (some worked in a non-haptic version of the interface and some in a haptic version) indicated that haptic feedback could have an effect on gaze behaviour (see e.g. the figures presented on the poster above). The results are not significant, but still interesting enough to make it worth running similar experiments with many more participants. A future step could also be to investigate how audio feedback (and/or combinations of haptic and audio feedback) affects gaze behaviour during collaborative task solving.
The poster above summarizes the work done. More information can be found in the published conference abstract which you can find here.
In an earlier blog post I wrote about my preparations for the Swedish Cognitive Science Society (SweCog) 2017 conference. My plan was to submit at least two papers to that conference and that was exactly what I did. One of the papers, “Using Eye-Tracking to Study the Effect of Haptic Feedback on Visual Focus During Collaborative Object Managing in a Multimodal Virtual Interface” I wrote together with Emma Frid and the other “Haptic communicative functions and their effects on communication in collaborative multimodal virtual environments” I wrote together with Eva-Lotta Sallnäs Pysander. I was first author on both since I led the work and did most of the writing. Earlier this week I got two emails from the conference organizers confirming that both papers had been accepted as posters!
When Eva-Lotta and I submitted the papers (you could only submit one per person) we indicated that we were aiming for oral presentations, but both were “downgraded” to posters after the reviews. When it comes to the first paper, written with Emma, I can understand it, since we were reporting on a pilot study while quite a few papers submitted by other researchers reported on full-scale experiments and evaluations. The other one, on haptic communicative functions, was more theoretical in nature, and in that case I think the main problem was the 500-word limit – we couldn’t really elaborate enough on our main findings when most of the space had to be used to define and explain haptic communicative functions. Anyhow, I’m very happy that the papers were accepted and that we will be able to discuss our work with others during the conference.
The second confirmation email, about the paper on haptic communicative functions, actually included an interesting twist – one of the reviewers recommended that the paper should be presented by means of a live demo during the poster session! That really came as a surprise (a positive one) and the organizers were very willing to work with us to make the live demo happen. Unfortunately, one problem is that the studies referenced in the paper (an evaluation and an experiment, respectively, during which pairs of users collaborated by means of haptic and audio communicative functions – see this and this preprint article) used virtual environments based on outdated APIs that no longer work. I’m not sure that I can implement the environments using the newer haptics API Chai3D in time for the conference. But, no matter what, we will still have the poster and the possibility to discuss and explain our findings.
So, the only thing remaining now (apart from trying to get a demo working) is to create two informative posters. After the conference I will get back to this topic and elaborate some more on the work presented on the two posters, so expect more posts about SweCog 2017 and my contributions to it!