communication · Pedagogical development · Social media in higher education

Time to start a new pedagogical project!

In earlier blog posts I have mentioned a pedagogical project I was involved in, related to the use of Twitter as a communication channel in higher education courses. I have been a little vague about it, since we are still waiting for an article to be published. Pernilla Josefsson, a Ph.D. student in Media Technology at KTH who led the Twitter study, and I started discussing the possibility of conducting a follow-up study based on Facebook more than a year ago, but we could not find the time for a new study.

Thanks to a pedagogical project course given at Uppsala University, which started September 1, we will now finally be able to start a one-year pedagogical project on the use of Facebook as a communication medium in higher education! The idea behind the project course is to give teachers pedagogical course credits (= development time) to dig deep into a pedagogical development area of their own choice. Those who follow the course are expected to spend three weeks on the project during the period September 2017 – May 2018.

When I came to the first meeting it turned out that I was the only course participant this year! I was also quite surprised by the fact that the person responsible for the course, Amelie Hössjer, had research and teaching interests very similar to my own. During the first meeting we discussed the idea that Pernilla and I had prepared in advance, as well as, e.g., the course goals. The questions discussed concerned what could be measured, the benefits and risks of using teacher-administered Facebook groups, and our role as researchers – should we be one of the teachers, or even the course leader, in the studied group, or should we only act as passive observers?

I have actually used Facebook groups (in which I have been the administrator) as a complement to other communication channels for several years in courses I have led at KTH. I usually invite all involved teachers and all students directly after the first lecture. I have very positive experiences of using Facebook in this way, but I have never used the resulting communication in any research.

The next steps are for Pernilla and me to discuss the input from the first course meeting and to choose a target course. I’m really looking forward to the new pedagogical project and to once again get the opportunity to collaborate with Pernilla! 

Academic writing · communication · Council · DOME · eHealth · Pedagogical development · Pedagogy

Recently applied for an associate professorship in implementation research

Last Sunday, I finally submitted an application for a position as associate professor in implementation research (didn’t have much of a choice since that was the last day to submit)! One positive outcome, apart from the obvious, is that I really had to think through what different roles I have and how I can make use of them. Since I have not written about all of them on this blog, I will list the different roles I came up with here (some of them will have follow-ups in more focused posts):

  • Researcher in multimodal communication and interaction: I have already written about my thesis, and quite a few other blog posts, about haptics as an interaction modality. My main focus in this role has been to study how different modality combinations affect collaboration and communication in collaborative virtual environments.
  • E-health researcher: I have already written quite a lot about the studies on patient-accessible electronic health records that I have been leading since I started my postdoc. What I have not yet written about is my earlier contact with healthcare – a quite intense collaboration with physicians at the Gastro department at Karolinska Institutet during about two years of my doctoral studies. I will definitely write about that project later on.
  • Pedagogical development researcher: A role I have not written that many posts about yet. Over an extended period of time I took part in, e.g., a study about Twitter use in a higher education course. I will come back to this when the paper has gone through the review process.
  • Teacher: Another role I haven't written that much about. My teaching has focused on written and verbal communication in engineering sciences, haptics and human-computer interaction. I will definitely come back with blog posts on this topic, especially when it comes to master's thesis supervision – my favorite teaching role.
  • Software developer: I have not written that much about this role either, since it was quite a while ago that I actually developed an application. My focus in this case has been on haptic interfaces and haptic collaborative functions.
  • Member of eHealth council: This is the newest role, which I wrote a blog post about a while ago – I represent "Education" and "patients" in the eHealth council at the National Board of Health and Welfare.
  • Research network member: I have written about the DOME consortium several times, but I have not yet written about my participation in the "Nordic Network for ICT and Disabilities", which specializes in assistive technology for people with deafblindness. I will introduce that network more thoroughly in its own blog post.
  • Patient: I have been a regular patient for more than a decade, and I have already drawn on that experience in my own eHealth research, e.g. at conferences. This is why I include it as a "role" in this list. There will be plenty of blog posts from the patient's perspective on this blog – that's for sure!
  • Blogger: No comment… 🙂

I might have missed a few roles, but I think these are the big ones, at least for the moment. As I said earlier, not all of these are relevant for the position in implementation research, but I started thinking about all of them as I was writing the quite extensive application. Writing this type of application forces you to really think about what you have done and what kinds of roles you have taken, and I found it rewarding to reflect on this.

Earlier, when I had written these kinds of applications, I let them rest in peace and just waited for the decision, but this time I'm not going to leave what I wrote behind me and just hope for the best. This time, I will try to transform my sketched research ideas into funding applications as soon as possible. There will surely be more posts about that process!

communication · Group work · Haptics

More about my work with haptic communicative functions in collaborative virtual environments


I’m currently in the process of writing a job application for an associate professorship, and to make sure I don’t miss anything I recently browsed through my old conference folders to find the articles to append. When I came to EuroHaptics 2010 I was reminded about something I had completely forgotten – I was actually one of the organizers behind a workshop at that conference! I spent quite a lot of time preparing for the workshop, which focused on haptic communicative functions, but the weekend before the conference I got very sick and was forced to cancel my participation. I will take this opportunity to briefly introduce the workshop and discuss some of my work prior to it. My earlier blog posts on haptic feedback as an interaction modality were the following:

It was a shame that I could not attend the workshop/conference in Amsterdam, since the workshop was based on my work on collaboration in multimodal environments up to that point. It would have been the perfect opportunity to discuss the work performed and to get input regarding future work in the area. We described the focus of the workshop in the following way:

In this workshop, concrete examples will be presented and discussed in terms of how the touch modality can support communication and collaboration. Also, the technical challenges of distributed haptic feedback will be addressed. The target audience of the workshop is researchers and practitioners focusing on haptic feedback supporting people in settings where more than one user is involved. We invite other researchers and practitioners to share their research and experience from their different projects focusing specifically on the collaborative perspective. It might be that the collaborative aspects in your project have not yet been addressed. In that case, interesting collaborative aspects can be identified during the discussions in this workshop.

Quite a lot of work was performed by me and my “multimodal colleagues” 🙂 prior to the workshop. First of all, I had performed my master’s thesis work back in 2006, which focused on collaboration between visually impaired and sighted pupils in elementary school. Evaluations were performed in schools, where visually impaired and sighted pupils worked together in dynamic collaborative environments in which objects could be moved. During that work, and especially during a re-analysis performed during my first year as a Ph.D. student (2008), I realized that communicative functions based on haptic feedback had real potential, both for supporting collaborative work and for supporting the inclusion of visually impaired pupils in group work with sighted peers. It became especially clear that haptic functions for guiding (holding on to the same object or holding on to the peer’s proxy) can replace verbal guidance to a large extent.

Imagine a situation where you need to guide a visually impaired pupil to a particular place in a virtual environment. If you only have visual feedback to rely on when establishing a common frame of reference, you need to talk a lot, along the lines of “go down, more down, …no, too much, go back…, now to the right…no, not down, up again… here it is!”. If you have haptic feedback available you can simply grab the other person’s proxy, move the visually impaired peer to the right place and just say “here”. Needless to say, haptic feedback affects the dialogue between collaborators in this case. If you want to learn more about this explorative study you can read the journal article we finalized a few years after the study.
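To make the guiding idea a little more concrete, it can be thought of as a spring that couples the guided user’s haptic device to the guide’s proxy. The Python sketch below is my own illustration with made-up stiffness and force-cap values – it is not the code from our study software:

```python
import numpy as np

def guiding_force(guide_pos, follower_pos, stiffness=200.0, max_force=3.0):
    """Spring-like pull rendered on the follower's haptic device, drawing it
    toward the guide's proxy. Stiffness (N/m) and force cap (N) are illustrative."""
    offset = np.asarray(guide_pos, dtype=float) - np.asarray(follower_pos, dtype=float)
    force = stiffness * offset                   # Hooke's law pull toward the guide
    magnitude = np.linalg.norm(force)
    if magnitude > max_force:                    # clamp so the device stays stable
        force *= max_force / magnitude
    return force

# Example: the guide's proxy is 2 cm to the right of the follower's proxy
print(guiding_force([0.02, 0.0, 0.0], [0.0, 0.0, 0.0]))   # capped pull to the right
```

A force computation like this would typically be re-evaluated in the haptic rendering loop (usually around 1 kHz), using the current positions of the two proxies in every iteration.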

One problem that was evident from the evaluations with the visually impaired and sighted pupils was that the visually impaired pupil was not aware of what the sighted pupil was doing when haptic guiding was not used. This is why we performed a follow-up study in which we added sound cues to the above-mentioned dynamic interface to provide feedback on actions taken in the interface (e.g. grasping and putting down objects). We compared the new visual/haptic/audio version to the original visual/haptic one. We showed that the dialogue between the collaborators differed depending on which program version they worked in, and it was also clear that the work was more efficient (measured as time to complete the task) in the visual/haptic/audio version. Once again, we could clearly see how access to haptic feedback influenced the communication. You can read more about this study in this article.
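Just to illustrate the mechanism, the audio version can be thought of as a simple mapping from interface events to short sound cues. The event names and sound files below are invented for the sake of the example and are not taken from our study software:

```python
# Hypothetical mapping from workspace events to short sound cues ("earcons"),
# so that a visually impaired user hears what the sighted peer is doing.
SOUND_CUES = {
    "object_grasped": "grasp.wav",
    "object_released": "release.wav",
}

def on_interface_event(event_name, play_sound):
    """Play the cue registered for this event, if any."""
    cue = SOUND_CUES.get(event_name)
    if cue is not None:
        play_sound(cue)

# Stand-in playback function, just for the example
on_interface_event("object_grasped", play_sound=lambda f: print("playing", f))
```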

These two studies resulted in a theoretical conference paper presented at HAID (Haptic and Audio Interaction Design) 2009, in which we tried to develop a kind of conceptual model of haptic communicative functions’ effects on the dialogue between collaborators. This was my first attempt at a meta-analysis of the insights gained within this research area. The paper summarizes and discusses all the effects on the dialogue I had seen in my studies up to that point. The paper made quite an impression – it is still the most cited of all the papers I have produced to date! At that point we were still quite unique when it came to haptic collaborative environments, and I still think I’m one of the very few researchers who study the effect of haptic and audio feedback on the dialogue between collaborators.

The HAID conference paper laid the groundwork for the workshop described at the beginning of this post, and during the workshop the idea of studying collaboration between two sighted persons was introduced and discussed. Next time I write about my earlier work I will introduce my latest study on collaborative multimodal interfaces, which showed that haptic and audio feedback indeed affect the dialogue between sighted persons as well!

 

communication · Pedagogical development · Pedagogy

About activating students at the university

Blåsenhus

About two weeks ago I wrote a blog post about a pedagogical course I took, which focused on supervising oral presentations. As it happens, I finished another pedagogical course the week after. That course (held in the Blåsenhus building seen in the image above) focused on different methods for activating students in the classroom, as well as methods for making sure that the students really engage with all the course material. The name of this 1.5 hp course (given only in Swedish) is Aktiverande undervisningsformer (“activating forms of teaching”), and it included three whole course days filled with a mix of activities and lectures, plus about two days of individual work.

As part of the individual work we were to come up with a new way to activate students in one of our own courses. I will just provide a short version of my individual assignment here. I chose the communication course given to first-year computer science students at KTH, since I have worked with it for almost 10 years and developed most of its content during the years 2008–2015. The students have always liked the practical exercises on oral presentations and writing (the core parts of the course), but they have never really appreciated the more theoretical lectures. The course literature has never really been appreciated either, and most students don’t buy the course book.

In my individual assignment I therefore focused on alternative ways to present the theoretical material, and I think the course inspired me to find a good solution (or at least a better one than lectures). My new idea is that some of the lectures should be transformed into literature seminars, in which each group of students is given a chapter of the course book, and a scientific article related to the chapter, to present to the others. A few open-ended questions could be given to each group to help them prepare and to make sure that the most important aspects are covered in the respective presentations. Since the students are already divided into exercise groups of about 25 students each, the same groupings could be used for the seminars. I also proposed a short quiz handed out at the end, with a few questions from each part covered during the respective seminars. This setup would force the students to read the course book, and they would most probably engage more with the material. They would probably also learn more when forced to explain the material to others. The quiz at the end, which could add bonus points to the final grade, would hopefully ensure that everyone listens actively to all the presentations.

I’m not sure that the idea presented above is feasible and appropriate, but during a presentation at the end of the course, where all participants presented the main points of their individual work, both the other course participants and the teachers thought that the idea was good and should be tested in practice. I’m not working with the communication course any more, but I will definitely forward my idea to those who are!

I thought I had already tried most methods when I entered the course, but I still left it with a lot of new insights. I especially enjoyed the parts about problem-based learning (PBL) and the flipped classroom, since I had never used those methods in practice. The course was very well structured and we were given a lot of time to practice quite a few methods while working in groups we were assigned to at the start of the course. The course really inspired me to rethink my own teaching practice, and the transformation of lectures into literature seminars (where the students present theory and practice oral presentations at the same time) is an example of that. Now I just need to find a course where I can implement the flipped classroom approach, because I’m very curious about that method! 🙂

 

communication · Haptics

Haptic communicative functions


This is the fifth post in my blog series about haptics as an interaction modality, and this time I start focusing on my main area – collaboration in multimodal interfaces. In earlier posts I have written about:

Ever since I started working on my master’s thesis project back in 2005, I have been focusing on collaboration in multimodal interfaces, and specifically on how one can design haptic functions for collaboration and communication between two users working in the same environment. Collaborative haptic interfaces have been around for quite some time – one of the first examples is an arm-wrestling system using specialized hardware that enabled two users to arm wrestle over a distance. Other well-known early examples enabling a kind of mediated social touch are HandJive and InTouch. One thing these early systems have in common is that they use specialized hardware which is quite limited in scope. More examples of early systems and in-depth discussions about mediated social touch can be found in this excellent review.

During the last two decades, more widely applicable haptic functions for collaboration in virtual environments have been developed. Such functions can, for example, enable two users to feel each other’s forces on jointly held objects, or make it possible to “shake hands” by utilizing magnetic forces between the two users’ proxies in the virtual environment. One of the earliest examples of an environment supporting these kinds of collaborative haptic functions, or guiding functions as I usually call them, is the collaborative text editor developed by Oakley et al. Apart from the obvious functions needed to edit a document, each user could also use a Phantom device to find the positions of, and/or communicate with, the co-authors. One example of a haptic function was a kind of grabbing function (similar to the shake-hands function mentioned above), making it possible to grab another user’s proxy and move it to another part of the document. Other examples were a “locate” function, which dragged one’s own proxy to another user’s proxy with a constant force, and a “come here” function, which dragged another user’s proxy to one’s own position.
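As I understand them, the “locate” and “come here” functions essentially amount to a constant-magnitude force along the line between two proxies. The Python sketch below is my own illustration of that idea, not Oakley et al.’s implementation, and the force magnitude is an assumed value:

```python
import numpy as np

def constant_pull(from_pos, toward_pos, magnitude=1.0):
    """Constant-magnitude force (in newtons, illustrative) pulling the device
    at from_pos toward the position toward_pos."""
    direction = np.asarray(toward_pos, dtype=float) - np.asarray(from_pos, dtype=float)
    distance = np.linalg.norm(direction)
    if distance < 1e-6:              # already at the target; render no force
        return np.zeros(3)
    return magnitude * direction / distance

my_pos, peer_pos = [0.0, 0.0, 0.0], [0.05, 0.02, 0.0]

# "locate": my own device is pulled toward the peer's proxy
print(constant_pull(my_pos, peer_pos))

# "come here": the same computation applied to the peer's device,
# pulling it toward my position instead
print(constant_pull(peer_pos, my_pos))
```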

Later examples of virtual environments enabling these kinds of collaborative haptic functions are a collaborative drawing application developed at CERTEC, Lund University, and applications for joint handling of objects evaluated by my former Ph.D. supervisor Eva-Lotta Sallnäs Pysander during her years as a Ph.D. student (thesis link). The latter examples are the most relevant for me, since much of my work as a Ph.D. student at KTH focused on collaborative interfaces in which two users can work together to move and place virtual objects. I will get back to my own applications later on, in a blog post about haptic interfaces supporting collaboration between visually impaired and sighted persons.

Above, I have just provided a few examples of collaborative haptic functions that can be used to control the forces provided by one or more haptic devices. The functions I find most interesting to explore are the ones that enable physical interaction between two haptic devices, in that both users can feel each other’s forces, either on jointly held objects or when holding on to each other’s proxies. These kinds of functions enable interesting means of communicating physically in virtual environments, especially in cases where the users are not able to talk to each other face-to-face or point at the screen. Imagine, for example, a scenario in which two users are exploring different parts of a complex multimodal interface showing distributed data clusters (what those clusters represent is not important here). In such an interface it would be very cumbersome to try to describe to the other person where a certain interesting cluster is located. In this case the user who wants to show something he or she found can grab the other user’s proxy and drag him or her to the relevant cluster. This possibility would probably simplify communication about the explored dataset (explaining where to find details in a complex interface can be extremely cumbersome). This is, of course, just a made-up example, but it can be applied to many scenarios, especially ones where important parts of an interface are visually occluded. I will get back to joint handling of objects in a later blog post in this series, and I will discuss the potential of using haptic feedback when exploring huge datasets in one of the upcoming posts.

Last, it is interesting to contrast functions enabling physical interaction between two haptic devices with functions that only enable one-way communication (like the “locate” and “come here” functions mentioned above). A one-way function enables some kind of communication in that one person’s proxy is “dragged” to another’s, but the force is only applied to one of the users – there is, for example, no way for the user doing the dragging to tell whether the one being dragged actually wants to follow. When using a two-way haptic communicative function, both users can feel forces from each other, enabling richer communication. Apart from enabling joint handling of objects, where the coordination of movement is made possible by both users feeling the other’s forces on the jointly held object, two-way haptic communicative functions make it possible to, e.g., clearly communicate to the other user that you do not want to be dragged somewhere. The potential these functions can have in situations where visually impaired and sighted users collaborate in virtual environments will be the topic of my next post in this series!
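To make the contrast concrete, a two-way function can be sketched as a spring between the two proxies where each device renders the reaction force of the other. This is a toy Python example with an assumed stiffness value, not the actual code from any of the systems mentioned above:

```python
import numpy as np

def two_way_coupling(pos_a, pos_b, stiffness=150.0):
    """Spring between two proxies: both devices render equal and opposite
    forces, so each user can feel whether the other resists or follows."""
    offset = np.asarray(pos_b, dtype=float) - np.asarray(pos_a, dtype=float)
    force_on_a = stiffness * offset    # A is pulled toward B
    force_on_b = -force_on_a           # B feels the reaction force
    return force_on_a, force_on_b

f_a, f_b = two_way_coupling([0.0, 0.0, 0.0], [0.03, 0.0, 0.0])
print(f_a, f_b)   # equal magnitude, opposite direction
```

In a one-way function, only one of these two forces would actually be rendered, which is exactly why the user doing the dragging gets no feedback about whether the other user resists.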