CHItaly 2017: Marianna Obrist

The second day of CHItaly 2017 opened with a keynote speech by Marianna Obrist, Reader at the Department of Informatics, School of Engineering and Informatics at the University of Sussex, UK, who leads the Sussex Computer Human Interaction Lab (SCHI ‘sky’ Lab), a research group dedicated to the investigation of multisensory experiences. This was indeed the topic of her talk: how we can enrich human-technology interactions by exploiting all human senses and designing multisensory interfaces. That is, not only focusing on the most commonly addressed senses, such as vision and hearing, but also on our senses of touch, smell, and taste.

In her work, Marianna Obrist addresses this challenge systematically, considering three different spaces:

  • Perceptual space: what perceptual properties can we exploit?
  • Experiential space: what experiences can we create and reuse, and in what contexts?
  • Technological space: which technology can we exploit to convey these experiences?

At the SCHI Lab, several projects have addressed these issues, mainly in the fields of gaming experiences and augmented movies, but also in more “serious” applications, such as sensory substitution devices that involve not only the dominant senses of vision and hearing, but also touch and smell.

Mid-air haptic experiences

Think for example about the feeling of raindrops on your hand: how can we reproduce such an experience with the help of technology, and use it to enrich other experiences?
Back in 2013, a novel technology was developed at the University of Bristol: UltraHaptics, a technology to convey multi-point mid-air haptic feedback above interactive surfaces. Now, UltraHaptics is a company that has received multiple awards for innovation, but back then, Marianna told us, they had this cool technology, working perfectly, yet they were not sure how to use it to unfold novel and engaging mid-air haptic experiences.

We know that we don’t feel the same tactile sensations in every part of our body. So, at the SCHI Lab they first created a full body map to understand where they could use this technology. They found that tactile sensations are poorer on the hairy parts of our body, but also that they are richer on the hand, face, lips, and other “hidden” parts Marianna preferred not to mention in her talk 😉

So, the opportunities for using these tactile sensations in virtual environments, or in medical, rehabilitation, and psychological contexts, such as with people on the autism spectrum, are countless.


Another connection is also possible: the link between perception and emotion, since if you can perceive something, you can feel an emotion about it. So the SCHI Lab is now also working on the theory of emotions, mapping them in a two-dimensional space such as Russell’s circumplex model, which places emotions in a circular space defined by two axes: valence (pleasant vs. unpleasant) and arousal (low vs. high).
They found that you can actually map specific emotions onto a specific perceptual space of tactile parameters (e.g., frequency, intensity) to create emotional tactile patterns.
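To make this idea more concrete, here is a minimal sketch, purely illustrative and not the SCHI Lab’s actual mapping, of how a point in the valence/arousal plane could be translated into parameters of a mid-air tactile pattern (the emotion coordinates, parameter names, and numeric ranges below are all assumptions):

```python
# Illustrative sketch: mapping emotions placed in Russell's circumplex model
# (valence, arousal) onto hypothetical mid-air haptic parameters.
from dataclasses import dataclass

@dataclass
class TactilePattern:
    frequency_hz: float   # modulation frequency of the focal point (assumed range)
    intensity: float      # normalised output intensity, 0.0-1.0 (assumed range)

# Hypothetical positions of a few emotions in the valence/arousal plane,
# with each axis normalised to [-1, 1].
EMOTIONS = {
    "calm":    (0.6, -0.6),   # pleasant, low arousal
    "excited": (0.7,  0.8),   # pleasant, high arousal
    "tense":   (-0.6, 0.7),   # unpleasant, high arousal
}

def emotion_to_pattern(valence: float, arousal: float) -> TactilePattern:
    """Higher arousal -> faster modulation; more pleasant valence -> gentler intensity."""
    frequency = 50 + (arousal + 1) / 2 * 200          # 50-250 Hz
    intensity = 0.3 + (1 - (valence + 1) / 2) * 0.7   # 0.3-1.0
    return TactilePattern(round(frequency, 1), round(intensity, 2))

for name, (v, a) in EMOTIONS.items():
    print(name, emotion_to_pattern(v, a))
```

The point of such a mapping is that each emotion becomes a position in a continuous space, so tactile parameters can be interpolated between emotions rather than picked from a fixed lookup table.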

An interesting field of application of this work is that of movies. In this regard, they worked with artists and producers to understand what kind of tactile experiences can be provided to the audience, and they experimented with a one-minute movie:

Well, this movie, enriched with haptics, is so much better! Indeed, haptics offers a whole new way to create novel movie experiences.
But what else can we do with this?

Another promising field of application and research is that of sensory substitution: here one question is how we can exploit this technology to allow people with disabilities to understand complex scientific principles, for example through the sense of touch. This is one of the research issues that the SCHI Lab will address in the near future.

Marianna then talked about a few other applications, involving art for example. One of these is the augmented experience they created at Tate Sensorium, an exhibition at Tate Britain, again in collaboration with UltraHaptics. At the exhibition, the team used mid-air haptics to project ultrasound wave patterns onto the palms of Tate visitors.


Another project involves taste. For instance, Marianna’s lab worked on a project with a maître chocolatier. Here one question arises: why should we integrate taste in HCI? She showed a short video presenting part of the work done in the project, where small quantities of ingredients were combined into small “drops” of coffee, chocolate or hamburger, which tasted just like the original food and could be used to try some food before actually eating it. The TastyFloats project will be presented in October, so if you’re interested you will find more information about it online soon.

Again, emotions play a major role here, because taste is closely connected to emotions, and this is linked to our evolution. Indeed, in evolutionary terms, taste can be seen as a decision-making mechanism that works through emotions: during evolution, negative emotions associated with certain flavours prevented humans from ingesting something that could be poisonous.


There are five basic tastes: sweet, sour, bitter, salty and umami, the last of which is a savoury/salty combination, typical for instance of soy sauce. At the SCHI Lab, they investigated this spectrum of taste using a psycho-phenomenological approach, and they came up with a set of characteristics of taste experiences:


  • Temporal dimension: intensity, movement, duration;
  • Affective dimension: pleasant, unpleasant, neutral;
  • Embodied dimension: how the taste spreads in the mouth.
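As a rough illustration, these three dimensions could be captured in a small data structure like the one sketched below; the field names and example values are my own, not the SCHI Lab’s published coding scheme:

```python
# Illustrative sketch: representing a taste experience along the temporal,
# affective, and embodied dimensions described above.
from dataclasses import dataclass
from typing import Literal

@dataclass
class TasteExperience:
    # Temporal dimension: intensity, movement, duration
    intensity: float    # 0.0 (barely noticeable) to 1.0 (very intense)
    movement: str       # e.g. "builds up slowly", "hits immediately"
    duration_s: float   # perceived duration in seconds
    # Affective dimension
    affect: Literal["pleasant", "unpleasant", "neutral"]
    # Embodied dimension: how the taste spreads in the mouth
    spread: str         # e.g. "tip of the tongue", "whole mouth"

dark_chocolate = TasteExperience(
    intensity=0.8,
    movement="builds up slowly",
    duration_s=20.0,
    affect="pleasant",
    spread="whole mouth",
)
print(dark_chocolate)
```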

Another interesting sense that can be exploited to create compelling interactive experiences is the sense of smell. Smell is pervasive in our perception: 75 to 95% of what we commonly think of as taste actually comes from the sense of smell. Moreover, in the UK about 5% of people have a smell disorder. So, there are interesting opportunities to work with the sense of smell in HCI. In 2014, Obrist and colleagues categorized the experiences that can be associated with smell, to think about what can be done with it in HCI.

Think for example about what we can do with smell:

  • It allows us to retrieve happy memories
  • It excites us about new things
  • It builds up our expectations (think of baking a cake, you can smell it and can’t wait to eat it)
  • It warns of danger (gas smell, smoke)
  • It can invade our personal space (bad smell)
  • It helps us to relax

However, despite the variety of scent-delivery technology available, there is currently a lack of standards and controllable parameters.
So, there is a need to build an HCI community working on these issues, to define common principles, standards and parameters, and ultimately to create tools that exploit this amazing technology.
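Just to illustrate what such shared, controllable parameters might eventually look like, here is a purely speculative sketch; none of these names come from an existing standard or real device API:

```python
# Speculative sketch: a minimal, device-independent description of a scent
# stimulus. All field names and values are invented for illustration.
SCENT_PARAMETERS = {
    "odorant": "vanilla",     # identifier of the scent to release
    "concentration": 0.6,     # relative concentration, 0.0-1.0
    "onset_delay_s": 0.5,     # delay before release, in seconds
    "duration_s": 3.0,        # how long the scent is emitted
    "decay_s": 10.0,          # expected time for the scent to clear
}

def describe(stimulus: dict) -> str:
    """Render a human-readable summary of a scent stimulus."""
    return (f"{stimulus['odorant']} at {stimulus['concentration']:.0%} "
            f"for {stimulus['duration_s']}s (after {stimulus['onset_delay_s']}s, "
            f"clearing in ~{stimulus['decay_s']}s)")

print(describe(SCENT_PARAMETERS))
```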

After Marianna Obrist’s inspiring talk, the conference continued with five more sessions about augmented & virtual reality, interactive experiences, games, learning, and making and visualizing.
This was an interesting edition of CHItaly. I was really glad to see that it brought together different strands of research in a very interdisciplinary fashion: I saw different disciplines converging in novel HCI research, not only computer science, but also art, physics, engineering, and architecture, as well as non-conventional applications of HCI, such as in the skateboarding community. Altogether, this interdisciplinary combination promoted an active discussion on the future of HCI, and I really hope to have the chance to participate again two years from now!
