

  Michal Rinott (Israel)
Expertise   Cognitive psychology / user interface conceptual design / prototyping
Education   Masters in Interaction Design, Interaction Design Institute Ivrea, Ivrea, Italy (2004)
MA in Cognitive Psychology, Tel Aviv University, Tel Aviv, Israel (2001)
BA in Psychology and Economics, Tel Aviv University, Tel Aviv, Israel (1996)
Bio   For the past five years, Michal was project manager and head of the web applications group at Pamam, a human factors and GUI design company in Israel. She created conceptual designs for software systems and implemented them as prototypes. Her MA thesis studied people's perception of experiences that extend over time.
Focus   Auditory interfaces, human visual cognition and information visualisation, and the search for new areas of focus.

27 May 2004
  Where did the inspiration for your thesis come from? How did you come to design what you ended up designing?
A central inspiration for my thesis, which is about audio-tactile interactions, is my background in cognitive psychology. What brought me to this field is an interest in human cognition and in how technological products can really be suited for people: designed for how we sense, perceive, think and feel. I am not only talking about being suited for how our system works, but also really building on our capacities, on what we are good at. That way, products can feel natural not because they are simple but because they are smart. Another inspiration was my love for music and sound; I always wanted to work with sound in some way and this project was a great opportunity. Last summer I did an internship in London. I would ride the tube every day and watch people. It was amazing to see the extent to which mobile devices are present in people's lives. Many people on the tube listen to music with headphones and at the same time play around with their mobile phones.

So how did that lead to what you made?
I wondered whether mobile devices could use sound in more interesting ways. After all, we deal with these devices all the time. People are constantly writing text messages and becoming really skilled at it. Together with my advisors, Michael Kieslinger and Jan-Christoph Zoels, I asked: could that kind of interaction be richer than a set of buttons that you poke? Could a mobile device be like a musical instrument that you really learn to play? We wanted to see if sound and touch could be used with digital objects to create something that feels real in the material sense. When you use a pencil, it is a certain experience: a certain touch, look and sound that together create its 'pencilness'. We wanted to try to achieve this quality with digital objects. Mobile devices seemed a good test case because they are complex, are carried on the body, and involve both hands and ears.

When you start playing with sound, chances are that you end up with a game. So the sound has to be very meaningful to the task or the activity to get around that.
You know, sound is being used in amazing ways in many design contexts. Games are one example; people also design amazing new musical instruments. I wanted to try to bring the same energy into the daily tasks that millions of people do with mobile devices. The main project for my thesis therefore is 'SonicTexting', a system for writing with gestures and sound. I started from the task - writing on a digital device - and investigated what that really meant and how one could "sonify" that. I discovered a gestural language called 'Quikwriting' that was developed at NYU for writing on PDAs. I created a version of that gestural language and developed a sound layer and a physical device for it. Together it becomes an auditory-tactile system for writing.

Just as fast as they would write with a keyboard?
My motivation was to make SonicTexting something enjoyable, not necessarily something super-efficient. I was looking for the qualities that might create a desire for mastering this skill. In the beginning it is not easy to use, because we are not accustomed to getting only auditory and no visual feedback. But quickly people catch on to it and it becomes quite fun, and people respond to it more like a game or an instrument. Some people become pretty fast and for them there is an "expert" mode in which the sound is optimized for fast use.

How does it work?
Together with Edoardo Brambilla, the workshop manager, I designed a little physical controller I call a "Keybong". It is an oval object with a small joystick in it, and it fits in the palm of your hand. So you could write an SMS from inside your pocket. The controller is designed to support the gestures of SonicTexting: each letter is a certain gesture and each word is a collection of gestures. Moving the Keybong creates continuous sound feedback. There are different types of sound, but the main one is that as you move through the letter gestures, you hear phonemes, letter sounds. When you hear the right sound you release the joystick and the letter is written. It's all about hand-ear coordination in a synchronous loop: moving your hand, getting feedback in your ears, moving your hand according to your ears. It's fascinating to see how people realize that they need to use their ears. In the beginning they search for visual feedback, but intentionally there is none. Then they start listening, they get it, start using it and become good at it.
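
The loop described here, continuous feedback as the stick moves through letter zones, with release committing the letter, can be sketched roughly in code. This is an illustrative simplification: the zone layout, letter groups, and dead-zone threshold below are assumptions for the sketch, not the actual Quikwriting mapping used in SonicTexting.

```python
# A minimal, hypothetical sketch of a SonicTexting-style selection loop.
# Zone layout and letter groups are illustrative, not the real mapping.

CENTER = "center"

# Eight outer zones around a resting center, each holding a letter group.
ZONE_LETTERS = {
    "N": "abc", "NE": "def", "E": "ghi", "SE": "jkl",
    "S": "mno", "SW": "pqr", "W": "stu", "NW": "vwx",
}

def zone_of(x, y, dead=0.3):
    """Map a joystick position (each axis in -1..1) to a zone name."""
    if abs(x) < dead and abs(y) < dead:
        return CENTER
    ns = "N" if y > dead else ("S" if y < -dead else "")
    ew = "E" if x > dead else ("W" if x < -dead else "")
    return (ns + ew) or CENTER

def write(samples):
    """Consume a stream of (x, y) joystick samples. Each new zone entered
    shifts which phoneme the user would currently hear; returning to
    center releases the stick and commits the audible letter."""
    text = []
    path = []          # zones visited since leaving the center
    zone = CENTER
    for x, y in samples:
        new = zone_of(x, y)
        if new == zone:
            continue
        if new == CENTER:
            if path:
                group = ZONE_LETTERS[path[0]]
                text.append(group[(len(path) - 1) % len(group)])
            path = []
        else:
            path.append(new)
        zone = new
    return "".join(text)
```

In this sketch, pushing north and releasing writes 'a', while moving north, then northeast, then releasing writes 'b'. In the real system it is the continuous audio feedback, not a visual chart, that tells the user where they are in the gesture.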

Is the prototype on a computer or already on a mobile phone?
It's not currently on a mobile phone. I use a microcontroller and a computer to create the real-time sound. But when I exhibit it, the setup is such that you don't see these elements. There is just the Keybong, the sound, and a visual scheme that shows you how to make the gestures and what you wrote.

How would that work on a mobile phone?
There are a number of scenarios. An immediate one is to use the little joystick that you have on a lot of mobile phones today, though the phone would need a better-quality joystick and good sound. Alternatively, with Bluetooth, the Keybong controller could communicate wirelessly with your existing mobile phone. But actually there is no reason why it should work on a mobile phone. The Keybong can be a wireless device for writing text messages: a specific device, with a specific character, designed for a specific task. With the feature that I call 'ReadBack' you can work without a visual display: after you write a whole word it reads it back to you in phonemes. So you could use it walking down the street, for example with headphones.

Did you also make other things?
Yes. Another project, developed less fully due to lack of time, is called 'Shake'. I made it together with Mathias Dahlström. Shake explores the idea of sharing music between portable music players (like the iPod): how one could share music with other people in the immediate vicinity. We created an audio-tactile interface for this kind of music sharing. It is a language of gestures, related to the natural gestures we use when communicating. You perform these gestures with your music player in order to let someone listen to your music, or to listen to theirs. The gestures create real-time changes in the music that you are hearing, slightly modifying it to enhance the experience of the gesture and its meaning.

You are bending your hands forwards.
Yes, that is the gesture to offer my music to you, like I am offering you a piece of candy. And it sounds as if my music is trying to get out, trying to go in your direction.

And now you are bending them backwards.
That's when I am asking you to give me your music. And if I want to get rid of music I don't like, I can shake it off, and the music dies out according to my gestures. We mapped the whole vocabulary of the music-sharing interaction into gestures and sound; but there is still a lot to do on the prototype to make it complete.
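
The vocabulary described here, gestures paired with a sharing action and an audio treatment that reinforces its meaning, could be represented as a small lookup table. The gesture names, actions, and treatments below are assumptions based on the interview, not the actual prototype's vocabulary:

```python
# Hypothetical sketch of a Shake-style gesture vocabulary: each gesture
# maps to a sharing action and a real-time audio treatment. Names and
# treatments are illustrative, inferred from the interview.

GESTURES = {
    "tilt_forward":  ("offer music",   "sound swells toward the receiver"),
    "tilt_backward": ("request music", "sound pulls toward yourself"),
    "shake":         ("discard music", "music fades out with the motion"),
}

def interpret(gesture):
    """Return the (action, audio treatment) for a recognized gesture,
    or (None, None) if the gesture is not in the vocabulary."""
    return GESTURES.get(gesture, (None, None))
```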

Will you take these explorations forward now into a product, or is this research meant to provoke further explorations?
My original intent was more research oriented: I wanted to show the potential of audio-tactile interactions. I chose tasks that were relatively complex - especially 'texting', which is such a visual activity - thinking that if I could pull that off it would be a good demonstration of the potential. Which I think is very high.
However after exhibiting SonicTexting there has been some interest expressed in making it into a product, and I do think it is possible to push it forward. There are some decisions to be made; I guess the next months will tell how things will proceed.

What has made the biggest impact on you here in Ivrea? What has changed you as a person, as a designer? What left its mark?
The really amazing experience has been getting to know and work with the people here - people from different disciplines and different countries. Discovering them, their talents and strengths, and at the same time my own. When I decided to study here, I didn't realise how much this is a design school. I thought that the triangle of design, technology and social sciences would be more balanced. On the first day of the school year, Casey [Reas] gave us our first assignment: "Please draw an object from four different perspectives". I thought: "Draw?" At that moment I realized what I was going to go through. As someone from a different background - cognitive psychology - this journey into the design world has been really exciting, and the fact that I have started seeing myself as a designer is a meaningful thing for me.

What are your plans now?
It's still very open. My previous experience, both before Ivrea and during my internship at IDEO, has been in consulting firms. This is a form of work I like, but I am also curious to try others. I would like to pursue this area of sound and audio-tactility, also by applying the insights from my thesis to other areas. I'm also looking forward to dealing with other interaction design topics...

I'm sure you will find your way. Thank you.

(Interview by Mark Vanderbeeken)


Exploring the relationship between hands, ears and mobile devices
Thesis Project

Visions of Video Communications
Applied Dreams Workshop 2003

Who Controls the Controls?
Control Mania Course 2003

The Bar Plays Itself
Building as Interface Course 2003

Designing for the Body Course 2003

Auditory Comfort in Talponia
Social Business Innovation Course 2003

Physical Computing Course 2002