ETSI Workshop: Multimodal Interaction on Mobile Devices

2008 November 21

ETSI workshop, Sophia Antipolis, France
18-19 November 2008

Website: www.etsi.org
Presentation files

ETSI presentation:
The European (now global) Telecommunications Standards Institute.

Laurence Nigay, Grenoble University:
OpenInterface (OI) project investigator/leader
Application areas: games, large information spaces, and interaction design
OI is an open source framework for prototyping
Continuity between mobile and environment
A framework used to explore the design space
Repository of modalities, 56 available now.
Contains many classes of components: iPhone, SHAKE, Wiimote, Omni
OI mobile phone example
Uses the mobile phone for input only
Input can be switched between many devices easily
Gestures link the virtual world and the real world
Multimodal equivalence for the task, but not for the user ****
Passive vs active modalities
Latency can be an issue
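OpenInterface itself is a component runtime (it is not Python, and none of the names below come from its real API); this minimal sketch only illustrates the pipeline idea from the talk: a modality component emits events, an assembly step wires it to a task-level component, and swapping the input device never touches the application logic.

    # Minimal pipeline sketch in the OpenInterface spirit.
    # All class and function names here are hypothetical.

    class WiimoteComponent:
        """Stands in for a modality component from the repository."""
        def __init__(self):
            self.listeners = []

        def emit(self, gesture):
            for fn in self.listeners:
                fn(gesture)

    class MapNavigator:
        """Task-level component: consumes abstract navigation events."""
        def handle(self, gesture):
            print(f"navigate: {gesture}")

    def connect(source, sink):
        # The wiring step; in OI this is done in the assembly tool.
        source.listeners.append(sink.handle)

    # Swapping WiimoteComponent for an iPhone or SHAKE component
    # would not change MapNavigator.
    wiimote = WiimoteComponent()
    connect(wiimote, MapNavigator())
    wiimote.emit("tilt-left")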

Ingmar Kliche, Deutsche Telekom Laboratories:
Multimodal user interfaces
“No Fault Found” verdict for 1 in 7 devices returned by users
Bad usability, not hardware failure
DTL -> staff of 40-50 working on intuitive usability
Problem: no standard OS; 4-5 different ones to support
Challenges: awareness of multimodal capabilities; how do you know it’s available?
How to introduce the system to the user? Manual, training, videos (done at DTL).
How do you know which (vocal) commands are available?
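A common answer to that last question is an always-active “what can I say?” command that reads back the currently loaded grammar. A minimal sketch, with an invented grammar (not from the talk):

    # Hypothetical sketch: make the active voice grammar discoverable.
    ACTIVE_GRAMMAR = {
        "call <contact>": "start a phone call",
        "play music": "open the media player",
        "what can I say": "list the available commands",
    }

    def handle_utterance(utterance):
        if utterance == "what can I say":
            # The system can always describe its own capabilities.
            for phrase, action in ACTIVE_GRAMMAR.items():
                print(f"say '{phrase}' to {action}")
        else:
            print(f"dispatching: {utterance}")

    handle_utterance("what can I say")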

RIM representatives:
Matching multimodality to appropriate tasks/categories
Data rate vs battery life vs receiver complexity
Pushing for open standards for home network (now mostly closed and proprietary)
Take-aways from this presentation:
1) multiple radios
2) adapted to our multimodal lives
3) inputs/outputs switch depending on the task (a toy sketch follows this list)
4) harmonize with APIs
5) creativity is boundless
6) what about olfaction and taste?
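A toy illustration of points 1 and 3 taken together: pick whichever radio suits the current task and battery state. All numbers and names below are invented for illustration, not from the talk.

    # Invented figures; only the shape of the tradeoff matters.
    RADIOS = {
        # name: (data rate in kbit/s, relative energy cost per bit)
        "bluetooth": (700, 1.0),
        "wifi": (20000, 2.5),
        "cellular": (2000, 4.0),
    }

    def pick_radio(battery_low):
        if battery_low:
            # Energy first: cheapest cost per bit.
            return min(RADIOS, key=lambda name: RADIOS[name][1])
        # Otherwise speed first: highest data rate.
        return max(RADIOS, key=lambda name: RADIOS[name][0])

    print(pick_radio(battery_low=True))   # bluetooth
    print(pick_radio(battery_low=False))  # wifi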

José Rouillard, Lille:
Plasticity of interfaces
Washing machines with voice interface [ridiculous]
Multi devices
Multi channels
Multi modal
Context-awareness: is it possible?
Project with QR codes augmented with contextual information
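As I understood the project, the printed QR code stays static and the reading device augments the payload with its own context before resolving it. A standard-library sketch with invented field names:

    # Context augmentation at scan time (field names are invented).
    from urllib.parse import urlencode

    def resolve_qr(payload_url, device_context):
        # The code on the poster never changes; context comes from the reader.
        return f"{payload_url}?{urlencode(device_context)}"

    scanned = "http://example.org/poster/42"
    context = {"lang": "fr", "lat": "43.616", "lon": "7.055"}
    print(resolve_qr(scanned, context))
    # http://example.org/poster/42?lang=fr&lat=43.616&lon=7.055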

Robert van Kommer, Swisscom:
Roadmap of telephony interface
Pattern: input = search, output = list of text/images/others
Freedom
Robustness
Efficiency
Improvements
MUI ???
Instead of modeling the user, let the user model himself
Model is owned by the user
Implicit vs explicit
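If the model is owned by the user, it has to be a portable artifact the user can inspect, edit, and carry between services. A hypothetical sketch separating explicit settings from implicit observations:

    import json

    # Hypothetical user-owned profile: "explicit" entries are set by the
    # user; "implicit" ones are inferred by the system but stay visible
    # and editable rather than hidden on the operator's side.
    profile = {
        "explicit": {"language": "fr", "preferred_output": "audio"},
        "implicit": {"prefers_lists_over_maps": 0.8},  # inferred, with confidence
    }

    def export_profile(profile):
        # The user, not the operator, holds and transfers this file.
        return json.dumps(profile, indent=2)

    print(export_profile(profile))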

Charlotte Magnusson, Lund University, Sweden:
Usability
Get rid of mistakes early
WAI guidelines are not perfect; lots of problems remain even if we follow them.
Gesture
0.2-20 ms for haptics; 5 kHz update rate for state-of-the-art haptics (a 5 kHz loop corresponds to a 0.2 ms period)

Françoise Petersen, France:
Standardization of the user preferences
Explicit vs implicit

Multitel:
Multimodal Hub, a newer implementation of OI that runs entirely on the mobile device
Mobile barrier: no support for loading external libraries at runtime.

Roope Takala, Nokia, Finland:
Standardization Issues for Mobile Haptics
Natural human mobility systems for a fused physical and digital world.
Key enabler is multimodality
We need a notation system for haptics
Establish what you want to achieve -> the description should be independent of the device or the hardware implementation
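A device-independent haptic notation might look like a declarative effect description that each device maps onto its own actuators. A hypothetical schema, not any existing standard:

    # Hypothetical device-independent effect description: the envelope
    # states the intent; each backend decides how to realize it.
    EFFECT = {
        "name": "confirm-tap",
        "envelope": [(0.000, 0.0), (0.005, 1.0), (0.020, 0.0)],  # (time s, intensity 0..1)
    }

    def render_on_vibra_motor(effect, max_rpm=9000):
        # Crude eccentric-motor backend: map intensity to motor speed.
        for t, intensity in effect["envelope"]:
            print(f"t={t:.3f}s -> {int(intensity * max_rpm)} rpm")

    render_on_vibra_motor(EFFECT)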
