eNTERFACE08 logbook

2008 August 31

eNTERFACE08, summer workshop on multimodal interfaces, Orsay, France, from August 4th to August 29th, 2008

website: http://enterface08.limsi.fr/

our project: Project 4, Design and Usability Issues for Multimodal Cues in Interface Design/Virtual Environments, http://enterface08.limsi.fr/project/4

Catherine Guastavino
Emma Murphy
Charles Verron
Camille Moussette

Week 1

Monday

Late afternoon introduction. Presentations of the workshop and the projects. Room assignment. We are requesting a new office and testing space since we need a silent environment for our experiment. It seems to be a challenge to find such a place. 

Tuesday

Trying to get organized in our space. We get a PC from the staff. We install and try to configure McGill’s Ripples application. We are not really successful at running it. We will try to email the author back in Montreal. I also start exploring H3D, an open-source haptics package from Sweden that I discovered at a conference in Madrid last June. It looks simple and easy.

Struggling a lot to get my RER pass. After many trials and a lot of frustration, I finally have it :-)

Wednesday

Our PC is giving us errors on the FireWire bus. Charles’ old code seems to be working on his machine. We are discussing how to build our experiment: McGill’s Ripples app, Charles’ code, or H3D. H3D looks promising, but we have problems installing it properly on the PC, and we couldn’t properly install Ripples and its required libraries. We decide to split the haptic and audio parts to ease development. The haptic part will send values via OSC to MaxMSP or PureData for sound feedback.

Afternoon presentation by Lotfi Zadeh titled “Toward Human Level Machine Intelligence—Is it Achievable? The Need for a Paradigm Shift”. Very interesting stuff about fuzzy logic and trapezoidal distributions. We need to focus more on computing with words if we want to get past current AI’s problems. Still not so clear how it translates to machine intelligence.

Thursday

 

Friday

Struggling with PCs and H3D to get everything working properly. We tried various machines and found out that we need OpenHaptics from Sensable installed to have a working Phantom. We are back to our main test machine.

 

Week 2

Monday

I’m the only one of the team at LIMSI. The target is now randomly located in X and Y on one of the 5 planes. I can trigger an event and a timestamp once the target is touched by the user. Ordered the Novint Falcon from Sweden. It should arrive early next week. Spent some hours learning Python during the afternoon.
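In rough terms, the randomization and timing logic might look like this minimal Python sketch. The plane depths, workspace bounds, and function names are illustrative assumptions, not the actual H3D script:

```python
import random
import time

# Assumed depths of the 5 planes (metres); illustrative values only.
PLANES_Z = [-0.10, -0.05, 0.0, 0.05, 0.10]

def place_target():
    """Pick a random (x, y) target position on a randomly chosen plane."""
    x = random.uniform(-0.1, 0.1)
    y = random.uniform(-0.1, 0.1)
    z = random.choice(PLANES_Z)
    return (x, y, z)

def on_target_touched(start_time):
    """Return the elapsed time once the stylus touches the target."""
    return time.time() - start_time
```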

 

Tuesday

The audio is now working with Charles’ MaxMSP patch. I send position values via OSC and the audio is spatialized with SPAT. The coordinate systems are quite different on the haptic device and in SPAT, so we have to figure out the proper conversion. The final working solution uses spherical coordinates. We introduce a scale factor to augment the spatialization of the sound. It works quite well in our initial tests. Horizontal-planes configuration only for now.
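The Cartesian-to-spherical conversion with a scale factor could be sketched like this in Python. The axis convention and the scale value are assumptions; the exact mapping in the patch may differ:

```python
import math

def to_spherical(x, y, z, scale=2.0):
    """Convert a Cartesian device position into the (azimuth, elevation,
    radius) triplet sent to SPAT. The scale factor exaggerates the
    spatialization; 2.0 is an illustrative value, not the experiment's."""
    norm = math.sqrt(x * x + y * y + z * z)
    if norm == 0.0:
        return 0.0, 0.0, 0.0
    azimuth = math.degrees(math.atan2(x, -z))    # 0 deg straight ahead (assumed convention)
    elevation = math.degrees(math.asin(y / norm))
    return azimuth, elevation, norm * scale
```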

Great organ concert in the evening. 

 

Wednesday

We are having an issue with running the test with horizontal planes. When we apply a rotational transform, our absolute coordinates are not good in MaxMSP: we are querying the device’s absolute position, but our virtual world and the device aren’t calibrated. We tried specifying a calibration matrix, but without success. I implemented a trigger to only activate sound and haptic feedback after a button press. It’s quite easy to do within Python.
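The button-press trigger amounts to a small gate around the feedback path; a minimal sketch, with class and method names of my own rather than the actual script:

```python
class FeedbackGate:
    """Mute audio and haptic feedback until the stylus button is pressed."""

    def __init__(self):
        self.active = False

    def on_button_press(self):
        """Arm the trial: from now on, feedback values pass through."""
        self.active = True

    def feedback(self, value):
        """Forward a feedback value only once the trial has been armed."""
        return value if self.active else None
```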

Horizontal planes configuration is still not fully working.

Nice dinner with Catherine and the others near Denfert-Rochereau station.

 

Thursday

A very good day. The setup/experiment is fully working. We have a master Python script that takes care of randomizing and launching the runs for each user. Each of the four conditions corresponds to a set of files (an X3D scene and its associated Python script). All data is saved in Excel files inside a folder on the machine. Python is great for gluing all the pieces together nicely. The local script computes the initial target distance and the total traveled distance. It was super easy to compute and dump to a txt file, and it will make it a lot easier to analyze our data in the end. A test user ran 40 trials in about 20 minutes. The application crashes from time to time, but it’s manageable so far.
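The two distance measures can be sketched as follows; this is a hedged Python example, not the actual trial script, which works on the sampled device positions:

```python
import math

def path_metrics(samples, target):
    """Compute the two per-trial measures: the straight-line distance from
    the starting position to the target, and the total distance actually
    traveled along the sampled trajectory."""
    def dist(a, b):
        return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))
    initial = dist(samples[0], target)
    traveled = sum(dist(a, b) for a, b in zip(samples, samples[1:]))
    return initial, traveled
```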

Video of the experiment.

 

Friday

Day off. No work.

 

Week 3

Monday

We are moving the experiment into the acoustic booth to start our testing. It is still all running fine. Great! We are starting our week-long testing. Each participant is introduced to the system with 8 trial runs, then invited to complete 44 runs.

The new Novint Falcon haptic controller arrived today. I tried running it on Windows (virtualized on my MacBook Pro) without success. The driver and demos installed fine on a Windows machine. The working boundaries seem to be a bit smaller compared to the Phantom Omni. On the other hand, the forces are much stronger and the controller feels less “mushy” on hard collisions. I guess the mechanical design of the three arms helps a lot. Our files/trials don’t work out of the box with the Falcon; I’ll investigate more tomorrow. We had four users today, for a total of six so far.

 

Tuesday

We ran seven users today! We are totaling 13 participants now. Emma presented our project at the mid-term presentations session. Many of the teams have really interesting projects, but they are so ambitious. One team is doing sign language tracking and recognition; they have an avatar that answers back in Czech sign language. Quite impressive. Ours is pretty limited, but we are exactly on schedule, with a full week dedicated to testing. I can’t fiddle with the Falcon as the PC is acting up, rebooting continuously. We are not very lucky with PC machines at this workshop: one out of five is working adequately.

 

Wednesday

More testing with seven participants today. We are now totaling twenty participants. We expected testing would take a full week, but it only took three days. While Emma is running the trials with the participants, I’m working on a viewer to visualize our data and users’ trajectories. I decided to use Processing, and the results are great considering I only spent 2-3 hours on it. At first I had a scripted POV (point of view). In the evening I found details on how to use the mouse to change the POV on the fly, like all good 3D viewers. It works really well. It’s time to dig deeper and make it fancier and more useful (like reading and/or converting data directly from the Excel files).

 

Thursday

Final day of testing with the last three participants (for a total of 23). We took the setup down and out of the acoustic room around lunch. The weekly conference was titled “Fusion and fission in multimodal interfaces: using ontology-based semantics for representing semiotic modes”. John Bateman from Bremen University talked about how each modality has its own unique qualities and possibilities, and how moving from one to another, or combining them, should be better coordinated (intelligent multimodality). This should help minimize the ‘semantic gap’ that appears between low-level signals and intended interpretation. Check the slides for more.

 

Testing notes: 

Session with CM003: the participant is not aware of, or not exploring extensively, the y axis. In the vertical audio configuration, she would associate the plane sound with the location of the stylus. When navigating around and leaving the target plane, she would automatically go back to the position where the sound was first heard or found. She wouldn’t explore the area very much this way.

One crashed trial, two closed after timeout.

 

Friday

Day off. No work today.

 

Week 4

Monday

Last week in Paris. I continued working on the viewer to visualize the data and trajectories of the trials. I built a Python script to convert the Excel files into text files I can read and manipulate more freely in Processing. The viewer is coming along nicely. I have some issues with the scaling and orientation of the data; it needs to be looked into tomorrow. Charles showed up during the afternoon and we started discussing items needed for the report and an eventual journal paper. We haven’t heard from Catherine, but she knows we got all the data (23 participants) last week. One PC was reformatted by the computer assistant and I was able to quickly set up and test the Falcon. Again, it seems more powerful (stronger, more rigid, capable of generating stronger feedback), but its working space, or active bounding box, is more limited than the Phantom Omni’s.
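The Excel-to-text conversion might look like the following sketch, assuming the trial data is first exported to CSV. The column names `x`, `y`, `z`, `t` are assumptions, not the actual file layout:

```python
import csv

def excel_csv_to_txt(csv_path, txt_path):
    """Convert a CSV export of the trial data into a plain text file with
    one 'x y z t' line per sample, which Processing can parse with
    loadStrings()/split(). Column names are hypothetical."""
    with open(csv_path, newline="") as src, open(txt_path, "w") as dst:
        for row in csv.DictReader(src):
            dst.write(f"{row['x']} {row['y']} {row['z']} {row['t']}\n")
```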

 

Tuesday

I continued working on the viewer, and the scale and orientation seem OK now. We are using many different referential systems, and it’s a bit confusing moving and interpreting data across them: relative 0 to 1 in the X3D scene file, non-calibrated absolute position on the device, and a spherical coordinate system for spatialization of the audio in MaxMSP + SPAT. The calibration is resolved with a factor-of-2 multiplier (see the H3D documentation for the Phantom device). I still have a mysterious half-pi value needed to fit the radius of the sphere to our experimental data. Anyway, the viewer is displaying data and trajectories in a coherent fashion.

During the afternoon, we recorded many videos of the experiment. We tried to capture various angles and the different hand grips used by the participants. We recorded a couple more trials with one participant. Hopefully we can synchronize the video and the viewer to have a really interesting analysis.

Late evening I added animation of the trajectory in the viewer. It was very simple to do and it adds a lot to the appreciation of the runs. 

 

Wednesday

Last day at the workshop. I improved the viewer with last-minute functions: loading a new file, changing the color, more position and status data showing on screen, etc. I’m pretty happy with it. It’s almost usable by my colleagues. There is only one glitch that we need to manually fix in the Excel files: the configuration info is missing (a hardcoded value is present in all the files). We can get this info from the summary files; it will only require a painful manual pass.

 

Thursday

Friday
