UIST 2010 was held in New York this year, and it was a really great experience for me. We had great keynote speakers and paper presentations. Also, this year's student contest had students working with some really cool LCD keyboards from Microsoft. They can easily turn into nice game controllers or assistive technology devices, as well as nice web interface controllers.
The keynote speakers of the conference were Marvin Minsky (The Interested Interface), Natalie Jeremijenko (Connected Environments), and Jaron Lanier (The Engineering of Personhood). It was really good to hear them, since their topics were almost completely different from the papers that were presented.
The rest of this post will be about some of the demos and papers that I really liked and found inspirational.
Imaginary Interfaces: Spatial Interaction with Empty Hands and without Visual Feedback
This is an interface where the user draws/writes in the air, using the thumb and index finger of one hand as the borders of an imaginary page. The drawings/writings are sent to the other user's screen, which is useful when you are describing something over the phone. The technology behind it is an IR camera on the user's chest that detects the user's hand gestures.
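What I found clever is that, with no screen, every stroke only makes sense relative to the L-shape of the framing hand. Here is a tiny sketch of that geometry as I understand it, assuming the camera reports 2D fingertip positions (the function and the numbers are mine, not the authors'):

```python
import numpy as np

def page_coords(fingertip, corner, thumb_tip, index_tip):
    """Express the drawing fingertip in the 'page' frame spanned by the
    framing hand: origin at the thumb/index corner, axes along the thumb
    and the index finger."""
    u = np.asarray(thumb_tip, float) - np.asarray(corner, float)
    v = np.asarray(index_tip, float) - np.asarray(corner, float)
    basis = np.column_stack([u, v])            # 2x2 matrix [u | v]
    rhs = np.asarray(fingertip, float) - np.asarray(corner, float)
    s, t = np.linalg.solve(basis, rhs)         # fingertip = corner + s*u + t*v
    return s, t

# A fingertip halfway along the thumb edge of the imaginary page
print(page_coords(fingertip=(5, 1), corner=(0, 0),
                  thumb_tip=(10, 0), index_tip=(0, 8)))   # (0.5, 0.125)
```

Because the coordinates are relative to the hand, the drawing survives even if the user's hands drift while they talk.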
Hands-On Math: A page-based multi-touch and pen desktop for technical work and problem solving
This paper was about an application that helps people learn math. The idea builds on the advantages of blackboards, like gathering around one to discuss a formula or to draw and write, and I think these elements are really helpful for learning. Hands-On Math is a touch-screen interface that also works with a pen, designed specifically as a math interface. Users can write formulas and scale them: by holding a formula with one finger and dragging with another, they can change its scale (see the sketch after the link below). I really like the new gestures they are using for interaction.
You can check out their PowerPoint here.
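The scaling gesture is basically standard two-point pinch math; here is a minimal version of what I imagine is happening (not their code):

```python
import math

def pinch_scale(anchor, start_touch, current_touch):
    """One finger holds the formula (anchor) while a second finger drags;
    the formula scales by the ratio of the current to the initial distance
    between the two touch points."""
    d0 = math.dist(anchor, start_touch)
    d1 = math.dist(anchor, current_touch)
    return d1 / d0 if d0 else 1.0

# Dragging the second finger twice as far from the anchor doubles the formula
print(pinch_scale(anchor=(0, 0), start_touch=(100, 0), current_touch=(200, 0)))  # 2.0
```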
Pen + Touch = New Tools
This is a paper from Microsoft Research. They are trying to bring pen input to the multi-touch screen experience. They don't see the pen as the basis of the interaction, but as an additional input that you can use together with your fingers. The pen can be used whenever it is necessary, such as while you are reading a book on your iPad: it is great for taking notes or underlining things with your pen.
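Their rule of thumb is that the pen writes, touch manipulates, and pen + touch together yields the new tools of the title. A toy dispatcher to show the division of labor (the mode names are mine, not the paper's):

```python
def route_input(pen_down, fingers_down):
    """The pen writes, touch manipulates, and using both at once
    triggers combined tools."""
    if pen_down and fingers_down:
        return "combined tool"   # e.g. a finger holds a page while the pen acts on it
    if pen_down:
        return "ink"             # write, annotate, underline
    if fingers_down:
        return "manipulate"      # pan, zoom, flip pages
    return "idle"

print(route_input(pen_down=True, fingers_down=False))  # ink
print(route_input(pen_down=True, fingers_down=True))   # combined tool
```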
TurKit: Human Computation Algorithms on Mechanical Turk
I think this is a really interesting idea. In this paper, they explain ways to use Mechanical Turk workers inside an algorithm to get the best results. They have a simple example with a handwritten note: in the first iteration no one can read it, but by letting each worker improve on the previous worker's guess, they get positive results within a few iterations.
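The core pattern is an improve-then-vote loop. A hedged sketch in Python (TurKit itself is JavaScript, and the two task functions below are fake stand-ins for real Mechanical Turk postings):

```python
import random

# Fake stand-ins for posting tasks to Mechanical Turk; in TurKit these
# would block until real workers answer.
def post_improve_task(image, current_guess):
    return current_guess + "?"                      # pretend a worker refined the text

def post_vote_task(image, old, new):
    return new if random.random() < 0.8 else old    # pretend workers usually agree

def iterative_transcription(image, rounds=3):
    """Each round, one worker edits the best transcription so far,
    then a small group votes on whether to keep the edit."""
    best = ""
    for _ in range(rounds):
        candidate = post_improve_task(image, best)
        votes = [post_vote_task(image, best, candidate) for _ in range(3)]
        if votes.count(candidate) > 1:              # majority of 3 keeps the edit
            best = candidate
    return best

print(iterative_transcription("blurry_note.png"))
```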
MAI Painting Brush: An Interactive Device that Realizes the Taste and Feeling of Real Painting
This is a painting device that lets users make virtual paintings in 3D space. The user wears virtual reality glasses and paints with a spatial brush. Also, the user can change the tip of the brush, which seems really helpful.
SqueezeBlock: Using Virtual Springs in Mobile Devices for Eyes-Free Interaction
This is an interface that gives the user tactile feedback by changing the strength of its virtual springs. They see this device as a phone interface, but I think there are other useful ways to use it (a quick sketch after the link below).
You can see their paper here.
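Since the spring is simulated by a motor, its stiffness is just a software parameter, and that is what makes it interesting. Hooke's law with a programmable k; the unread-messages mapping is my own example, not theirs:

```python
def virtual_spring_force(squeeze_mm, stiffness_n_per_mm, rest_mm=0.0):
    """Hooke's law F = k * (x - x0): the motor pushes back against the
    squeeze as if a physical spring were inside."""
    return stiffness_n_per_mm * (squeeze_mm - rest_mm)

# Hypothetical eyes-free mapping: more unread messages -> stiffer squeeze
for unread in (0, 5, 10):
    k = 0.2 + 0.05 * unread
    print(f"{unread} unread -> {virtual_spring_force(4.0, k):.1f} N at 4 mm")
```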
Gilded Gait: Reshaping the Urban Experience with Augmented Footsteps
This paper is about giving tactile feedback (vibration) to the user through their feet. Using an accelerometer, switches, etc., it lets the user feel what the surface is like. I think this sort of feedback would also be useful for people who have visual disabilities: you could easily implement GPS navigation on top of this and help them find their way.
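As I understand it, the pattern is simple: detect a footstep, then play a vibration pattern that matches the simulated surface. All the numbers below are my guesses, not values from the paper:

```python
# Hypothetical texture table: (frequency Hz, amplitude 0..1) per surface
TEXTURES = {
    "gravel": (250, 0.9),
    "carpet": (80, 0.3),
    "wood":   (150, 0.5),
}

def on_sample(accel_z_g, surface, threshold_g=1.8):
    """Fire a vibration burst on heel strike, detected as a spike in the
    insole accelerometer's vertical acceleration."""
    if accel_z_g > threshold_g:
        freq, amp = TEXTURES[surface]
        return f"vibrate {freq} Hz at {amp:.0%}"
    return "no step"

print(on_sample(2.1, "gravel"))   # vibrate 250 Hz at 90%
```

Swap the texture table for turn-by-turn cues (buzz left foot = turn left) and you basically have the GPS idea.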
Jogging over a Distance between Europe and Australia
This is a really cool idea for connecting people in different locations. People usually don't like running alone, so this application connects runners in real time over the phone. Depending on each user's speed, they hear each other from behind or in front through audio panning. One downside of this project, though: I really don't like talking while I am running, since it affects the runner's breathing.
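The panning is the part I would want to prototype: map the speed difference to a front/back position for the partner's voice. A minimal sketch (the 1 m/s full-gap constant is my assumption):

```python
def pan_partner_voice(my_speed, partner_speed, full_gap=1.0):
    """Return the partner's audio position from -1.0 (fully behind)
    to +1.0 (fully ahead), based on the speed difference in m/s."""
    gap = (partner_speed - my_speed) / full_gap
    return max(-1.0, min(1.0, gap))

print(pan_partner_voice(my_speed=3.0, partner_speed=3.5))   # 0.5: partner slightly ahead
```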
FootLoose: Exploration and implementation of practical accelerometer-enabled foot gestures
I think this one was one of my favorites, and the closest one to ITP. It is an iPhone app that uses the accelerometer as an input for controlling the phone. Let's say you are carrying lots of bags and cannot answer your phone: since you are already wearing your headset, after a few dance moves you can actually answer the call and talk to your friends. I feel like someone could make really nice iPhone games controlled by dance movements using this idea.
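The sensing part seems very doable: a foot tap shows up as a sharp spike in accelerometer magnitude. A rough sketch of how I would detect it (the threshold and timing values are guesses, not from the paper):

```python
import math

def count_taps(samples, threshold_g=2.5, refractory=10):
    """Count sharp spikes in accelerometer magnitude as foot taps.
    samples are (x, y, z) readings in g; the refractory window keeps
    one stomp from registering twice."""
    taps, cooldown = 0, 0
    for x, y, z in samples:
        if cooldown:
            cooldown -= 1
        elif math.sqrt(x*x + y*y + z*z) > threshold_g:
            taps += 1
            cooldown = refractory
    return taps

# Two spikes separated by quiet samples -> answer the call
stream = [(0, 0, 1)] * 15 + [(0, 0, 3)] + [(0, 0, 1)] * 15 + [(0, 0, 3)]
if count_taps(stream) >= 2:
    print("answer call")
```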
Multitoe: High-Precision Interaction with Back-Projected Floors Based on High-Resolution Multi-Touch Input
This is a great idea, and it seems like they have put a lot of work into this project. I think it could be really fun and useful in many public spaces, like science museums.
Combining Multiple Depth Cameras and Projectors for Interactions On, Above and Between Surfaces
I think this is the most fascinating project that I saw at UIST. It is based on three depth cameras and three projectors in a room. By combining the three different camera angles, they are able to track all the users in the room, and by using the projectors they are able to turn the whole room into a computer. It makes it really easy to create interaction in 3D space; check out the video for the rest.
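The part that impressed me most is the calibration: each depth camera's points get transformed into one shared room coordinate system, so the three partial views merge into a single 3D picture of everyone in the space. A toy version with made-up calibration values (R and t are hypothetical, not their numbers):

```python
import numpy as np

def to_world(points_cam, R, t):
    """Transform one depth camera's 3D points into the shared room frame,
    given that camera's calibrated rotation R (3x3) and translation t (3,)."""
    return points_cam @ R.T + t

# Made-up extrinsics: a camera facing back across the room, mounted 3 m away
R = np.array([[-1, 0, 0], [0, 1, 0], [0, 0, -1]], float)
t = np.array([0.0, 2.5, 3.0])

cloud_cam = np.array([[0.1, -0.2, 1.5]])     # a point this camera sees
print(to_world(cloud_cam, R, t))             # the same point in room coordinates

# Merging all three cameras is then just stacking their transformed clouds:
# world = np.vstack([to_world(c, Ri, ti) for c, (Ri, ti) in zip(clouds, calib)])
```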
MudPad is a nice user interface that gives users tactile feedback using localized vibration. I can see this being used in many assistive technology projects or games. For more, click here.
Pinstripe: Eyes-free Continuous Input Anywhere on Interactive Clothing
Unfortunately, I couldn't see a live demo of this project, but the paper sounds really cool. It is a way to detect where you are pinching on a garment, and rolling the pinched fabric between your fingers gives continuous, eyes-free input. They claim the sensing area can be made as large as they need. For more information, here is a link to the paper.
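As far as I can tell, conductive threads are sewn into the fabric, pinching folds the fabric so that some threads touch, and rolling the fold shifts which ones touch. A guess at how the readout could work (the thread spacing is my assumption):

```python
def roll_delta(prev_contact, curr_contact, spacing_mm=2.0):
    """Relative 'dial' input from a pinch: the shift of the centroid of
    the shorted threads, scaled by the thread spacing, is the control
    delta (e.g. for volume)."""
    if not prev_contact or not curr_contact:
        return 0.0
    prev_c = sum(prev_contact) / len(prev_contact)
    curr_c = sum(curr_contact) / len(curr_contact)
    return (curr_c - prev_c) * spacing_mm

print(roll_delta({4, 5, 6}, {6, 7, 8}))   # rolled two threads over -> +4.0 mm
```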