from the unintended-consequences-abound dept.
kkleiner writes "Skinput is a system from Carnegie Mellon's Chris Harrison that monitors acoustic signals on your arm to translate gestures and taps into input commands. Just by touching different points on your arm, hand, or fingers, you can tell your portable device to change volume, answer a call, or turn itself off. Even better, Harrison can couple Skinput with a pico projector so that you can see a graphical interface on your arm and use the acoustic signals to control it. The project is set to be presented at this year's SIGCHI conference in April, but you can check it out now in several video demonstrations."
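To get a feel for the idea, here is a minimal sketch of tap classification: each tap location on the skin produces a distinct acoustic signature, and the system matches an incoming signature to the closest known one. The feature vectors, command names, and nearest-centroid matching below are illustrative assumptions, not Harrison's actual pipeline.

```python
import math

# Hypothetical per-location acoustic "profiles" (e.g. band-energy features)
# that a Skinput-style system might learn from calibration taps.
PROFILES = {
    "volume_up":   [0.9, 0.2, 0.1],
    "answer_call": [0.3, 0.8, 0.2],
    "power_off":   [0.1, 0.3, 0.9],
}

def classify_tap(features):
    """Return the command whose stored profile is nearest (Euclidean)
    to the observed acoustic feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(PROFILES, key=lambda cmd: dist(PROFILES[cmd], features))

print(classify_tap([0.85, 0.25, 0.15]))  # nearest profile is "volume_up"
```

The real system reportedly trains a classifier on features from an armband of acoustic sensors; this toy version just shows why distinct tap locations are separable at all.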