Future of computing devices where anything can become a touchscreen
This evening in the car, on our way back from a friend’s place, my daughter and I were discussing how the coming years will bring new ways of using technology. I want her to be prepared for a world with no physical books, only handheld computers to interact with, and probably no physical keyboards either.
I was telling her about the experiments where a projected image of a keyboard stands in for the real thing: you type on the image, and each touch registers just as a keystroke on a normal keyboard would.
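Conceptually, the software side of such a projected keyboard is a lookup: a sensor reports where your fingertip landed, and the system maps that point to whichever key is drawn there. Here is a minimal sketch of that mapping in Python; the layout, key sizes, and coordinates are my own invented assumptions, not any particular product’s design:

```python
# Hypothetical sketch: turning a fingertip position detected on a
# projected keyboard image into a keystroke. The layout and key sizes
# are invented purely for illustration.

KEY_W, KEY_H = 40, 40  # assumed size of each projected key, in pixels

# One projected row: key label -> top-left corner (x, y) of its rectangle
LAYOUT = {label: (i * KEY_W, 0) for i, label in enumerate("QWERTYUIOP")}

def key_at(x, y):
    """Return the key whose projected rectangle contains (x, y), if any."""
    for label, (kx, ky) in LAYOUT.items():
        if kx <= x < kx + KEY_W and ky <= y < ky + KEY_H:
            return label
    return None  # the finger landed between keys or off the keyboard

# A camera pipeline would call this once per detected tap:
print(key_at(85, 12))  # -> 'E' (85 falls in the third 40-pixel column)
```

Everything else, like spotting the fingertip and deciding that a tap actually happened, is the hard computer-vision part; the point is that once a touch point exists, a projected keyboard behaves like any other keyboard.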
Well, what we seem to be looking at is any surface becoming a touchscreen!
While surfing, I came across some very interesting work being done at Microsoft to move in that direction with ordinary smartphones. Windows 8 may well turn out to be the bridge between older versions of the Windows OS and the new world of operating systems. Here is a description of the work:
Microsoft Research Redmond researchers Hrvoje Benko and Scott Saponas have been investigating the use of touch interaction in computing devices since the mid-’00s. Now, two sharply different yet related projects demonstrate novel approaches to the world of touch and gestures.
Wearable Multitouch Interaction gives users the ability to make an entire wall a touch surface, while PocketTouch enables users to interact with smartphones inside a pocket or purse, a small surface area for touch.
The attached video showcases the technology Microsoft is using. MNN says the following about it:
The technology, which is primarily composed of a shoulder-mounted depth camera and a Pico-projector, was developed by researchers at Microsoft and is a vast improvement over previous prototypes that could only work on skin. A user study of the technology showed that dragging performance and touch accuracy approached the sensitivity of conventional touchscreens. The device also allows for user flexibility far beyond the capability of your phone or tablet. For instance, users can choose the position and size of the projected interface, meaning that they can adjust the size of the screen to better suit their needs and eyesight. OmniTouch is also capable of intelligently choosing the best display setting itself, so the user doesn’t have to readjust with each new surface.
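Reading between the lines, the touch test itself is straightforward to state: the depth camera knows how far away the surface is and how far away your fingertip is, and a tap is simply the moment those two distances nearly coincide. Below is a toy sketch of that idea in Python; the threshold, frame sizes, and numbers are my own assumptions, not taken from OmniTouch:

```python
import numpy as np

# Hypothetical sketch of depth-based touch detection: a fingertip counts
# as "touching" when its depth is within a small threshold of the surface
# behind it. All constants here are assumptions, not OmniTouch's values.

TOUCH_THRESHOLD_MM = 10  # how close the finger must get to the surface

def is_touching(depth_frame, background, fingertip):
    """depth_frame, background: 2-D arrays of distances in mm;
    fingertip: (row, col) of a tracked fingertip in the frame."""
    r, c = fingertip
    gap = background[r, c] - depth_frame[r, c]  # finger sits in front
    return 0 <= gap <= TOUCH_THRESHOLD_MM

# Toy frames: a flat surface 800 mm away; a finger hovering, then touching.
background = np.full((240, 320), 800.0)
hovering = background.copy(); hovering[120, 160] = 760.0   # 40 mm above
touching = background.copy(); touching[120, 160] = 795.0   # 5 mm above

print(is_touching(hovering, background, (120, 160)))   # False
print(is_touching(touching, background, (120, 160)))   # True
```

The “intelligently choosing the best display setting” bit presumably comes from analysing that same depth map to find a flat, suitably sized patch to project onto.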
In the coming years, maybe our smartphone and computer, all rolled into one, will be a pin or a pen clipped to our shirt pocket, projecting the touchscreen and capturing whatever we want captured. All we would need is a hand to project on (even a leg or a tree would do!) and a hand to type on it with.