Researchers at the Hasso-Plattner Institute in Germany have demoed a prototype concept called the Imaginary iPhone. Yes, you heard that right. These researchers have been working on a system that could revolutionize the way we use our iPhones. The idea behind it is simple: to perform actions on your phone, such as taking a call or simple navigation gestures, just by tapping your palm, without ever taking the phone out of your pocket. Your palm serves as the imaginary screen for interacting with the device. Now how cool is that?
Research Before The Concept Was Put To The Test:
In a study that has been submitted to the UIST conference in October, the researchers report that a majority of iPhone users can accurately recall the positions of about two-thirds of their iPhone apps on a blank phone, and with similar accuracy on their palm, without even looking. The positions of more frequently used apps were recalled with up to 80 percent accuracy. Therefore, they argue, there's a good case for building a system that lets the human palm emulate the touchscreen and eliminates the need to hold the device. This relies entirely on the muscle memory that smartphone users build up over time.
How Does It Actually Work?
This unique concept is demoed with the iPhone as the example smartphone. Just by moving a finger on your palm, you can control an iPhone that stays inside your pocket. The Imaginary iPhone determines which app a person wants to use by matching his or her finger position to the position of the app on the screen. It works with the help of a depth-sensitive camera that picks up tapping and sliding interactions on the palm. The camera is mounted on the user's head to capture the movements the user makes on the palm. Special computer software running in the background then takes the video from the camera sensors and runs it through an interaction analyzer to interpret what the user's action would actually be. It correlates the finger gestures with the positions of the icons on the user's iPhone screen. Once analyzed, the instructions are relayed back to the iPhone over a wireless radio, and the corresponding gestures are executed on the actual iPhone itself.
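To make the icon-matching step concrete, here's a minimal sketch of how an analyzer might snap a finger position on the palm to a home-screen icon. The grid size, app layout, and function names are all hypothetical, for illustration only; the actual research software is not public.

```python
# Hypothetical 4x4 home-screen layout (not the researchers' actual data).
GRID_COLS, GRID_ROWS = 4, 4
APPS = [
    ["Messages", "Calendar", "Photos", "Camera"],
    ["Weather", "Clock", "Maps", "Notes"],
    ["Stocks", "Settings", "iTunes", "App Store"],
    ["Phone", "Mail", "Safari", "Music"],
]

def palm_tap_to_app(x, y):
    """Map a normalized palm tap (0.0-1.0 in each axis, origin at
    top-left) to the app icon occupying that grid cell."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return APPS[row][col]

# A tap near the top-left corner of the palm selects the first icon;
# one near the bottom-right selects the last.
print(palm_tap_to_app(0.1, 0.1))    # → Messages
print(palm_tap_to_app(0.95, 0.95))  # → Music
```

The same normalized coordinates could just as easily drive slide gestures, which is why the prototype can forward both taps and swipes to the phone.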
Using a depth camera isn't anything new; Microsoft uses similar cameras in the Kinect for the Xbox. The cameras used in the prototype research are intelligent enough to subtract the background and track only the finger gestures. The current limitations are that, for now, they work best only in well-lit conditions like direct sunlight, and they need to sit on a tripod because of their bulk.
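The background subtraction mentioned above is straightforward with depth data: anything farther from the camera than some cutoff can be discarded. Here's a toy sketch with a synthetic depth frame and an assumed cutoff distance; the real system's values and method are not published in this article.

```python
# Hypothetical cutoff: treat anything beyond ~60 cm as background.
NEAR_CUTOFF_MM = 600

def foreground_mask(depth_frame):
    """Return a binary mask marking pixels closer than the cutoff,
    i.e. the hand/finger in front of a distant background."""
    return [[1 if d < NEAR_CUTOFF_MM else 0 for d in row]
            for row in depth_frame]

# Synthetic 3x4 depth frame in millimetres: a "finger" at ~400 mm
# against a wall at ~2000 mm.
frame = [
    [2000, 2000, 2000, 2000],
    [2000,  400,  410, 2000],
    [2000,  405, 2000, 2000],
]
print(foreground_mask(frame))
# → [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 0, 0]]
```

A depth threshold like this is why depth cameras can isolate the hand so cleanly compared with ordinary RGB cameras, which have to distinguish skin from background by color and lighting.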
At the moment, the prototype still involves plenty of heavy equipment, but the researchers hope to eventually incorporate a smaller camera that users could wear more comfortably. The imaginary phone prototype serves as a shortcut that frees users from the necessity to retrieve the actual physical device, says Patrick Baudisch, who headed this research project together with institute students Sean Gustafson and Christian Holz. [via Engadget, TR]