In the next year or so, hi-res projectors will break the $500 barrier and 50" LCDs will drop below $1500. This raises the question...what will we do with them?
Last week, John Underkoffler, the big brain behind the Minority Report User Interface, dropped a demo exploring key questions in this space:
1. How can users interact efficiently with large displays?
2. How can gestures work across multiple table and wall displays?
3. How can users effectively work with z-depth?
Outside of movies, real people perform tasks on computers that require finer manipulation than Underkoffler's UI allows (modifying documents, spreadsheets, websites, videos, etc.).
While his demo is a partial solution at best, combining his insights with an already ubiquitous device has real potential. Replace the black gloves with an iPhone.
This would enable "Minority Report"-style 3D gestures for gross manipulations...and a touch interface for precise tasks. (The iPhone is perfect for this...it's a ubiquitous, battery-powered, hand-sized, wireless peripheral with an accelerometer, a camera, and a small multitouch display.)
Check out this tantalizing demo from MSG at Aachen University:
Underkoffler's ideas could enrich MSG's framework if it were expanded to leverage other hardware features of the iPhone:
- The accelerometer could sense 3D gestures, similar to those seen in Underkoffler's video.
- Rocking the rounded back of the iPhone on a table could create a mouse-like interaction, controlling x/y position, orientation, and velocity.
- The primary display could be chosen by pointing the iPhone toward it (the built-in camera would identify/confirm the position of the screens).
- The touchscreen could sense fine manipulations and provide contextual options and feedback.
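To make the accelerometer idea a little more concrete, here's a minimal sketch (in Python; every name is hypothetical, not a real API) of how a stream of raw accelerometer samples might be classified into gross "Minority Report"-style gestures, leaving fine manipulation to the touchscreen:

```python
def classify_swipe(samples, threshold=1.5):
    """Classify a gross gesture from raw accelerometer samples.

    samples: list of (ax, ay, az) tuples in g-units.
    Returns 'swipe_right', 'swipe_left', 'swipe_up', 'swipe_down',
    'push', 'pull', or None if no reading exceeds the threshold.
    """
    # Find the single strongest reading across all samples and axes.
    peak_axis, peak_value = None, 0.0
    for ax, ay, az in samples:
        for axis, value in (("x", ax), ("y", ay), ("z", az)):
            if abs(value) > abs(peak_value):
                peak_axis, peak_value = axis, value

    if abs(peak_value) < threshold:
        return None  # too gentle to count as a deliberate gesture

    # Map the dominant axis and its sign to a named gesture.
    names = {
        ("x", True): "swipe_right", ("x", False): "swipe_left",
        ("y", True): "swipe_up",    ("y", False): "swipe_down",
        ("z", True): "push",        ("z", False): "pull",
    }
    return names[(peak_axis, peak_value > 0)]
```

A real version would need filtering, debouncing, and gravity compensation, but even this crude dominant-axis approach shows how cheaply gross gestures could be separated from the precise work the touchscreen handles.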
What would you do with a bunch of really big TVs? ...and if you're interested in mocking up a demo like this, drop me a line :~} Jonathan