Tuesday, December 16, 2008

Hi all. Welcome to the Multitouch Book Club.


Thanks for coming out to comment.

A special thanks to my friends, colleagues, and fellow bloggers who have helped get the word out, and to Dan Saffer, who put together this amazing tome.

It’s so rich with content that I’m going to have to divide up the first 45 pages into several posts.

Each series of postings will cover ~10 pages of the book.
To make things easy to track, I’ll post an outline, like the one below before each new series of posts.

Outline of Pages 1-11

1. Multitouch Adoption Is Happening at an Incredible Speed, pp. 1-2
----a. Multitouch has moved from Hollywood fantasy to geometric
-------product growth in a six-year period

2. Types of Gestures, pp. 3-7
----a. Two main categories of gesture-sensing hardware
----b. Two main interface paradigms
-------i. Direct manipulation
------ii. Indirect manipulation
----c. Embodied interactions

3. History of Gestures, pp. 7-11
----a. Started in 1982 at the University of Toronto
----b. First touchscreens appear in restaurant order systems
-------in the late 1980s
----c. First touchscreen PC, the HP-150
----d. A spate of "office of the future"
-------research projects in the early 1990s
----e. Kiosks move into public facilities and retail
-------in the late '90s and early '00s
----f. VR gloves play a niche role in the gaming industry
----g. 2006: Nintendo introduces the Wii
----h. 2007: Apple introduces the iPhone
----i. 2008
-------i. Other handset manufacturers introduce multitouch
------ii. Jeff Han's hardware is featured on CNN
-----iii. Microsoft launches Surface

1 comment:

  1. Overall comment:

    This book is what I have been waiting for for a long time. Even from the first chapter alone, it has really helped me organize my thinking about gestural interfaces. I have two wishes for the first chapter.
    1) An explicit classification of gestures: In my opinion, types of gestures may include touchscreen gestures, body-motion gestures, 3D hand/finger gestures, pen- (tool-?) based gestures, etc.
    2) The relationship to other HCI paradigms: There have been many HCI paradigms, such as ubiquitous computing, pervasive computing, organic user interfaces, and so forth. I think the common factor among those paradigms (including gestural interfaces) is the maximal use of sensor technologies, and gestural interfaces are the first of this kind to come into real life while the others still live in labs. I think the position of gestural interfaces in the HCI field could be shown clearly by describing this relationship.

    Detailed comments:

    p. xiv: His motivation for writing the book is exactly the same as the reason I started my blog. I'm very happy to have the first book that tries to collect and organize the various aspects of gestural interaction.

    p. 15: Let me introduce my definitions of touch events. The following definitions have been useful to me when implementing touch interfaces.

    - Touch Event: An event that is produced during or after a user performs a touch gesture; an abbreviation of "touch gesture event."
    - Event: A message notifying that the internal state of a system has changed.
    - Touch Gesture: A gesture that a user performs on a touchscreen or an interactive surface.
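    As a minimal sketch of these definitions (the class names, fields, and phases are my own, not from the book), a touch event might be modeled like this:

    ```python
    from dataclasses import dataclass
    from enum import Enum, auto


    class TouchPhase(Enum):
        """Lifecycle of one touch contact on the surface (my own breakdown)."""
        DOWN = auto()  # finger makes contact
        MOVE = auto()  # finger drags while in contact
        UP = auto()    # finger lifts off


    @dataclass
    class TouchEvent:
        """Message notifying listeners that a touch gesture changed system state."""
        touch_id: int      # distinguishes simultaneous fingers (multitouch)
        phase: TouchPhase
        x: float           # surface coordinates, in pixels
        y: float
        timestamp_ms: int  # when the hardware sampled the contact


    # A one-finger tap decomposed into its constituent touch events:
    tap = [
        TouchEvent(touch_id=0, phase=TouchPhase.DOWN, x=120.0, y=45.0, timestamp_ms=0),
        TouchEvent(touch_id=0, phase=TouchPhase.UP, x=120.0, y=45.0, timestamp_ms=80),
    ]
    ```

    A gesture recognizer then consumes streams of these events, grouped by touch_id, and decides which touch gesture they form.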

    p. 20: It is said that 100 ms or less is ideal for instantaneous responses. I think it would be better to replace 100 ms with 10 ms. When I used touch interfaces running on 30-40 Hz (25-33 ms) systems, I ran into latency problems. From my experience, I prefer a 100 Hz (10 ms) system; 100 ms seems too slow for most gestural interaction systems. When I analyzed handwriting motions, their peak frequency was around 20-30 Hz, which is another reason I usually stick to a 100 Hz sampling rate.
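    The arithmetic behind these numbers can be checked directly (the helper names are mine; the 2x sampling rule is the standard Nyquist criterion):

    ```python
    def sample_period_ms(rate_hz: float) -> float:
        """Period between consecutive samples, in milliseconds."""
        return 1000.0 / rate_hz


    def min_rate_for_motion_hz(peak_motion_hz: float) -> float:
        """Nyquist criterion: sample at least twice the highest motion frequency."""
        return 2.0 * peak_motion_hz


    # The 30-40 Hz systems mentioned above sample only every 25-33 ms:
    assert round(sample_period_ms(30)) == 33
    assert round(sample_period_ms(40)) == 25

    # A 100 Hz system samples every 10 ms:
    assert sample_period_ms(100) == 10.0

    # Handwriting motion peaking at 20-30 Hz needs at least 40-60 Hz sampling,
    # which 100 Hz comfortably satisfies:
    assert min_rate_for_motion_hz(30) <= 100
    ```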

    p. 25: For the paragraph "Number of participants," I think MERL's DiamondTouch would be a better example of detecting multiple users. DiamondTouch's sensor system produces a different electrical signal for each user, while Surface's does not.
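    As a purely hypothetical sketch of why this matters (the channel numbers and names are invented, not DiamondTouch's actual API): when each user's touches arrive on a distinct signal channel, attributing a touch to a participant is a direct lookup, with no guessing from touch position.

    ```python
    # Hypothetical seating assignment: each receiver channel belongs to one user.
    CHANNEL_TO_USER = {0: "Alice", 1: "Bob"}


    def attribute_touch(channel: int) -> str:
        """Map a per-user signal channel to the participant who touched."""
        return CHANNEL_TO_USER.get(channel, "unknown")
    ```

    On hardware without per-user signals, this function cannot exist; the system would have to infer ownership indirectly, e.g. from touch location or orientation.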