14 November 2010

Manipulations vs Gestures: Is there any future for manipulations?


I frequently hear questions about multitouch systems and direct manipulation: translation / rotation / scaling of objects is cool, but what else? Should we consider abstract gestures as the future of multitouch? Does that mean we'll have to learn these gestures the way CLI users must learn abstract commands?

Manipulations & Gestures in OCGM model
For this post, I will use the definitions from the OCGM model by R. George & J. Blake:
  • Manipulations are direct, continuous actions with immediate feedback while the Manipulation is occurring.
  • Gestures represent an indirect, discrete, symbolic action.
Manipulations can be seen as a metaphor for the manipulation of physical objects, and Gestures as a metaphor for non-verbal communication. With Gestures, the important part is the actual movement of the Gesture. With Manipulations, the movement itself is not important, since you could take many paths to get to the same end result.
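
To make the distinction more concrete, here is a minimal TypeScript sketch (the types, class names, and thresholds are hypothetical illustrations, not part of the OCGM model) contrasting a Manipulation, which updates its target continuously as input arrives, with a Gesture, which is recognized as a discrete symbol once the movement is complete:

// Hypothetical input sample: a 2D contact point at a given time.
interface TouchSample { x: number; y: number; t: number; }

// Manipulation: direct and continuous. The object is updated at every sample,
// giving immediate feedback; only the current position matters, not the path.
class DragManipulation {
  constructor(private obj: { x: number; y: number }) {}
  update(sample: TouchSample): void {
    this.obj.x = sample.x;
    this.obj.y = sample.y;
  }
}

// Gesture: indirect and discrete. The whole trajectory is buffered and then
// mapped to a symbolic command; here the shape of the movement is what counts.
type Command = "next-page" | "previous-page" | "none";

function recognizeSwipe(samples: TouchSample[]): Command {
  if (samples.length < 2) return "none";
  const dx = samples[samples.length - 1].x - samples[0].x;
  if (dx > 100) return "next-page";       // arbitrary threshold
  if (dx < -100) return "previous-page";
  return "none";
}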

Some typical factors involved in manipulations
The physical factor
If we consider manipulation as a metaphor for the manipulation of physical objects, we can already say that the concept of manipulation is strongly dependent on physical constraints such as the number of spatial dimensions or the number of degrees of freedom of the effector and of the objects being handled. Thus, a system like Kinect, which tracks human movement in 3D space, offers a larger set of manipulations than multitouch surfaces, which implement 2D interfaces and only detect contact points between the hands and the interactive surface.
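
To illustrate this dependence on degrees of freedom, here is a small TypeScript sketch (hypothetical types, not an actual Kinect or multitouch API) contrasting the 2D contact points a surface delivers with the richer 3D joint data a skeletal tracker can provide:

// A multitouch surface only reports 2D contact points on its plane.
interface ContactPoint2D { id: number; x: number; y: number; }

// A skeletal tracker such as Kinect exposes 3D joint positions, adding depth
// and whole-body posture as extra degrees of freedom for manipulations.
interface Joint3D { name: "hand" | "elbow" | "shoulder"; x: number; y: number; z: number; }

interface SurfaceFrame { contacts: ContactPoint2D[]; }
interface SkeletonFrame { joints: Joint3D[]; }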

The cultural factor
Manipulations are fairly insensitive to cultural factors compared to gestures. Indeed, the meaning of a gesture in one country may be totally different from its meaning in another country, or it may have no meaning at all. Manipulation techniques may also vary depending on factors such as the environment or the available resources, but the physical factor usually dominates the definition of a manipulation.

The skill factor
The concepts of learning and expertise also apply to manipulations. There are many ways to perform an operation, a whole range of motions, from the beginner's hesitant movement to the expert's polished manipulation. Gradually, as the movement is repeated, the manipulation is refined.

Is there any future for direct manipulation in surface computing?
Missing parameters
Current multitouch devices (software & hardware) are far from taking advantage of all the physical parameters that come into play during a manipulation. For example, very few devices use pressure, even though it is an important factor that opens up new possibilities.
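
As a sketch of what such a parameter could enable (the contact type and the pressure-to-scale mapping below are purely hypothetical), pressure could for instance be used as an extra axis of control during a manipulation:

// Hypothetical contact point enriched with a normalized pressure value (0..1).
interface PressurePoint { x: number; y: number; pressure: number; }

// Example use: the touch position drags the object, while pressure "pushes"
// it deeper into the scene by reducing its scale.
function applyPressureManipulation(
  obj: { x: number; y: number; scale: number },
  p: PressurePoint
): void {
  obj.x = p.x;
  obj.y = p.y;
  obj.scale = 1 - 0.5 * p.pressure; // firmer press => object appears further away
}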



Note that the range associated with the manual skill factor could potentially grow as the number of physical parameters and their resolution increase. Maybe in the future, being an expert in computing will mean having both intellectual and manual skills.

Symbolic manipulations
We can also consider other types of manipulations, which depend on the number of touch points involved. This is the case of the 10/GUI Con10uum model (an implementation by M. Dislaire can be found here).


[Video: Con10uum and Con10uum side by side, from Improveeze on Vimeo]

However, this type of solution raises several questions from a NUI perspective.

From our experience, we associate the manipulation of objects with the ergotic function of the hand (its ability to transform objects). Experience has also taught us that the ergotic function is strongly linked to physical laws involving forces (gravity, contact forces, frictional forces, ...), and we have learned to take the effects of these forces into account since our earliest years.

But when we observe a system like 10/GUI Con10uum, the first thing we notice is that physical laws no longer seem to apply; they appear to be replaced by symbolic laws: a manipulation performed with 2 fingers leads to a different result from the "same" manipulation performed with 3 or 4 fingers. The semiotic function of the hand (usually associated with gestural communication) becomes dominant in the definition of these manipulations. In fact, within such a system, manipulations appear to have a hybrid status, halfway between the Manipulations and Gestures of the OCGM model. This is why I use the term symbolic manipulation.
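
A minimal sketch of such a symbolic rule set (the mapping below is only an illustration, not the actual 10/GUI Con10uum specification): the same physical drag is routed to a different target depending only on how many fingers perform it.

// The physical movement is identical; only the finger count changes its meaning.
type SymbolicTarget = "object" | "window" | "workspace";

function resolveTarget(fingerCount: number): SymbolicTarget {
  switch (fingerCount) {
    case 2: return "object";     // e.g. move the object under the fingers
    case 3: return "window";     // e.g. move the containing window
    default: return "workspace"; // e.g. 4 or more fingers act on the workspace
  }
}

// The same drag delta produces different results depending on the symbolic rule.
function onDrag(fingerCount: number, dx: number, dy: number): void {
  const target = resolveTarget(fingerCount);
  console.log(`applying drag (${dx}, ${dy}) to ${target}`);
}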

The success of this kind of system will depend on the definition of meaningful symbolic manipulations that are easy to learn and memorize.

The forgotten Tool
Let's take a moment to look at what our ancestors have built with the help of manipulations.
They've built things like this
Notre Dame de Paris
or like this
The Pyramid of Kheops
To be exact, an additional ingredient was needed to obtain these results: Tools.

If we consider the concept of manipulation as it appears in nature, we can distinguish three stages:
  • Direct manipulation of the target object (a skill shared by almost all animals)
  • Manipulation of a mediator object to interact with the target object (a skill shared by some animals, such as monkeys, …)
  • Manufacture and manipulation of specialized tools to interact with the target object (an activity highly developed by humans).
The manufacture and manipulation of tools is in fact a key feature of human activity ("Homo Faber" by H. Bergson, "Toolmaking Animal" by B. Franklin). Just think about how many times a day you use a tool to perform an action (writing, eating, ...).

Tools and Natural User Interfaces
It appears that the concept of tool should have a role to play in NUI. Indeed, if we consider the reification of objects and direct manipulation to be of prime importance for NUI, it is hard to imagine depriving the user of tools, which are a key feature of the interactions between humans and objects in daily life. We mustn't think of the tool as a generic concept ("my text editor is not a tool") but in a reified form: as a specialized object used in specific contexts, as a mediator object used to perform an action and obtain a precise result. For those interested in the subject, M. Beaudouin-Lafon's work on instrumental interaction and B. Buxton's work on See-Through Tools are a great source of inspiration.
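
To give an idea of what a reified tool could look like in code, here is a rough TypeScript sketch, loosely inspired by instrumental interaction (all names are hypothetical): the tool is a first-class mediator object that the user manipulates in order to transform a target object.

// Hypothetical domain object the tools act upon.
interface Shape { points: { x: number; y: number }[]; strokeColor: string; }

// A reified tool: a specialized mediator object with its own identity,
// used in a specific context to obtain a precise result.
interface Tool {
  readonly name: string;
  apply(target: Shape, x: number, y: number): void;
}

class Brush implements Tool {
  readonly name = "brush";
  constructor(private color: string) {}
  apply(target: Shape, x: number, y: number): void {
    target.strokeColor = this.color;
    target.points.push({ x, y }); // the brush adds material to the shape
  }
}

class Eraser implements Tool {
  readonly name = "eraser";
  apply(target: Shape, x: number, y: number): void {
    // the eraser removes points close to the contact position
    target.points = target.points.filter((p) => Math.hypot(p.x - x, p.y - y) > 10);
  }
}

// The user manipulates the tool; the tool transforms the target.
function onContact(tool: Tool, target: Shape, x: number, y: number): void {
  tool.apply(target, x, y);
}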

Tools and Bottom-Up approach
The concept of tools also has an essential role to play in what we have called Bottom-Up Architecture, and tools should find a special place in the implementation of the concept of Operators.

To conclude
The concept of manipulation still seems to have a bright future. Beyond taking additional physical parameters such as pressure into account (which depends on the available hardware), the development of reified tools could be a promising path, especially in the context of a NUI-oriented approach.
