From: Lorenzo Pastrana <pastrana@ultraflat.net>
Date: July 25, 2003 9:22:07 PM EDT
To: 3DUI <3d-ui@hitl.washington.edu>
Subject: Re: Ring through Wire VR Research


http://www.digitalartforms.com/index/IndexLarge.wmv
Sorry for the format.. :/ it's the only one available.

Lo.

PS: Actually I was quite impressed by the InDex tool they're presenting...
If anyone has comments / criticism, or even a similar / alternative example,
I'll be glad to check it out.


PPS:

As a matter of fact, what impressed me even more than the two-handed
navigation and manipulation (which does look quite fluid and intuitive) is
the ability of the application to capture the operator's 'possible
intentions' with singular accuracy.

I'm conscious that 3D modeling has _lots_ of 'notable' decision points (I
mean: axis-aligned, centered, bisector, orthogonal.. or whatever geometrical
singularity), and this shows up in a very spectacular way when chopping or
assembling parts. Much more than in some other contexts, say, farm
management software. But the snapping is so omnipresent and clever that it
produces a supernatural feeling.. which I think could be extended to other
manipulation-intensive situations.
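To make the idea concrete, here is a minimal sketch of what such
singularity-aware snapping might look like: each candidate (endpoint,
midpoint, axis alignment) gets a weight, and the best candidate within a
snap radius wins. The candidate names, weights, and radius are my own
illustrative assumptions, not anything shown in the video.

```python
import math

def snap(point, segment_a, segment_b, radius=0.5):
    """Snap `point` to the nearest geometric singularity of a segment,
    or return it unchanged if nothing is within `radius`.
    Weights are hypothetical: they bias 'stronger' snaps like midpoints."""
    ax, ay = segment_a
    bx, by = segment_b
    candidates = [
        ("endpoint-a", (ax, ay), 1.0),
        ("endpoint-b", (bx, by), 1.0),
        ("midpoint", ((ax + bx) / 2, (ay + by) / 2), 0.8),
        # axis-aligned with endpoint a (shared x or shared y)
        ("axis-x", (ax, point[1]), 0.5),
        ("axis-y", (point[0], ay), 0.5),
    ]
    best = None
    for name, pos, weight in candidates:
        d = math.dist(point, pos)
        if d > radius:
            continue
        score = d / weight  # smaller is better; weight favors strong snaps
        if best is None or score < best[2]:
            best = (name, pos, score)
    return (best[0], best[1]) if best else (None, point)

# A point dragged near the midpoint of the segment (0,0)-(2,0):
print(snap((1.1, 0.2), (0, 0), (2, 0)))  # -> ('midpoint', (1.0, 0.0))
```

The interesting design question is exactly the one raised above: how the
weights and radii encode the operator's 'possible intentions'.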

I was wondering if there is some sensible way/tool/diagram to describe this
particular kind of behaviour-response. Are there data models that allow
that kind of flexibility? I can think of some task-oriented decision trees,
from gross to fine; something like Doug's taxonomy
<http://people.cs.vt.edu/~bowman/images/taxonomy.jpg> in an operational
fashion instead of a descriptive one, but there might be other methods.
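One possible shape for such an operational, gross-to-fine decision tree: each
node tests a predicate on the interaction context and refines toward a
concrete snap or operation. This is purely a sketch of the data model I have
in mind; the node labels and context keys are invented for illustration, not
taken from any published taxonomy.

```python
class Node:
    """A node in a gross-to-fine decision tree for interpreting a gesture."""

    def __init__(self, label, predicate=None, children=()):
        self.label = label
        self.predicate = predicate or (lambda ctx: True)
        self.children = list(children)

    def resolve(self, ctx):
        """Descend through the first matching child at each level,
        returning the path of labels from gross intent to fine operation."""
        path = [self.label]
        for child in self.children:
            if child.predicate(ctx):
                return path + child.resolve(ctx)
        return path

# Gross level: what is the hand near? Fine level: which singularity applies?
tree = Node("manipulate", children=[
    Node("near-edge", lambda ctx: ctx["target"] == "edge", children=[
        Node("snap-midpoint", lambda ctx: ctx["near_midpoint"]),
        Node("snap-endpoint", lambda ctx: not ctx["near_midpoint"]),
    ]),
    Node("near-face", lambda ctx: ctx["target"] == "face", children=[
        Node("snap-center", lambda ctx: ctx["near_center"]),
    ]),
])

print(tree.resolve({"target": "edge", "near_midpoint": True}))
# -> ['manipulate', 'near-edge', 'snap-midpoint']
```

The appeal of this kind of model is that the same tree is both a description
of the behaviour (Doug's taxonomy read operationally) and the dispatch logic
the application actually runs.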

If some of you are inclined to discuss this, or to post some references at
the specification or implementation level, I'll be happy to read them.

Thanks.

Lo.