Bibliography on Spatial Interfaces
(converted to HTML format from Ken Hinckley's spatial bibliography)
This is a fairly comprehensive, but far from
complete, bibliography of
materials related to 3D interaction, based on the references list of
our paper "A Survey of Design Issues in Spatial Input" (Hinckley,
Pausch, Goble, Kassell) as published in the ACM UIST'94 Symposium on
User Interface Software & Technology, 1994, pp. 213-222.
(16-21 August 1996). Symposium on "Human Bimanual Specialization: New Perspectives for Basic Research and ..."
Abstract listing for presentations to be given at the symposium.
(1990). The Art of Human-Computer Interface Design. B. Laurel and S. Mountford. Reading, MA.
(1995). Virtual Reality: Scientific and Technological Challenges.
Committee on Virtual Reality Research and Development, National Research Council.
(Feb. 1993). Three Views of Virtual Reality. Computer.
(Nov. 1994). Frederick P. Brooks, Jr. Receives ACM Allen Newell Award. Computer Graphics 28(4).
News item along with an adaptation of Brooks's SIGGRAPH '94 acceptance speech.
Abel, K., M. Alic, et al. Rapid Construction of Model Building for Urban Combat Mission Rehearsal Databases (PolyShop TM).
Available at http://www.vsl.ist.ucf.edu/~polyshop/polyshop.html
Adelstein, B., E. Johnston, et al. A Testbed for Characterizing Dynamic Response of Virtual Environment Spatial Sensors. Proc. UIST '92.
This paper describes a testbed for measuring the latency of spatial sensors, but unlike Liang et al. [UIST '91 paper] it does not suggest specific filtering methods. Unlike previous related work, this study measures the performance of the sensor alone. Factors such as code execution time, inter-process communication time, and rendering time do not distort the results.
Agronin, M. The Design of a Nine-String Six-Degree-of-Freedom Force-Feedback Joystick for Telemanipulation. Proc. NASA Workshop on Space Telerobotics, 1987, pp.341-348.
Haptic displays: a six-degree-of-freedom force-feedback joystick. The paper explains the joystick and how it works, and goes through the physics equations for its motion very painlessly.
Annett, J., M. Annett, et al. (1979). The Control of Movement in the Preferred and Non-Preferred Hands. Quarterly Journal of Experimental Psychology 31: 641-652.
Hand comparison in a unimanual peg-board transfer task. Found that the difference between hands was greater at smaller tolerances; suggests that the nonpreferred hand is simply noisier than the preferred hand.
Badler, N. I., K. H. Manoochehri, et al. (October 1986). Multi-Dimensional Input Techniques and Articulated Figure Positioning by Multiple Constraints. Proc. 1986 ACM Workshop on Interactive 3D Graphics. Chapel Hill, NC: 151-170.
This paper describes an attempt to add multi-dimensional input, using a Polhemus tracker, to an early version of Badler's "Jack" articulated figure positioning system. The Polhemus ("wand") was used in two modes: absolute and relative. Absolute positioning was fatiguing. Relative motion allowed the user to move the wand (by releasing the button) when an uncomfortable position was reached. Orientation was always absolute. The implementors thought that the consistent coordinate systems of the wand and their "test scene" would allow intuitive movement, but this was not true. Lack of depth perception ("spatial feedback") on the 2D display made it difficult to select a target; also, simultaneously positioning and orienting the wand proved to be challenging. They tried decoupling wand parameters, but results were still not satisfactory. Using the wand to position a virtual camera was more successful but it was still a consciously calculated process. The implementors found that using a real object as a spatial reference for 3D wand interactions yielded a "natural and effortless" interface. The real object provides the true depth information lacking in the 2D display.
Bajura, M., H. Fuchs, et al. (July 1992). Merging Virtual Objects with the Real World: Seeing Ultrasound Imagery within the Patient. Proc. ACM SIGGRAPH '92, Computer Graphics 26 (2): 203-210.
The paper describes live ultrasound echography visualization (via HMD) within a pregnant human subject. 2D slices of the ultrasound data are shown frozen in space. Ultrasound is used because it is the best real-time imaging technology, but several problems are noted: 1) low signal-to-noise ratio; 2) poor spatial resolution; 3) "speckle" from tissue sound interference. Initial tests of the system were done on dolls in water tanks. The ultrasound scanner is fitted with a Polhemus tracker. An initial calibration step (using a calibration "jig") is necessary before the system can be used. Technical problems include: 1) conflicting visual cues (can't see data "inside" the patient; a virtual hole can obscure the patient); 2) system lag; 3) tracking range & stability; 4) HMD resolution; 5) display hardware.
Barfield, W., G. Salvendy, et al. (1989). An Analogue and Propositional Hybrid Model for the Perception of Computer Generated Graphical Images. Behavior and Information Technology 8(4): 257-272.
An updated Shepard-Metzler mental rotation experiment.
Barfield, W., J. Sanford, et al. (1988). The Mental Rotation and Perceived Realism of Computer-Generated Three-Dimensional Images. International Journal of Man-Machine Studies 29(6): 669-684.
Baudel, T. and M. Beaudouin-Lafon (1993). Charade: Remote Control of Objects Using Hand Gestures. Communications of the ACM 36(7): 28-35.
Talks about a gesture recognition system for giving computer-driven presentations. The gestures are fairly few in number and generally not that complicated -- they "win" by splitting the gestures into three phases: start position, dynamics, end position. The dynamics are typically a direction of motion and/or finger bending. Makes a good point about using tension (a la Buxton) to phrase the gestures: start in a tense state (you're explicitly indicating that you want to do something) and end in a relaxed state (you naturally relax your arm as a gesture comes to a close).
Beaton, R. J., R. J. DeHoff, et al. (January 1987). An Evaluation of Input Devices for 3-D Computer Display Workstations. Proc. of SPIE - The International Society for Optical Engineering 761: 237-244.
Describes a user study (16 subjects) testing a 3D positioning task using a 3D trackball (free-space movements), a mouse (three buttons used as mode control for motion in the three orthogonal planes), and a custom thumbwheel device (three wheels, one-handed control, arranged to correspond to the orientation of the display's coordinate system). Output strategies were: perspective encoding of depth and field-sequential stereoscopic encoding of depth. Thumbwheels yielded a more than two-fold increase in positioning accuracy as compared to the other devices. The stereoscopic display reduced positioning error by about 60%. Also, the relative differences between input devices varied across the display conditions, but in general positioning accuracy increased 51-60% with the stereoscopic display. Positioning time: the time associated with the mouse was longer than the other two devices. Positioning with either the trackball or the thumbwheels was about 23% faster.
Becker, S. C., W. A. Barrett, et al. (April 1991). Interactive Measurement of Three-Dimensional Objects Using a Depth Buffer and Linear Probe. ACM Transactions on Graphics 10(2): 200-207.
Interesting graphics hack to perform 3D measurements based only on the z-buffer information. The application shown is mensuration of a skull data set.
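The underlying operation -- recovering a 3D point from a screen pick plus its stored depth value -- can be sketched roughly as follows (function names and the matrix convention here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def unproject(px, py, depth, inv_viewproj, width, height):
    """Map a pixel and its z-buffer depth back to a 3D world-space point.

    px, py:       pixel coordinates
    depth:        z-buffer value in [0, 1]
    inv_viewproj: inverse of the 4x4 view-projection matrix (assumed convention)
    """
    # Pixel coordinates -> normalized device coordinates in [-1, 1].
    ndc = np.array([2.0 * px / width - 1.0,
                    1.0 - 2.0 * py / height,
                    2.0 * depth - 1.0,
                    1.0])
    world = inv_viewproj @ ndc
    return world[:3] / world[3]  # perspective divide

def measure(p1, p2):
    """Euclidean distance between two recovered surface points."""
    return float(np.linalg.norm(np.asarray(p1) - np.asarray(p2)))
```

Two such unprojected picks give a 3D distance measurement without touching the original geometry, which is the appeal of the z-buffer approach.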
Bergman, L., H. Fuchs, et al. Image Rendering by Adaptive Refinement. Computer Graphics (Proc. ACM SIGGRAPH '86) 20(4): 29-37.
Bier, E. A. Snap-Dragging In Three Dimensions. Computer Graphics (Proc. 1990 Symposium on Interactive 3D Graphics) 24(2): 193-204.
Bier, E. A. (October 1986). Skitters and Jacks: Interactive 3D Positioning Tools. Proc. 1986 ACM Workshop on Interactive 3D Graphics, Chapel Hill, NC: 183-196.
Describes an early version of Bier's "Gargoyle 3D" system. The interactive techniques are primarily geared towards scene composition, including precise placement of objects using affine transforms. Anchors: a "hot spot" used, for example, to select an axis of rotation. End conditions: e.g., the number of degrees to rotate. Jacks: Cartesian coordinate frames used to describe anchors & end conditions. Skitter: a 3D cursor (an interactively positioned jack). Uses a gravity function for effective 3D point selection.
Bier, E. A. and M. C. Stone (1986). Snap-Dragging. Computer Graphics (Proc. ACM SIGGRAPH '86) 20(4): 233-240.
Describes an updated version of Bier's skitters and jacks technique.
Bier, E. A., M. C. Stone, et al. (Aug. 1993). Toolglass and Magic Lenses: The See-Through Interface. SIGGRAPH '93, Computer Graphics: 73-80.
Two-handed interaction: The user moves a transparent tool sheet using a trackball with their non-dominant hand, and can "click through" the sheet using the mouse in their dominant hand.
Bill, J. R. and S. Lodha Sculpting Polygonal Models using Virtual Tools. Graphics Interface '95.
Blanchard, C., S. Burgess, et al. Reality Built for Two: A Virtual Reality Tool.
Don't currently have publication information for this; it's a two-page description that I think appeared in SIGGRAPH or perhaps one of the "Symposium on Interactive 3D Graphics" series.
Blinn, J. (July 1995). How to Attend a Siggraph Conference. IEEE CG&A.
Bolt, R. A. (August 1980). Put-That-There: Voice and Gesture at the Graphics Interface. Computer Graphics, New York, ACM.
speech+gesture input: Discusses early research on multimodal computer input by the group that was the precursor to the MIT media lab.
Bolt, R. A. and E. Herranz (1992). Two-Handed Gesture in Multi-Modal Natural Dialog. Proc. ACM SIGGRAPH Symposium on User Interface Software and Technology: 7-13.
Borenstein, N. S. (1991). Programming as if People Mattered: Friendly Programs, Software Engineering, and Other Noble Delusions, Princeton University Press, 41 William Street, Princeton, NJ 08540.
Boritz, J., K. S. Booth, et al. (1991). Fitts' law studies of directional mouse movement. Proceedings of Graphics Interface '91, Toronto: Canadian Information Processing Society: 216-223.
Bos, E. (1992). Some Virtues and Limitations of Action Inferring Interfaces. Proc. ACM SIGGRAPH Symposium on User Interface Software and Technology: 79-88.
A system that records how the user is manipulating files and tries to infer patterns. For example, if you keep dragging files to the same folder it will guess that you want to move all the files to the folder.
Brett, C., S. Pieper, et al. (Oct. 1987). Putting It All Together: An Integrated Package for Viewing and Editing 3D Microworlds. Fourth Usenix Computer Graphics Workshop: 2-12.
Britton, E., J. Lipscomb, et al. (1978). Making Nested Rotations Convenient for the User. Computer Graphics 12(3): 222-227.
Brooks, F. P. Grasping Reality Through Illusion: Interactive Graphics Serving Science. Proc. ACM CHI'88 Human Factors in Computing Systems Conference: 1-11.
A very good paper with many useful insights on varying topics in 3D interaction / virtual reality. Includes his "shells-of-certainty" model for user interface research.
Brooks, F. P. J. (1975). . The Mythical Man-month: Essays on Software Engineering, Addison-Wesley Publishing Co. Inc.
Brooks, F. P. J. (October 1986). Walkthrough--a Dynamic Graphics System for Simulating Virtual Buildings. Proc. 1986 ACM Workshop on Interactive 3D Graphics. Chapel Hill, NC: 9-21.
Brunn, A., K. Lay, et al. (1987). An Interactive 3D-Graphics User Interface for Engineering Design. Proc. IFIP INTERACT'87: Human-Computer Interaction: 677-682.
Describes a mouse and keyboard based interface to a CAD system.
Bryson, S. (May 1996). Virtual Reality in Scientific Visualization. Communications of the ACM 39(5): 62-71.
Bryson, S. and C. Levit (July 1992). The Virtual Wind Tunnel. IEEE Computer Graphics & Applications.
Describes an interface which allows the user to look into a pre-computed volume using a boom display and a glove to interact with the data.
Butterworth, J., A. Davidson, et al. 3DM: A Three Dimensional Modeler Using a Head-mounted Display. Proc. 1992 Symposium on Interactive 3D Graphics.
Describes a 3D CAD system for use in a HMD. Has support for multiple navigation models: User "growing" and "shrinking" to allow work at multiple levels of detail; Walking (only within tracker range); Flying; Grabbing the world (dragging & rotating). "Since the user can become disoriented by all of these methods of movement, there is a command that immediately returns the user to the initial viewpoint in the middle of the modeling space." Uses rubber banding and predictive highlighting (e.g. gravity and plane/grid snapping) to aid in object selection. Simultaneous translation and rotation is helpful because it "concentrates more functionality into each operation" (thus saving time by requiring fewer total operations).
Buxton, B. Integrating the Periphery and Context: A New Model of Telematics. Graphics Interface '95: 239-246.
Discusses foreground/background processing for human-human and human-computer interaction and context-sensitive interaction.
Buxton, B. Living in Augmented Reality: Ubiquitous Media and Reactive Environments.
Manuscript in preparation.
This paper talks about ubiquitous computing, ubiquitous video, and proximal sensing, citing lots of interesting examples.
Buxton, B. (Monday, June 12, 1995). If you have a problem with a computer, is it really your problem? The Globe and Mail, Toronto, Ontario, Canada.
Buxton, W. Human skills in interface design. Interacting With Virtual Environments. L. MacDonald and J. Vince. New York, Wiley: 1-12.
Proposes a "three mirrors" design model. Technology reflects human capabilities (physical, cognitive, and social) and should be evaluated by how well these reflections match the extent of human ability.
Buxton, W. The Pragmatics of Haptic Input. CHI'90 Tutorial 26 Notes.
Contains a comprehensive list of input devices and vendors.
Buxton, W. University of Toronto Input Research Group (IRG).
Overview + publications list
Buxton, W. (1983). Lexical and Pragmatic Considerations of Input Structure. Computer Graphics 17(1): 31-37.
Buxton, W. (1986). There's More to Interaction Than Meets the Eye. User Centered System Design: New Perspectives on Human-Computer Interaction. D. Norman and S. Draper. Hillsdale, N.J.: 319-337.
Buxton, W. (1986). Chunking and Phrasing and the Design of Human-computer Dialogues. Information Processing '86, Proc. of the IFIP 10th World Computer Congress. H. J. Kugler. Amsterdam: 475-480.
Describes how muscular tension can be used to phrase elements of dialogs together (example: pop-up menu: holding mouse button down phrases bringing up the menu and menu item selection).
Buxton, W. (1995). Speech, Language and Audition. Readings in Human-Computer Interaction: Toward the Year 2000. R. Baecker, J. Grudin, W. Buxton and S. Greenberg, Morgan Kaufmann Publishers.
Chapter 8. Great overview of the important issues in speech and natural language interfaces, speech synthesis, speaker recognition, natural language recognition and generation, multimodal interaction, nonspeech audio, and the logistics of using sound on computers, with excellent references.
Buxton, W. (1995). Touch, Gesture, and Marking. Readings in Human-Computer Interaction: Toward the Year 2000. R. Baecker, J. Grudin, W. Buxton and S. Greenberg, Morgan Kaufmann Publishers.
Chapter 7. An excellent overview including device capabilities, taxonomy of input devices, chunking and phrasing, marking, gestures, and two handed input. Lots of good references to key papers in the area.
Buxton, W. (Thursday, Aug. 31, 1995). Is it Windows 95 or 85? Teasing reality from the hype. The Globe And Mail, Toronto, Ontario, Canada.
Buxton, W., E. Fiume, et al. Continuous hand-gesture driven input. Proceedings of Graphics Interface '83: 191-195.
Describes a sketch editor; all commands are implemented using simple gestures in combination with 1 button on the puck. The implications of well-chosen gestures are listed. The disparate gestures for each part of the syntax are designed to produce smooth, continuous motions. Most frequent commands are given the simplest gestures, i.e. sketching.
Buxton, W., R. Hill, et al. (July 1985). Issues and Techniques in Touch-Sensitive Tablet Input. Computer Graphics 19(3): 215-224.
A good discussion of the issues involved in touch-sensitive input devices. The technology level described in the paper has probably advanced considerably since 1985, though. Main advantages noted:
* No mechanical intermediate device (direct hand contact).
* The tracking symbol "stays put" once placed -- you don't have to worry about "bumping the mouse".
* No mechanical / kinesthetic restrictions on the ability to indicate more than one point at once.
* Low profile, allowing easy integration into desktops / other equipment.
* One-piece construction -- dirt doesn't get in the cracks.
* No moving parts -- reliable.
Perhaps the biggest combined plus is that they can save screen real estate, and in combination with a physical template, can be operated by the user while attention is focused on the screen. One general comment is that it is preferable to implement valuators that sense *change* in position as opposed to *absolute* position. The chief difficulty is probably that there is no direct physical or visual feedback (e.g., when positioning a valuator) provided by the tablet surface.
Buxton, W. and B. Myers (1986). A Study in Two-Handed Input. Proc. ACM CHI'86 Conference on Human Factors in Computing Systems: 321-326.
A classic user study which showed that users can improve performance by using two hands to operate an interface, without necessarily experiencing any cognitive load.
Card, S., J. Mackinlay, et al. (1990). The Design Space of Input Devices. Proc. ACM CHI'90 Conference on Human Factors in Computing Systems: 117-124.
Card, S. K., J. D. Mackinlay, et al. (April 1991). A Morphological Analysis of the Design Space of Input Devices. ACM Transactions on Information Systems 9(2): 99-122.
Card, S. K., G. G. Robertson, et al. (1991). Proc. ACM CHI'91 Conference on Human Factors in Computing Systems: 181-187.
Caudell, T. P. and D. W. Mizell. Augmented Reality: An application of heads-up display technology to manual manufacturing processes. Proc. HICSS '92.
Chatty, S. Extending a Graphical Toolkit for Two-Handed Interaction. ACM UIST'94 Symp. on User Interface Software & Technology: 195-204.
Chatty, S. Issues and Experience in Designing Two-handed Interaction. CHI'94 Conference Companion.
Also available at http://www.cenatls.cena.dgac.fr/English/pii/Chatty.html
Chen, M., S. J. Mountford, et al. (August 1988). A Study in Interactive 3-D Rotation Using 2-D Control Devices. Computer Graphics 22(4): 121-129.
Chen studies four methods for using 2D input to rotate 3D objects:
1. Graphical sliders: a simple arrangement of horizontal sliders, one each for x, y, and z rotations.
2. Overlapping sliders: uses vertical/horizontal mouse movement to control x and y rotations, while circular movement means z rotation.
3. Continuous XY + Z.
4. Virtual Sphere.
Chen's user study indicated that the Virtual Sphere technique achieved the best results. He also compared the Virtual Sphere with a similar technique developed by Evans et al. [Evans 81]; no significant difference was found in mean time to complete simple or complex rotations, but users preferred the Virtual Sphere controller. The paper includes an appendix which describes the implementation of the Virtual Sphere in detail.
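The Virtual Sphere maps a 2D drag to a 3D rotation by treating the cursor motion as rolling a sphere surrounding the object. A minimal arcball-style sketch in the spirit of Chen's controller (not his exact implementation, which is given in the paper's appendix):

```python
import numpy as np

def sphere_point(x, y, r=1.0):
    """Lift a 2D cursor position (sphere-centered coords) onto a virtual sphere."""
    d = np.hypot(x, y)
    if d < r:
        z = np.sqrt(r * r - d * d)  # inside the silhouette: a point on the sphere
        return np.array([x, y, z])
    # Outside the silhouette: clamp to the equator (yields pure z-axis rotation).
    return np.array([x * r / d, y * r / d, 0.0])

def drag_rotation(p0, p1):
    """Axis-angle rotation carrying sphere point p0 to sphere point p1."""
    axis = np.cross(p0, p1)
    n = np.linalg.norm(axis)
    if n < 1e-9:
        return np.array([0.0, 0.0, 1.0]), 0.0  # negligible motion
    cos_a = np.clip(np.dot(p0, p1) /
                    (np.linalg.norm(p0) * np.linalg.norm(p1)), -1.0, 1.0)
    return axis / n, float(np.arccos(cos_a))
```

Dragging across the sphere's interior gives x/y rotation, while dragging around its rim gives z rotation, which is why a single 2D device can cover all three rotational degrees of freedom.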
Chung, J. C. A Comparison of Head-tracked and Non-head-tracked Steering Modes in the Targeting of Radiotherapy Treatment Beams. Proc. 1992 Symposium on Interactive 3D Graphics: 193-196 (color plate p. 232).
This study compares four head-tracked and three non-head-tracked modes for changing position and orientation in the virtual world. Taken as a whole, head-tracked and non-head-tracked modes "differed very little". The test model was an abstract model consisting of colored spheres and a central target region. The user tried to find the best beam path to the target, which was defined as the beam path with minimum intersection of the beam and the spheres. All interaction modes were displayed on a HMD. (N=14 subjects)
Chung, J. C. (1991). Application of Head-Mounted Display to Radiotherapy Treatment Planning. Proceedings of ACM CHI'91 Conference on Human Factors in Computing Systems: 489.
This is just a paragraph describing Mr. Chung's research.
Chung, J. C., M. R. Harris, et al. (January 15-20, 1989). Exploring Virtual Worlds with Head-Mounted Displays, Non-Holographic True 3-Dimensional Display Technologies. SPIE Proceedings, 1083.
Co., S. S.
Motion detectors (linear motion potentiometers, joysticks, etc.) & robotic components.
Cohen, D. and A. Kaufman Scan-Conversion Algorithms for Linear and Quadratic Objects. Volume Visualization: 280-301.
Cohen, M., J. Painter, et al. Volume Seedlings. Computer Graphics (Proc. 1992 Symposium on Interactive 3D Graphics).
Cohen, P. R. (1992). The Role of Natural Language in a Multimodal Interface. ACM UIST'92 Symp. on User Interface Software & Technology: 143-149.
Cohen, P. R. and J. W. Sullivan (1989). Synergistic Use of Direct Manipulation and Natural Language. Proc. ACM CHI'89 Conference on Human Factors in Computing Systems: 227-233.
Conner, D., S. Snibbe, et al. Three-Dimensional Widgets. Computer Graphics (Proc. 1992 Symposium on Interactive 3D Graphics).
Conway, M., R. Pausch, et al. (1994). Alice: A Rapid Prototyping System for Building Virtual Environments. Proceedings of ACM CHI'94 Conference on Human Factors in Computing Systems. 2: 295.
Corporation, A. T.
Bird input device
Corporation, T.
Ribbon & area switches
Cremer, M. and R. Ashton (1981). Motor performance and concurrent cognitive tasks. Journal of Motor Behavior 13: 187-196.
Experiment measures speed and consistency of tapping during concurrent verbal or visuospatial tasks. "Concurrent speech activity will selectively disrupt right-hand but not left-hand performance." -- speaking a rhyme interferes with right-hand performance; a visuospatial task interferes with left-hand performance.
Cruz-Neira, C., D. Sandin, et al. (Aug. 1993). Surround-Screen Projection-Based Virtual Reality: The Design and Implementation of the CAVE. Computer Graphics (SIGGRAPH Proceedings).
Describes a VR system based on projection of images onto walls which surround the user.
Darken, R. P. Wayfinding Strategies and Behaviors in Large Virtual Worlds. Proceedings of ACM CHI'96 Conference on Human Factors in Computing Systems.
Darken, R. P. and J. L. Sibert (1993). A Toolset for Navigation in Virtual Environments. Proc. ACM User Interface Software & Technology: 157-165.
Darken, R. P. and J. L. Sibert (Oct. 1995). Navigating Large Virtual Spaces. International Journal of Human-Computer Interaction.
Das, H., T. B. Sheridan, et al. (1989). Kinematic Control and Visual Display of Redundant Teleoperators. IEEE Systems Man and Cybernetics.
Deering, M. (July 1992). High Resolution Virtual Reality. Computer Graphics 26(2): 195-202.
Talks about a desktop VR system which allows the user to work with a 3D tracker in a volume stereoscopically projected in front of the monitor. Good description of the math for head tracking. Also talks about taking into account the user's actual eye position and distortions caused by the monitor glass.
Deering, M. (May 1996). The HoloSketch VR Sketching System. Communications of the ACM 39(5): 54-61.
DeLine, R. (May 1993). Alice: A Rapid Prototyping System for Three-Dimensional Interactive Graphical Environments, University of Virginia.
Dillon, R. F., J. D. Eday, et al. (1990). Measuring the True Cost of Command Selection: Techniques and Results. Proc. of ACM CHI'90 Conference on Human Factors in Computing Systems.
Purpose: 1) develop paradigm for estimates of command selection time ("subtraction technique" for measuring multimodal selection cost); 2) explore different selection methods. Input methods analyzed include: 1) mouse (used to draw & select commands), 2) voice commands, 3) touch (touchscreen, nonpreferred hand), 4) mouse (nonpreferred hand, small menu items), 5) mouse (nonpreferred hand, large menu items). In items 2-5, the preferred hand held a mouse used exclusively for drawing. Practice was allowed to achieve maximum speed and accuracy. The voice and touch methods were fastest. The various mouse-based options were essentially equivalent. The least errors were observed with voice, touch, and the large-menu mouse. Other conclusions: Non-preferred hand can be used as well as preferred hand for command selection. Voice advantage over touch: No attention to menu is necessary if user can remember the command names.
Driver, J., R. Read, et al. (August 1990).
Computer Science Department, University of Texas at Austin; available as Technical Report TR-90-29. Describes a user study with a small number of subjects.
Edin, B. B., R. Howe, et al. (March/April 1993). A Physiological Method for Relaying Frictional Information to a Human Teleoperator. IEEE Trans. on Systems, Man, and Cybernetics 23(2): 427-432.
Edwards, B.
Hand-out from her talk at CHI'96. "The Five Basic Perceptual Skills of Drawing"
Emmerik, M. J. G. M. v. (1990). A Direct Manipulation Technique for Specifying 3D Object Transformation with a 2D Input Device. Computer Graphics Forum 9: 335-361.
English, E. (Feb. 1995). Touch-screen technology takes off. IEEE Computer.
Current touchscreen market: 150-200 million; predicts 25% annual increase; fastest increase in info kiosks and point-of-sale. Note on "ThruGlass" technology: detects touch through up to 2 inches of glass, wood, or plexiglass (nonconductive materials).
Evans, K. B., P. P. Tanner, et al. (August 1981). Tablet-based Valuators that Provide One, Two, or Three Degrees of Freedom. Computer Graphics 15(3): 91-97.
Describes various ways of mapping stylus motion to valuators. One of their 3 DOF techniques is similar to the Virtual Sphere; Chen compares it to the Virtual Sphere in his paper [Chen 1988]. Evans et al. also discuss an automatic vernier motion (fine positioning) technique.
Brooks, F. P., Jr., M. Ouh-Young, et al. (1990). Project GROPE--Haptic Displays for Scientific Visualization. Computer Graphics (Proc. ACM SIGGRAPH '90) 24(4): 177-185.
Describes a long-term research effort into haptic ("pertaining to sensations such as touch, temperature, pressure, etc. mediated by skin, muscle, tendon, or joint") displays for molecular docking. Interesting as an example of how to develop a system for real users. Haptic displays are of limited application, but when they are applicable, a performance increase of approximately 2x is measured over pure visual stimuli. Some interesting results on 3D/6D manipulation:
* Users of an imperfect-perception visual system tend to decompose three-dimensional positioning tasks into several separate subtasks, each of lower dimensionality.
* Even in real space, subjects usually decompose 6D docking tasks into 3D positionings alternating with 3D rotations. More than 2D motions are rarely observed in virtual space.
Fahlen, L. E., C. G. Brown, et al. (1993). A Space Based Model for User Interaction in Shared Synthetic Environments. Proc. ACM INTERCHI'93 Conference on Human Factors in Computing Systems.
Talks about using the "aura" around a user as a way to support multi-user interaction in VR. An example is using proximity to "enable" use of a whiteboard tool.
Feiner, S., B. MacIntyre, et al. Windows on the World: 2D Windows for 3D Augmented Reality. ACM UIST'93 Symp. on User Interface Software & Technology: 145-155.
Feiner, S., B. MacIntyre, et al. (1993). Knowledge-Based Augmented Reality. Communications of the ACM 36(7): 53-61.
Describes a system which employs a see-through head mounted display (augmented reality) and projects wireframe graphics onto objects in the real world. An example given is an application which overlays a laser printer with wireframe information to help the user perform maintenance tasks. The head mount is constructed using a Private Eye.
Feiner, S. and A. Shamash (November 11--13, 1991). Hybrid User Interfaces: Breeding Virtually Bigger Interfaces for Physically Smaller Computers. Proc. UIST '91 (ACM Symp. on User Interface Software and Technology). Hilton Head, SC: 9-17.
Fisher, S. S., M. McGreevy, et al. (October 1986). Virtual Environment Display System. Proc. 1986 ACM Workshop on Interactive 3D Graphics. Chapel Hill, NC: 77-87.
An excellent piece of early virtual reality research. NASA Telepresence research. Not mentioned in the text, but clearly the authors envisioned two-handed manipulation (along with voice input and 3D localized sound): see figure, p. 84.
Fisher, S. S., M. McGreevy, et al. (October 1988). Virtual Interface Environment for Telepresence Applications. Proceedings of the Human Factors Society 32nd Annual Meeting.
NASA Telepresence research
Fitts, P. (1954). The information capacity of the human motor system in controlling the amplitude of movement. Journal of Experimental Psychology 47: 381-391.
Fitts, P. M. and J. R. Peterson (1964). Information capacity of discrete motor responses. Journal of Experimental Psychology 67: 103-112.
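Both of these entries underlie the standard Fitts' law model, which predicts movement time as MT = a + b * log2(2A/W), where A is the movement amplitude and W the target width. A minimal sketch (the constants a and b below are made-up placeholders for illustration; in practice they are fit by regression to data for a given device and task):

```python
import math

def index_of_difficulty(amplitude, width):
    """Fitts' (1954) index of difficulty, in bits: ID = log2(2A / W)."""
    return math.log2(2.0 * amplitude / width)

def movement_time(amplitude, width, a=0.05, b=0.12):
    """Predicted movement time MT = a + b * ID, in seconds.

    a, b are device/task constants; the defaults here are illustrative only.
    """
    return a + b * index_of_difficulty(amplitude, width)
```

Doubling the amplitude or halving the target width adds one bit of difficulty, and hence a constant increment b to the predicted movement time.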
Fitzmaurice, G., H. Ishii, et al. (1995). Bricks: Laying the Foundations for Graspable User Interfaces. Proc. ACM CHI'95 Conference on Human Factors in Computing Systems: 442-449.
Fitzmaurice, G. W. (1993). Situated Information Spaces and Spatially Aware Palmtop Computers. Communications of the ACM 36(7): 39-49.
Describes a hand-held monitor tracked in free space to view an imaginary 3D scene that surrounds the user. A button on top of the monitor is used to select commands.
Foley, J. D., A. van Dam, et al. (1990). Fundamentals of Interactive Computer Graphics. Reading, MA.
Foley, J. D., V. Wallace, et al. (Nov. 1984). The Human Factors of Computer Graphics Interaction Techniques. IEEE Computer Graphics and Applications.
Forrest, A. R. (October 1986). User Interfaces for Three-Dimensional Geometric Modeling. Proc. 1986 ACM Workshop on Interactive 3D Graphics. Chapel Hill, NC: 237-249.
Fraser, C. and J. de Fusco (Aug 1981). A Standardised Test of Hand Function. British Journal of Occupational Therapy.
(A new version of this test is in preparation.) A standardised test of hand function that involves picking up, screwing and unscrewing nuts and bolts in both unilateral and bilateral conditions. The test is timed and has norms for males and females in the age ranges 20 to 29, 30 to 39, 40 to 49, 50 to 59, 60 to 69 and 70 to 79. The test is used in Occupational Therapy departments to measure patients' performance after stroke and hand injury. It is available from Nottingham Rehab, Nottingham, NG2 6HD, UK. Tel ++ 0115 945 2345 / fax ++ 0115 945 2124.
Fuchs, H., M. Levoy, et al. (August 1989). Interactive Visualization of 3D Medical Data. Computer 22(8): 46-51.
A very good overview paper. Discusses issues in medical data visualization, including rendering techniques, display hardware, and future research.
Galyean, T. A. and J. F. Hughes (July 1991). Sculpting: An Interactive Volumetric Modeling Technique. Computer Graphics (Proc. ACM SIGGRAPH '91) 25(4): 267-274.
Describes a desktop Polhemus-based system which allows the user to interactively deform a volumetric model. Uses a "poor man's" force feedback system based on bungee cords to attempt some degree of haptic feedback. The user works in a fixed volume in front of the display.
Gibson, J. (1986). The Ecological Approach to Visual Perception.
Glassner, A. S. (1990). A Two-Dimensional View Controller. ACM Transactions on Graphics 9(1): 138-141.
Gleicher, M. (1993). A Graphics Toolkit Based on Differential Constraints. Proc. ACM Symposium on User Interface Software and Technology: 109-120.
A very readable, very interesting discussion of Gleicher's Bramble toolkit.
Gleicher, M. and A. Witkin Drawing with Constraints. The Visual Computer.
Abstract from draft version: "The success of constraint-based approaches to drawing has been limited by difficulty in creating constraints, solving them, and presenting them to users. In this paper, we discuss techniques used in the Briar drawing program to address all of these issues. Briar's approach separates the problem of initially establishing constraints from that of maintaining them during subsequent editing. We describe how non-constraint-based drawing tools can be augmented to specify constraints in addition to positions. These constraints are then maintained as the user drags the model, allowing the user to explore configurations consistent with the constraints. Visual methods are provided for displaying and editing the constraints."
Gleicher, M. and A. Witkin (1993). Supporting Numerical Computations in Interactive Contexts. Graphics Interface '93.
Describes Snap-Together Mathematics -- an approach for dynamically composing systems of equations and rapidly evaluating them and their derivatives.
Gleicher, M. and A. Witkin (July 1992). Through-the-Lens Camera Control. Computer Graphics (Proc. ACM SIGGRAPH '92) 26 (2): 331-340.
Goble, J. C., K. Hinckley, et al. (July 1995). Two-handed Spatial Interface Tools for Neurosurgical Planning. IEEE Computer.
Gossweiler, R., C. Long, et al. (October, 1993). DIVER: A Distributed Virtual Environment Research Platform. IEEE Symposium on Research Frontiers in Virtual Reality.
Green, M. (May 1996). A Geometric Modelling and Animation System for Virtual Reality. Communications of the ACM 39(5): 46-53.
Green, M. and C. D. Shaw (1990). The DataPaper: Living in the Virtual World. Proc. Graphics Interface '90: 123-130.
Group, U. U. I. (May 1995). Alice: Rapid Prototyping for Virtual Reality. IEEE Computer Graphics & Applications.
Randy Pausch: An early description of Alice.
Guiard, Y. (1987). Asymmetric Division of Labor in Human Skilled Bimanual Action: The Kinematic Chain as a Model. The Journal of Motor Behavior 19(4): 486-517.
Great reference -- analysis of how right-handed people coordinate two-handed motions in skilled tasks. Guiard proposes three high-order principles: 1) "motion of the right hand typically finds its spatial references in the results of motion of the left hand"; 2) The right and left hands are involved in asymmetric temporal-spatial scales of motion (right hand for high frequency, left hand for low frequency); 3) "the contribution of the left hand to global bimanual performance starts earlier than that of the right." Guiard hypothesizes that these principles can be accounted for by modelling the hands as a pair of abstract motors in a serial assemblage.
Guiard, Y. (1988). The Kinematic Chain as a Model for Human Asymmetrical Bimanual Cooperation. Cognition and action in skilled behavior. A. Colley and J. Beech, Amsterdam: North-Holland: 205-228.
Guiard, Y. (in press). The Distal-to-Proximal Increase of Link Length along the Human Kinematic Chain: An Exponential Progression of Workspace Extension. Annales de Sciences Naturelles.
Guiard, Y. (Spring 1989). Failure to Sing the Left-Hand Part of the Score during Piano Performance: Loss of the Pitch and Stroop Vocalizations. Music Perception 6(3): 299-314.
Guiard, Y., G. Diaz, et al. (1983). Left-hand advantage in right-handers for spatial constant error: Preliminary evidence in a unimanual ballistic aimed movement. Neuropsychologia 21: 111-115.
A brief note describing an experiment where "right-handers perform open-loop ballistic aimed movements with a smaller constant error when using the left hand."
Guiard, Y. and M. F (1984). Writing Postures in Left-Handers: Inverters are Hand-Crossers. Neuropsychologia 22(5): 535-538.
Guiard, Y. and T. Ferrand (In press as of Sept 21, 1995). Asymmetry in Bimanual Skills. Manual asymmetries in motor performance. D. Elliott and E. A. Roy, Boca Raton, FL: CRC Press.
Halpern, D. F. (1992). Sex differences in cognitive ability: Lawrence Erlbaum Associates, Inc.
(Chapter 5: sexual dimorphism in hemispheric specialization). The question of laterality is closely tied to hemispheric specialization, and thus, by some theories, is also tied to sex differences. Overall trends from many studies are that females perform better on some dexterity tasks (Chapter 3), and that males perform better on some spatial visualization tasks such as mental rotation. Handedness and Reasoning ability are important moderating variables.
Hand, C. .
An excellent resource for VR and 3D interaction information on the web. Includes pointers to many on-line bibliographies.
Hauptmann, A. G. (1989). Speech and Gestures for Graphic Image Manipulation. Proc. ACM CHI'89 Conference on Human Factors in Computing Systems: 241-245.
Describes a wizard-of-oz experiment in which test users attempted to perform three-dimensional manipulations using speech, or gestures, or both. Provides a characterization of the types of gestures that users will use spontaneously.
Heckbert, P. (Nov. 1986). Survey of Texture Mapping. IEEE Computer Graphics and Applications 6 (11): 56-67.
Herndon, K. and T. Meyer 3D Widgets for Exploratory Scientific Visualization. ACM UIST'94 Symp. on User Interface Software & Technology.
Discusses 3D widgets for exploring computational fluid dynamics datasets, including probe, rake, and hedgehog widgets.
Herndon, K., A. van Dam, et al. (Oct. 1994). The Challenges of 3D Interaction: A CHI'94 Workshop. SIGCHI Bulletin 26(4): 36-43.
Summarizes discussions held at the CHI'94 Workshop on 3D interaction. Covers a wide range of topics, including applications of 3D graphics, psychology and perception issues, state of the art work, and future research directions. Includes an excellent bibliography.
Herndon, K., R. Zeleznik, et al. (1992). Interactive Shadows. Proc. ACM SIGGRAPH Symposium on User Interface Software and Technology: 1-6.
Herot, C. and G. Weinzapfel (1978). One Point Touch Input of Vector Information from Computer Displays. Computer Graphics 12(3): 210-216.
Haptic Displays: Paper is about a screen which cannot only sense touch but also force and direction.
Hill, R. D. (April 1987). Adaptive 2-D Rotation Control. ACM Transactions on Graphics 6(2): 159-161.
The goal is to rotate objects rapidly and precisely to multiples of 90 degrees, while still allowing accurate selection of arbitrary rotations.
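Hill's actual adaptive control-display mapping is not reproduced here, but the core idea of combining snapped and free rotation can be sketched as an angular "gravity" toward 90-degree multiples. This is a hypothetical illustration; the function name and the `tolerance_deg` threshold are assumptions, not values from the paper.

```python
def snap_angle(angle_deg, tolerance_deg=5.0):
    """Snap an angle to the nearest multiple of 90 degrees when it
    falls within tolerance_deg of that multiple; otherwise pass the
    angle through unchanged, so arbitrary rotations stay selectable."""
    nearest = round(angle_deg / 90.0) * 90.0
    if abs(angle_deg - nearest) <= tolerance_deg:
        return nearest % 360.0
    return angle_deg % 360.0
```

With a small tolerance the user can still dial in any rotation, while near-axis-aligned drags land exactly on 90-degree multiples.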
Hinckley, K., M. Conway, et al. Revisiting Haptic Issues for Virtual Manipulation. .
Position statement for CHI'96 Workshop on Manipulation in Virtual Environments
Hinckley, K., J. Goble, et al. (1995). New Applications for the Touchscreen in 2D and 3D Medical Imaging Workstations. SPIE Conference on Medical Imaging.
Hinckley, K., R. Pausch, et al. (1994). Passive Real-World Interface Props for Neurosurgical Visualization. Proceedings of ACM CHI'94 Conference on Human Factors in Computing Systems. 1: 452-458.
Describes a desktop two-handed spatial interface which uses tools or "props" to allow neurosurgeons to visualize 3D MRI data.
Hinckley, K., R. Pausch, et al. (1994). A Survey of Design Issues in Spatial Input. ACM UIST'94 Symp. on User Interface Software & Technology: 213-222.
A literature survey and synthesis of design issues in 3D interfaces.
Hinckley, K., R. Pausch, et al. (1994). A Three-Dimensional User Interface for Neurosurgical Visualization. Proc. of the SPIE Conference on Medical Imaging: 126-136.
A description of the props interface intended for the medical imaging audience.
Hinckley, K. and M. O. Ward The Visual Comparison of Three Sequences. IEEE Visualization '91.
Hohne, K., M. Bomans, et al. (July 1992). A Volume-based Anatomical Atlas. IEEE Computer Graphics and Applications.
The authors describe a volume-based anatomical atlas of the human head (based on data from real patients). The ability to specify cutting planes is included to facilitate exploration of the volume. The authors claim that it is more natural for anatomists to peel away layers of related tissues than to just arbitrarily slice through the data. Thus, the cutting plane should penetrate only selected tissues, leaving the other "deeper" tissues untouched.
Honda, H. (1982). Rightward superiority of eye movements in a bimanual aiming task. . Quarterly Journal of Experimental Psychology 34A: 499-513.
For the easy task, there was frequently no visual monitoring at all; for the difficult task, a large preference for rightward eye movements. The nonpreferred hand can be better for tasks that don't require visual monitoring.
Honda, H. (1984). Functional between-hand differences and outflow eye position information. Quarterly Journal of Experimental Psychology 36A: 75-88.
Hoppe, H., T. DeRose, et al. (August 1993). Mesh Optimization. Computer Graphics -- Proc. SIGGRAPH '93.
A polygon reduction algorithm that optimally (in a least-squares sense) conserves the original triangle mesh topology. Appears to behave better than the "Polygonal Decimation" algorithm.
Hopper, A., A. Harter, et al. (1993). The Active Badge System. INTERCHI'93 Conference on Human Factors in Computing Systems: 474-481.
Brief summary of the Active Badge set-up.
Horton, W. (Nov. 1995). Top Ten Blunders by Visual Designers. Computer Graphics.
A decent overview of things to avoid.
Houde, S. Iterative Design of an Interface for Easy 3-D Direct Manipulation. Proc. ACM CHI'92 Human Factors in Computing Systems Conference: 135-142.
Handles on object for 3D manipulation; hand-shaped cursors suggest type of manipulation being performed. Must switch modes when going between translations and rotations.
Inc, T. S. .
Touch screen reseller. Installs the physical screen for you and provides driver software (including various UNIX / X-Windows drivers). Touchscreens from basically every manufacturer are available.
Inc., B. S. C. .
Ultra Miniature Micro Switches
Inc., I. T. .
Computer peripherals for Digital Input / Output, Analog to Digital (A/D) or Digital to Analog (D/A) conversion.
Incorporated, D. I. D. .
Markets a couple of interesting 3D input devices. The Cricket 3D Interaction Tool: held with a pistol-grip orientation in front of the screen. Provides operating buttons as well as "vibrating buttons in the palm area to provide the user with tactile feedback." Can be fitted with a variety of 3D input sensors. The Monkey: a human posture input device (an instrumented armature of a human figure). 170 Claremont Ave, New York, NY 10027. 212-222-5236.
Ingram, D. (1975). Motor asymmetries in young children. . Neuropsychologia 13: 95-101.
Study shows that 3, 4, & 5 year old [right-handed] children perform better on strength and tapping tasks with the right hand, but better on hand posture and finger spacing tasks with the left hand.
Interlink Electronics 500 Flynn Road, C., CA 93012 (805) 484 8855 .
Force Sensing Resistors (packaging is similar to membrane switch). Linear potentiometers, X/Y + Force touchpad also available.
Iwata, H. Artificial Reality with Force-feedback: Development of Desktop Virtual Space with Compact Master Manipulator. Computer Graphics 24(4): 165-170.
Jacob, R. J. K. and L. E. Sibert (1992). The Perceptual Structure of Multidimensional Input Device Selection. Proc. ACM CHI'92 Human Factors in Computing Systems Conference.
This study addresses the question: "What is a three-dimensional tracker good for?" The authors hypothesize that "the structure of the perceptual space of an interaction task should mirror that of the control space of its input device." Thus, a 3D tracker would be good for a task which involves the selection of three related ("integral") dimensions, but would be less effective for unrelated ("separable") dimensions. The study had users perform two interaction tasks with both a Polhemus and a mouse. One task involved setting three integral parameters (x, y location and size of a rectangle), while the other involved separable parameters (x, y location and color of a rectangle). The data collected suggested that matching the integrality/separability of the device to the task yields the best user performance. Neither the Polhemus or the mouse was uniformly superior; each device performed best when it was correctly mapped to "the perceptual structure of the task space". A version of this paper has appeared in the ACM TOCHI journal.
Janin, A. L., D. W. Mizell, et al. Calibration of head-mounted displays for augmented reality applications. Proc. VRAIS '93.
Johnson, W., L. Klotz, et al. (1993). Bridging the Paper and Electronic Worlds: The Paper User Interface. INTERCHI'93 Conference on Human Factors in Computing Systems: 507-512.
Jones, W. P. and S. Dumais (1986). The Spatial Metaphor for User Interfaces: Experimental Tests of Reference by Location versus Name. ACM Transactions on Office Information Systems 4(1): 42-63.
Kabbash, P. and W. Buxton The "Prince" Technique: Fitts' Law and Selection Using Area Cursors. Proceedings of ACM CHI'95 Conference on Human Factors in Computing Systems: 273-279.
Kabbash, P., W. Buxton, et al. (1994). . CHI'94 Conference on Human Factors in Computing Systems: 417-423.
Good empirical study of two-handed principles underlying "tool glass and magic lens" interfaces.
Kabbash, P., I. S. MacKenzie, et al. (1993). . INTERCHI'93 Conference on Human Factors in Computing Systems: 474-481.
Karat, J., J. E. McDonald, et al. (1986). A Comparison of Menu Selection Techniques: Touch Panel, Mouse and Keyboard. International Journal of Man-Machine Studies 25(1): 73-88.
Two studies were conducted to test user performance and attitudes for three types of selection devices used in computer systems. The techniques examined included on-screen direct pointing (touch panel), off-screen pointer manipulation (mouse), and typed identification (keyboard). Both experiments tested subjects on target selection practice tasks, and in typical computer applications using menu selection and keyboard typing. The first experiment examined the performance and preferences of 24 subjects. The second experiment used 48 subjects divided into two typing skill groups and into male-female categories. The studies showed performance advantages for on-screen touch panel entry. Preference ratings for the touch panel and keyboard devices depended on the type of task being performed, while the mouse was always the least preferred device. Differences between this result and those reporting an advantage of mouse selection are discussed.
Karl, L., M. Pettey, et al. (1992). Speech-Activated versus Mouse-Activated Commands for Word Processing Applications: An Empirical Evaluation. International Journal of Man-Machine Studies.
Empirical study of voice input in word processing applications. Task time was reduced by about 20% using speech. Most interesting finding: voice has a negative interaction with a simple memorization task. Also has a good summary of the problems with voice input:
* Need good feedback after a recognition (or failed recognition).
* Improvements in response times would decrease errors.
* A headset is too much bother to put on for short transactions and uncomfortable for lengthy use, while a desktop microphone is too unreliable.
Kaufman, A. Volume Visualization. IEEE Computer Society Press.
Kaufman, A. and R. Yagel (September 1989). Tools for Interaction in Three Dimensions. Proc. 3rd International Conference on HCI (Boston, MA) 1: 468-475.
This paper contains the most comprehensive description of the 3D user interface for Kaufman's CUBE workstation. Cube has viewing windows which employ a "combination look" for object rendering: drawings are superimposed on shaded images to capitalize on the advantages of each type of look. A separate window ("World Space") allows the user to specify the eye point, the direction of projection, the projection surface, the light sources (3), etc. The world view can be merged with the view window on sufficiently fast machines. A "full jack" or a jack with shadows on each wall is used to relate position information. The paper advocates having anchors in each object to help with positioning; this is mostly useful in geometric objects which have been created in the environment (to define volumes of interest or surgical implants). A gravity mechanism is used to assist motion during object picking and parameter specification.
Kaufman, A., R. Yagel, et al. (1990). Direct Interaction with a 3D Volumetric Environment. Computer Graphics 24(2): 33-34.
This paper briefly (2 pages) describes a 3D user interface for a volume visualization system, "edvol." The work space is presented as 3D rectilinear space in perspective view. A "jack" is used for visual feedback of the kite (Polhemus tracker) position (in 6 DoF). The keyboard is used for approval, mode change, menus, and vernier motion. The Glove/kite is used for orientation and pick/valuator input. The kite can also be used separately from the glove. Some glove gestures can be used as commands. A mouse is used for menu control and 3D input (as a triad mouse). The mouse and glove can be used in combination (the mouse rotates the world while the glove moves an object). User controlled gravity facilitates selection of a point or motion along surfaces / lines.
Kelso, J., D. Southard, et al. (1979). On the Coordination of Two-Handed Movements. Journal of Experimental Psychology: Human Perception and Performance 5(2): 229-238.
Studied the hands reaching out to separate targets in parallel. Nothing about coordinated movements, although it does show that the hands reach out in phase.
Kim, W. S., A. Liu, et al. (1988). A Helmet Mounted Display for Telerobotics. COMPCON Spring `88 (IEEE Computer Society). San Francisco, CA.
Kimura, D. (1973). Manual Activity During Speaking- I. Right-Handers and II. Left-Handers. Neuropsychologia 11: 45-55.
Kimura, D. and C. Humphrys (1981). A Comparison of Left- and Right-Arm Movements During Speaking. Neuropsychologia 19(6): 807-812.
Krieg, J. C. Accuracy, Resolution, Latency and Speed; Key Factors in Virtual Reality Tracking Environments: 11 pages.
available from Polhemus Navigation Sciences, Inc., P. O. Box 560, Colchester, VT 05446. (802) 655-3159.
Krueger, M. W. (1993). Environmental Technology: Making the Real World Virtual. Communications of the ACM 36(7): 36-37.
Krueger, M. W., T. Gionfriddo, et al. (April 1985). VIDEOPLACE--An Artificial Reality. Proc. of ACM CHI'85: 35-40.
One of the most compelling examples is using both hands to edit a B-spline curve: you can use index finger & thumb of each hand to simultaneously manipulate 4 control points at once. Even though the system is over 10 years old, in many ways it offered much richer interaction than present day technologies.
Laferriere, R., M. Keller, et al. Multi-User Virtual Environments: A Tutorial. .
Currently submitted to PRESENCE.
Lampson, B. W. (Jan. 1984). Hints for Computer System Design. IEEE Software.
Lasseter, J. (July 1987). Principles of Traditional Animation Applied to 3D Computer Animation. Computer Graphics (Proc. ACM SIGGRAPH '87) 21(4): 35-44.
Laur, D. and P. Hanrahan (1991). Hierarchical Splatting: A Progressive Refinement Algorithm for Volume Rendering. Computer Graphics (Proc. ACM SIGGRAPH '91) 25(4): 285-288.
Laurel, B. Interface Agents: Metaphors with Character: 355-365.
Leblanc, A., P. Kalra, et al. (1991). Sculpting with the "Ball and Mouse" Metaphor. Graphics Interface '91: 152-159.
Two-handed 3D interface based on orienting the object with a Spaceball in the left hand (rotations only) and grabbing it with the mouse.
Leganchuk, A., S. Zhai, et al. (1995). Bimanual direct manipulation in area sweeping tasks.
Manuscript in preparation. Available at http://www.dgp.toronto.edu/people/andrea/bimanual.html
Levoy, M. (February 1990). Volume Rendering by Adaptive Refinement. Visual Computer 6(1): 2-7.
Describes an adaptive refinement volume rendering algorithm, including all the math. Nicely states the principles of adaptive refinement: 1. Distribute work where it makes the most difference. 2. Form intermediate images from partial information. 3. Minimize the work discarded after the formation of each image.
Levoy, M. (May 1988). Display of Surfaces from Volume Data. IEEE Computer Graphics and Applications.
Liang, J. (Fall 1995). Interaction Techniques for Solid Modeling with a 3D Input Device.
Liang, J., M. Green, et al. (Aug. 1993). JDCAD: A Highly Interactive 3D Modeling System. : 217-222.
Describes a Polhemus-based CAD system. The user holds the Polhemus in front of the monitor and casts rays into the scene, rather than picking directly based on the position of the Polhemus. This provides a nice metaphor for working at increased scale -- the user can zoom in on an object to see detail; since everything is done relative to the image on the monitor, a hand motion in real space now results in a small-scale motion in virtual space. A lot of interesting ideas.
Liang, J., C. Shaw, et al. (1991). On Temporal-Spatial Realism in the Virtual Reality Environment. Proc. ACM SIGGRAPH Symposium on User Interface Software and Technology.
This paper discusses filters for improving the latency and jitter of the Polhemus Isotrak. The latency is due to delay in orientation data; jitter is due to noise in the position data. A predictive Kalman filter is used to compensate for the latency, and an anisotropic low-pass filter is used to reduce the position jitter. The results are only applicable to head-mounted trackers, as the dynamic range of arm/hand movements is too great for the filters described. Additional latency is also introduced by the filtering.
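Liang et al.'s predictive Kalman filter and anisotropic low-pass filter are more involved than can be shown here. As a hedged illustration of the jitter/lag trade-off the paper works around, a generic first-order exponential low-pass filter on one tracker axis might look like the following (this is a textbook smoother, not the paper's filter):

```python
def low_pass(samples, alpha=0.3):
    """First-order exponential smoothing of 1-D tracker samples.
    Smaller alpha suppresses more jitter but adds more lag -- the
    trade-off that motivates predictive filtering for latency."""
    out = []
    y = samples[0]  # seed with the first sample to avoid a start-up jump
    for x in samples:
        y = alpha * x + (1 - alpha) * y
        out.append(y)
    return out
```

Note how a step input reaches its target only gradually, which is exactly the added latency the paper's predictor compensates for.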
Limoges, S., C. Ware, et al. (1989). Displaying Correlations using Position, Motion, Point Size, or Point Colour. Proc. Graphics Interface '89: 262-265.
Liu, A., L. Stark, et al. (April 1991). Interaction of Visual Depth Cues and Viewing Parameters During Simulated Telemanipulation. 1991 IEEE International Conference on Robotics and Automation. Sacramento, CA: 2286-2291.
User study. Tests the effectiveness of head motion parallax, but the motion was not under user control: the view simply oscillated under machine control. "Our experimental results do not provide strong evidence that relative depth cues affected tasks that required absolute depth information. The object rotation cue did not enhance task performance because it only provided information about the object's three dimensionality. Pseudo-head motion parallax as we implemented it also did not enhance performance, but if implemented under operator control, it might prove to be a more effective cue. The frustum angle decreased completion time but had no effect on error."
Liu, A., G. Tharp, et al. (1992). Depth Cue Interaction in Telepresence and Simulated Telemanipulation . SPIE Conference on Human Vision, Visual Processing, and Digital Display. San Jose, CA.
MacKay, W., G. Velay, et al. (1993). Augmenting Reality: Adding Computational Dimensions to Paper. Communications of the ACM 36(7): 96-97.
Mackenzie, C. and T. Iberall (1994). The Grasping Hand. G. Stelmach and P. Vroon. Amsterdam, North Holland. Advances in Psychology 104.
The book is an analysis of how people use their hands.
MacKenzie, I. S. (1992). Fitts' law as a research and design tool in human-computer interaction. Human-Computer Interaction 7: 91-139.
MacKenzie, I. S. and W. Buxton (1992). Extending Fitts' law to two-dimensional tasks. Proc. CHI '92 Conference on Human Factors in Computing Systems: 219-226.
MacKenzie, I. S., A. Sellen, et al. (1991). A comparison of input devices in elemental pointing and dragging tasks. Proc. CHI '91 Conference on Human Factors in Computing Systems: 161-166.
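The Fitts' law work cited above (MacKenzie 1992) popularized the Shannon formulation MT = a + b log2(A/W + 1), where A is movement amplitude and W is target width. A minimal sketch follows; the intercept and slope values are illustrative placeholders, since real coefficients come from regression on measured data:

```python
import math

def index_of_difficulty(amplitude, width):
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(amplitude / width + 1)

def movement_time(amplitude, width, a=0.05, b=0.12):
    """Predicted movement time MT = a + b * ID. The coefficients a
    (seconds) and b (seconds/bit) here are made-up placeholders."""
    return a + b * index_of_difficulty(amplitude, width)
```

For example, a target 1 unit wide at a distance of 7 units has an index of difficulty of log2(8) = 3 bits.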
MacKenzie, I. S. and C. Ware (1993). Lag as a Determinant of Human Performance in Interactive Systems. INTERCHI'93 Conference on Human Factors in Computing Systems: 488-493.
Mackinlay, J., S. Card, et al. (1990). Rapid Controlled Movement Through a Virtual 3D Workspace. Computer Graphics 24(4): 171-176.
Magnenat-Thalmann, N. and D. Thalmann (Sept. 1991). Complex Models for Animating Synthetic Actors. IEEE Computer Graphics and Applications.
Mandler, J. M., D. Seegmiller, et al. (1977). On the Coding of Spatial Information. Memory and Cognition 5(1): 10-16.
Mapes, D. (1994). Two Handed Virtual Environment Interface, improving productivity for object manipulation and viewpoint movement in 3D worlds. Orlando, Computer Science Department, University of Central Florida.
The latest version of this interface was demonstrated at the MultiGen booth in SIGGRAPH'95, and it was very nice... Uses two (tracked) gloves and an HMD. The gloves are the touch-pad style gloves. There are just a few simple "gestures", e.g. tapping your left hand to bring up a palette of widgets.
Mapes, D. and P. Mlyniec 3D Object Manipulation Techniques: Immersive vs. Non-immersive Interfaces. .
To my knowledge, this manuscript hasn't been published anywhere.
Mapes, D. and J. M. Moshell Two handed interface for object manipulation in Virtual Environments. .
http://www.vsl.ist.ucf.edu/~polyshop/polyshop.html has some info about the current state of this project. This manuscript focuses more on registration issues (correct set-up of HMD parameters) than on the two-handed interaction techniques employed. The system uses a physical drafting table as a ground plane (and physical support) to allow constrained 2D manipulation.
Marteniuk, R. G., C. L. MacKenzie, et al. (1984). Bimanual movement control: Information Processing and interaction effects. . Quarterly Journal of Experimental Psychology 36A: 335-365.
Results question the Kelso hypothesis that bimanual movement to separate targets necessarily requires simultaneity of movement.
Massimino, M. J., T. B. Sheridan, et al. (1989). One Handed Tracking in Six Degrees of Freedom. IEEE Systems Man and Cybernetics.
Reports on user experiments for controlling 1, 3, and 6 degrees of freedom at a time in "pursuit" tracking tasks with a sensor ball (apparently identical to a Spaceball) as an input device. It also compares velocity control vs. acceleration control. In all cases (1, 3, 6 DoF), z translations were the most difficult to control, and velocity input yielded better control than acceleration input. The use of shadows as depth cues did not help z translations.
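The velocity-vs.-acceleration comparison in this study can be illustrated with the two standard control mappings. This is a generic one-axis sketch of rate and acceleration control, not the authors' experimental apparatus; all names and gains are assumptions:

```python
def velocity_control(x, deflection, gain, dt):
    """Rate (velocity) control: input deflection sets cursor velocity,
    so position is one integration away from the input."""
    return x + gain * deflection * dt

def acceleration_control(x, v, deflection, gain, dt):
    """Acceleration control: deflection sets acceleration, so position
    is two integrations away -- one reason it is harder to control."""
    v = v + gain * deflection * dt
    return x + v * dt, v
```

The extra integration in acceleration control means the cursor keeps moving after the input returns to zero, consistent with the finding that velocity input yielded better control.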
McGee, M. G. (1979). Human Spatial Abilities. .
McKenna, M. (1992). Interactive Viewpoint Control and Three-dimensional Operations. Proc. 1992 Symposium on Interactive 3D Graphics.
Describes a "fish tank VR" system which changes the image on a standard 2D monitor based on head position. This allows perspective and motion parallax (monocular depth cues) without a HMD. An extension of this technique also tracks the monitor, allowing additional freedom (e.g. translation/rotation of the monitor). A Polhemus sensor is used to track the head; a stereoscopic version is described but not implemented.
McKenna, M., S. Pieper, et al. Control of a Virtual Actor: the Roach. Computer Graphics 24(2): 165-174.
McLeod, P. (1977). A dual task response modality effect: Support for multiprocessor models of attention. Quarterly Journal of Experimental Psychology 29: 651-667.
Milgram, P., S. Zhai, et al. (Jul. 1993). Applications of Augmented Reality for Human-Robot Communication. Proc. IROS'93: IEEE/RSJ International Conf. on Intelligent Robots and Systems.
Minsky, M., M. Ouh-young, et al. (March 1990). Feeling and Seeing: Issues in Force Display. Computer Graphics 24(2): 235-244.
Haptic Displays: Great article: talks about the sandpaper project.
Monk, A. (1986). Mode Errors: A User-centered Analysis and some Preventative Measures Using Keying-contingent Sound. International Journal of Man-Machine Studies 24: 313-327.
Mosher, G., G. Sherouse, et al. (October 1986). The Virtual Simulator. Proc. 1986 ACM Workshop on Interactive 3D Graphics, Chapel Hill, NC: 37-42.
A computer model of a radiotherapy treatment planning machine and the resulting user interface is discussed. Part of the interface includes cutting plane selection. The radiotherapist uses a custom-built box which allows the user to scale the "hither" and "yon" clipping planes for the displayed objects. The controls are sideways-mounted pots which the user sees as wheels protruding from the box. The user just strokes each wheel in the desired direction to manipulate the cutting planes. Users found this preferable to turning labeled knobs.
Mountford, S. J. and W. Gaver. Talking and Listening to Computers: 319-334.
Nakatani, L. H. and J. A. Rohrlich (1983). Soft Machines: A Philosophy of User-Computer Interface Design. Proceedings of ACM CHI'83 Conference on Human Factors in Computing Systems: 19-23.
Machines and computer systems differ in many characteristics that have important consequences for the users. Machines are special-purpose, have forms suggestive of their functions, are operated with controls in obvious one-to-one correspondence with their actions, and the consequences of the actions on visible objects are immediately and readily apparent. By contrast, computer systems are general-purpose, have inscrutable form, are operated symbolically via a keyboard with no obvious correspondence between keys and actions, and typically operate on invisible objects with consequences that are not immediately or readily apparent. The characteristics possessed by machines, but typically absent in computer systems, aid learning, use and transfer among machines. But "hard," physical machines have limitations: they are inflexible, and their complexity can overwhelm us. We have built in our laboratory "soft machine" interfaces for computer systems to capitalize on the good characteristics of machines and overcome their limitations. A soft machine is implemented using the synergistic combination of real-time computer graphics to display "soft controls," and a touch screen to make soft controls operable like conventional hard controls.
Nardi, B. A., H. Schwarz, et al. (1993). Turning Away from Talking Heads: The Use of Video-as-Data in Neurosurgery. Proceedings of ACM INTERCHI'93 Conference on Human Factors in Computing Systems: 327-334.
Nelson, T. H. The Right Way to Think About Software Design. : 235-243.
Newell, A. Unified Theories of Cognition. Harvard University Press, Cambridge, MA, 1990.
Newman, W. and P. Wellner (1992). A Desk Supporting Computer-Based Interaction with Paper Documents. Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems: 587-592.
Before the advent of the personal workstation, office work practice revolved around the paper document. Today the electronic medium offers a number of advantages over paper, but it has not eradicated paper from the office. A growing problem for those who work primarily with paper is lack of direct access to the wide variety of interactive functions available on personal workstations. This paper describes a desk with a computer-controlled projector and camera above it. The result is a system that enables people to interact with ordinary paper documents in ways normally possible only with electronic documents on workstation screens. After discussing the motivation for this work, this paper describes the system and two sample applications that can benefit from this style of interaction: a desk calculator and a French to English translation system. We describe the design and implementation of the system, report on some user tests, and conclude with some general reflections on interacting with computers in this way.
Ney, D. R. and E. K. Fishman (November 1991). Editing Tools for 3D Medical Imaging. IEEE Computer Graphics and Applications.
"Designed like a paint and drawing program for 3D data sets, MPR Edit lets you interactively create shapes that define volumes of interest in images of medical data." Volumes of interest are commonly morphologically complex. Uses thresholded region-growing to select such regions. Also has a few geometric primitives. The working version deals only with 2D slices (transaxial, coronal, sagittal) of the volume. One must click in the slices with the mouse to select a 3D position.
Nielsen, J. Noncommand User Interfaces. Communications of the ACM 36(4): 83-99.
A good overview of what makes new interfaces *new*. Lots of good references.
Nielsen, J. (1993). Usability Engineering, Academic Press, Inc.
Nielsen, J. (March 1992). The Usability Engineering Life Cycle. IEEE Computer.
Nielson, G. M. and D. R. Olsen (October 1986). Direct Manipulation Techniques for Objects Using 2D Locator Devices. Proc. 1986 ACM Workshop on Interactive 3D Graphics, Chapel Hill, NC.
Discusses a mouse-based technique ("triad mouse") for directly manipulating the perspective projection of an object. The scheme doesn't work very well when two projected axes approach orthogonality.
Norman, D. (1990). The Design of Everyday Things.
Norman, D. A. (1981). Categorization of Action Slips. Psychological Review 88(1): 1-15.
Norman, D. A. (1990). Why Interfaces Don't Work. The Art of Human-Computer Interface Design. B. Laurel and S. Mountford. Reading, MA: 209-219.
Oldfield, R. C. (1971). The assessment and analysis of handedness: The Edinburgh inventory. Neuropsychologia 9: 97-113.
Describes a questionnaire for assessing handedness. The questionnaire gives a continuous scale of right-handedness.
Osborn, J. and A. Agogino (1992). An Interface for Interactive Spatial Reasoning and Visualization. Proc. ACM CHI'92 Human Factors in Computing Systems Conference, ACM SIGCHI.
This paper describes a mouse-based user interface for spatial reasoning and visualization. The interface includes the ability to orient an object and select arbitrary cutting planes; this portion of the interface is discussed in considerable detail. The basic interaction metaphor is that of manipulating the object in a "pool of water," the surface of which forms the cutting plane. The user rotates the model into the desired orientation, and then adjusts the depth of the "pool" to select the depth of the cut.
Ostby, E. (October 1986). Describing Free-Form 3D Surfaces for Animation. Proc. 1986 ACM Workshop on Interactive 3D Graphics, Chapel Hill, NC.
The author investigates several uses of the Polhemus tracker for specifying the surfaces of 3D objects. Uses included:
* Probe: sample the probe at a user signal; sample at many points in space to define an object. It was hard to locate a 3D point on a two-dimensional display; the author found that using the device in combination with a real object helped solve this problem.
* Pencil: draw lines in space, or trace a grid over the surface of an actual object, then use least-squares to fit a patch.
* Camera for viewing: works well and feels natural.
* Tool for deforming the surface of existing objects: uses relative motion to deform, which is easier to control.
Problems with the Polhemus tracker included:
* Lack of a tip switch -- need the equivalent of a mouse click.
* Drawing freehand in open space is hard: no friction to facilitate control.
* Locating points in space with only a 2D display.
Overveld, C. W. A. M. v. (December 1989). Application of a Perspective Cursor as a 3D Locator Device. Computer-Aided Design 21(10): 619-629.
Focuses on "3D geometric design" issues for 3D input. Basis: extend 2D drafting tool (T-square) to 3D (3 orthogonal rulers).
Pausch, R. (1994). Alice User's Guide. .
Pausch, R. (March 1994). Support for Rapid Prototyping of Two- and Three-Dimensional User Interfaces. Proposal for ARPA (Advanced Research Projects Agency) BAA 93-42, Human Computer Interaction, Computer Science Department, University of Virginia.
description of Alice
Pausch, R., T. Burnette, et al. Navigation and Locomotion in Virtual Worlds via Flight into Hand-Held Miniatures. Computer Graphics (SIGGRAPH '95).
Pausch, R., T. Crea, et al. (Summer 1992). A Literature Survey for Virtual Environments: Military Flight Simulator Visual Systems and Simulator Sickness. Presence 1(3).
Gives a quick overview of simulator research along with lots of references from military research which are very difficult to find in the academic literature. The references are annotated.
Pausch, R., M. A. Shackelford, et al. (Oct. 1993). A User Study Comparing Head-Mounted and Stationary Displays. Proc. IEEE Symposium on Research Frontiers in Virtual Reality. San Jose, CA.
Pausch, R., L. Vogtle, et al. (1992). One Dimensional Motion Tailoring for the Disabled: A User Study. Proc. ACM CHI'92 Conference on Human Factors in Computing Systems.
Pearson, G. and M. Weiser (1988). Exploratory Evaluation of a Planar Foot-Operated Cursor-Positioning Device. Proc. ACM CHI'88 Conference on Human Factors in Computing Systems: 13-18.
Perlin, K. and D. Fox (Aug. 1993). Pad: An Alternative Approach to the Computer Interface. Proc. ACM SIGGRAPH '93, Computer Graphics.
navigation through multiple levels of detail: System which allows the user to infinitely zoom in on a 2D screen.
Peters, M. (1985). Constraints in the performance of bimanual tasks and their expression in unskilled and skilled subjects. Quarterly Journal of Experimental Psychology 37A: 171-196.
"Right handers performed dual [tapping] tasks better when the preferred hand took the `figure` and when the nonpreferred hand took the `ground` of the dual movement. [...] Subjects showed marked interdependence of movements such that performance of one hand was a function of movements in the other hand."
Peters, M. (1981). Attentional asymmetries during concurrent bimanual performance. Quarterly Journal of Experimental Psychology 33A: 95-103.
"An asymmetry of attention was observed when subjects attempted to perform concurrent, relatively independent tasks with the two hands." The task was concurrent tapping; right handers do best when the left hand follows the metronome and the right taps as quickly as possible.
Petroski, H. (1989). The Pencil: A History of Design and Circumstance, Alfred A. Knopf, Inc.
Uses the pencil as a case study to look at the process of designing new artifacts in general.
Phillips, C., N. Badler, et al. Automatic Viewing Control for 3D Direct Manipulation. Computer Graphics (Proc. 1992 Symposium on Interactive 3D Graphics).
Phillips, C. B. and N. I. Badler (1988). Jack: A Toolkit for Manipulating Articulated Figures. Proc. ACM SIGGRAPH Symposium on User Interface Software Technology: 221-229.
Pieper, S., J. Rosen, et al. (1992). Interactive Graphics for Plastic Surgery: A Task-level Analysis and Implementation. Proc. 1992 Symposium on Interactive 3D Graphics.
"It is crucial that the user interface to the system not burden the physician with the implementation details of the computational model... the surgeon only deals directly with the problems associated with the task." Simulate what is actually done during surgery to provide an intuitive interaction (e. g. drawing on surface of skin). Allow surgeon interactive approach to the planning problem. Describes some algorithms for implementing gravity functions. The remainder of the paper discusses mesh generation for finite element analysis.
Pique, M. E. (October 1986). Semantics of Interactive Rotation. Proc. 1986 ACM Workshop on Interactive 3D Graphics. Chapel Hill, NC: 259-269.
Polhemus Navigation Sciences, Inc.
Isotrak, Fastrak input devices
Polhemus Navigation Sciences, Inc. and J. Kuipers (1977). Apparatus for Generating a Nutating Electromagnetic Field.
Poston, T. and L. Serra (May 1996). Dextrous Virtual Work. Communications of the ACM 39(5): 37-45.
"A system for visualizing and manipulating medical images is detailed, with emphasis on interaction techniques." Uses a mirrored setup (opaque mirror, not half-silvered) with stereoscopic display. The mirror is relatively near your face, leaving a large work volume for the hands behind the mirror. Not a major point of the paper, but the system employs two-handed interaction: rotation of a brain image with the left hand and fine manipulation with the right hand, using a physical tool handle that has multiple virtual effectors.
Poston, T., L. Serra, et al. Interactive Tube Finding on a Virtual Workbench: 119-123.
Poulton, E. C. and P. R. Freeman (July 1966). Unwanted asymmetrical transfer effects with balanced experimental designs. Psychological Bulletin 66(1).
Proffitt, D.
overview of Visual Frames-of-Reference research, and notes on the two separate visual systems.
Provins, K., A. Milner, et al. (1982). Asymmetry of Manual Preference and Performance. Perceptual and Motor Skills 54: 179-194.
Provins, K. A. and D. J. Glencross (1968). Handwriting, typewriting and handedness. . Quarterly Journal of Experimental Psychology 20: 282-289.
Studies how adeptly subjects use their left and right hands in handwriting and typewriting tasks. For trained typists, there was either no difference in performance between hands or an advantage for the left hand. Non-typists performed better with the right hand.
Provins, K. A., A. D. Milner, et al. (1982). Asymmetry of Manual Preference and Performance. Perceptual and Motor Skills 54: 179-194.
Discusses validity of handedness measures
Pruyn, P. W. and D. P. Greenberg (May 1993). Exploring 3D Computer Graphics in Cockpit Avionics. IEEE Computer Graphics & Applications.
Qualisys, Inc.
2D and 3D camera tracking (geared towards motion capture / gait analysis)
Quinn, J. T., H. N. Zelaznik, et al. (1980). Target-size influences on reaction time with movement time controlled. Journal of Motor Behavior 12: 239-261.
R., G., D. Proffitt, et al. (1994). A Hill Study: Using a Virtual Environment as a Perceptual Psychology Laboratory.
Raab, F. H., E. B. Blood, et al. (September 1979). Magnetic Position and Orientation Tracking System. IEEE Transactions on Aerospace and Electronic Systems AES-15(5): 709-717.
Describes the theoretical underpinnings of the Polhemus tracker. Also talks about application considerations, source/sensor imperfections, and the problems caused by nearby metallic structure. Rule of thumb for metal: "An object whose distance from the source is at least twice the distance separating the source and sensor produces a scattered field whose magnitude is 1 percent or less of the magnitude of the desired field."
Regian, J., W. Shebilske, et al. (1992). Virtual Reality: An Instructional Medium for Visual-Spatial Tasks. Journal of Communication 42(4): 136-149.
Reinhart, W. F. and C. J. C. Lloyd (1994). A Human Factors Simulation Tool for Stereoscopic Displays. Proceedings of the HFES.
Describes an apparatus which can simulate active-matrix liquid crystal displays, for rapid prototyping of head-mounted displays.
Robertson, G. G., S. K. Card, et al. (1989). The Cognitive Coprocessor Architecture for Interactive User Interfaces . Proc. ACM SIGGRAPH/SIGCHI 1989 Symposium on User Interface Software and Technology: 10-18.
Describes a software architecture appropriate for the real-time demands of 3D interactive applications, including animation. There are two problems:
* Multiple Agent Problem: the UI must match the "time constants" of human and computer. The architecture must "manage the interactions of multiple asynchronous agents that can interrupt and redirect each other's work."
* Animation Problem: interactive animation can shift the user's task from cognitive to perceptual, which frees cognitive capacity. Animation of motion allows a continuity of perception; discontinuous motion requires reassimilation of the new display.
The paper advocates a three-agent model: user, user discourse machine, and task machine. The cognitive coprocessor is a UI architecture which supports this model, plus "intelligent" agents and smooth animation. The animation loop (on the user discourse machine) is the basic control mechanism. It maintains a task queue (pending computations from agents), a display queue (pending instructions from agents for how the screen should be painted on the next animation loop cycle), and a governor (keeps track of time and allows adjustments to animations to keep them smooth). The paper goes on to describe a "3D Rooms" example based on this architecture.
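The animation loop described above — task queue, display queue, governor — can be caricatured in a few lines. The queue names follow the paper, but the scheduling policy here is invented purely for illustration:

```python
import time
from collections import deque

task_queue = deque()      # pending computations posted by agents
display_queue = deque()   # pending repaint instructions for the next cycle

def animation_loop(frame_budget=1 / 30, frames=3):
    """One illustrative cycle: run some agent tasks, repaint, then let
    the 'governor' absorb leftover time to keep frame pacing steady."""
    for _ in range(frames):
        start = time.monotonic()
        # Run queued agent tasks, but only until half the frame budget is spent.
        while task_queue and time.monotonic() - start < frame_budget * 0.5:
            task_queue.popleft()()
        # Repaint: flush all display instructions queued for this cycle.
        while display_queue:
            display_queue.popleft()()
        # Governor: sleep off any remaining frame time so motion stays smooth.
        remaining = frame_budget - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)
```

Agents would post work with `task_queue.append(some_callable)` and redraw requests with `display_queue.append(paint_callable)`; the real architecture's interruption and redirection machinery is, of course, far richer than this.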
Robinett, W. (Spring 1992). Synthetic Experience: A Proposed Taxonomy. Presence 1(2).
Rossignac, J., A. Megahed, et al. Interactive Inspection of Solids: Cross-sections and Interferences. Proc. ACM SIGGRAPH '92, Computer Graphics, 26 (2), July 1992: 353-360.
Rowberg, A. H. (June 1992). The Use of a Free-Space Mouse in Controlling a Medical Image Viewing Display. Proc. S/CAR'92: Symposium for Computer Assisted Radiology.
Sachs, E., A. Roberts, et al. (November 1991). 3-Draw: A Tool for Designing 3D Shapes. IEEE Computer Graphics and Applications.
This paper describes a system for "sketching" in three dimensions using a pair of Polhemus 3Space trackers. A palette is held in one hand. A stylus held in the other hand is moved relative to it, allowing the user to sketch curves in 3D. Thus the interaction is based on two-handed manipulation of tools or "props."
Schmandt, C. M. Spatial Input/Display Correspondence in a Stereoscopic Computer Graphic Work Station. Proc. ACM SIGGRAPH '83 17(3): 253-262.
This paper describes a work station designed to allow interaction with spatial correspondence between the input (Polhemus) and output (stereoscopic display) devices. The workspace consists of a monitor mounted at a 45 degree angle and a half-silvered mirror, beneath which the user holds the "wand." This set-up mixes the computer graphics and the user's hand into a single image. Pure binocular convergence was found to lack sufficient depth cues. A combination of convergence, obscurations, luminance, and size give a strong 3D sense, but no factor alone was adequate. Schmandt reports that a significant problem was lack of depth judgement. Occlusion cues were misleading, as the user could always see their hand through the semi-transparent graphics.
Schulman, A. I. (1973). Recognition Memory and the Recall of Spatial Location. Memory and Cognition 1(3): 256-260.
Sears, A., C. Plaisant, et al. (1992). A New Era for High Precision Touchscreens. Advances in Human-Computer Interaction. Hartson and Hix. 3: 1-33.
Good summary paper of touchscreen technologies and future directions; includes pointers to all the relevant research.
Sears, A. and B. Shneiderman (1991). High Precision Touchscreens: Design Strategies and Comparisons with a Mouse. International Journal of Man-Machine Studies 34(4): 593-613.
Three studies were conducted comparing speed of performance, error rates and user preference ratings for three selection devices. The devices tested were a touchscreen, a touchscreen with stabilization (stabilization software filters and smooths raw data from hardware), and a mouse. The task was the selection of rectangular targets 1, 4, 16 and 32 pixels per side (0.4 x 0.6, 1.7 x 2.2, 6.9 x 9.0, 13.8 x 17.9 mm respectively). Touchscreen users were able to point at single pixel targets, thereby countering widespread expectations of poor touchscreen resolution. The results show no difference in performance between the mouse and touchscreen for targets ranging from 32 to 4 pixels per side. In addition, stabilization significantly reduced the error rates for the touchscreen when selecting small targets. These results imply that touchscreens, when properly used, have attractive advantages in selecting targets as small as 4 pixels per size (approximately one-quarter of the size of a single character). A variant of Fitts' Law is proposed to predict touchscreen pointing times. Ideas for future research are also presented.
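For context, the standard Shannon form of Fitts' law that such variants build on can be computed directly. The coefficients below are illustrative placeholders, not the study's regression fits:

```python
import math

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds: MT = a + b * log2(D/W + 1).
    a and b are device-specific regression constants (placeholders here)."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# Smaller targets at the same distance take longer:
print(round(fitts_mt(100, 32), 3))  # 32-pixel target → 0.407
print(round(fitts_mt(100, 4), 3))   # 4-pixel target  → 0.805
```

A touchscreen-specific variant would refit (or restructure) this model against the touchscreen data; the point here is only the monotone cost of shrinking target width.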
Self, H. C. (May 1986). Optical Tolerances For Alignment and Image Differences For Binocular Helmet-Mounted Displays.
Sellen, A. J., G. P. Kurtenbach, et al. (1990). The Role of Visual and Kinesthetic Feedback in the Prevention of Mode Errors. Proc. IFIP INTERACT'90: Human-Computer Interaction: 667-673.
Studies the effectiveness of various types of feedback in the prevention of user errors. Keyboard vs. Foot Pedal for changing mode crossed with presence or absence of visual feedback. A "distractor task" was also running to allow the measurement of "resume times" and "service times." Mode errors: Experts made more errors than novices (experts good at recovery). Kinesthetic feedback helped everyone: "even though many of the expert subjects [...] were used to keeping track of the mode 'in their head', feedback still significantly reduced their mode errors."
Shaw, C. and M. Green Two-Handed Polygonal Surface Design. ACM UIST'94 Symp. on User Interface Software & Technology: 205-212.
Describes a system which uses two hand-held trackers (augmented with 3 buttons each) to perform CAD tasks. The dominant hand performs picking and manipulation, the non-dominant hand context setting.
Shaw, C., J. Liang, et al. (1992). The Decoupled Simulation Model for Virtual Reality Systems. Proc. ACM CHI'92 Conference on Human Factors in Computing Systems.
Describes the MR toolkit
Shepard, R. N. and J. Metzler (1971). Mental Rotation of Three-Dimensional Objects. Science 171.
Describes a classic experiment in mental rotation. The subject is presented with a pair of images, which are either (rotated) images of the same object, or images of objects which cannot be rotated into agreement. The experiment times how long it takes the subject to decide whether the objects are the same or different. Shepard and Metzler found that the time to make the decision was linear with the angle of rotation for "same" objects (the angle of rotation is undefined for "different" objects).
Sheridan, M. R. (1979). A Reappraisal of Fitts' Law. Journal of Motor Behavior, 11: 179-188.
Sheridan, M. R. (1973). Effects of S-R compatibility and task difficulty on unimanual movement time. Journal of Motor Behavior 5: 199-205.
Shoemake, K. Arcball Rotation Control. Graphics Gems IV. P. S. Heckbert, Academic Press, Inc.: 175-192.
Shoemake, K. (1992). ARCBALL: A User Interface for Specifying Three-Dimensional Orientation Using a Mouse. Graphics Interface: 151-156.
Describes a 2D interface for 3D orientation. The key is that mouse motion is consistently interpreted as a half-arc length rotation on an imaginary sphere, resulting in an interface free from hysteresis. A circle is drawn around the object being rotated; rotation about the axis perpendicular to the screen is handled by moving the mouse in the region outside of this circle. The paper also demonstrates how to add constrained rotations to the technique.
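The core of the mapping can be sketched as follows: project the mouse point onto a sphere, then form a quaternion from the start and end points of a drag. This follows the construction as commonly presented (function names are mine, and this is not Shoemake's code):

```python
import math

def to_sphere(x, y):
    """Project a mouse point in [-1, 1]^2 onto the unit arcball sphere."""
    d2 = x * x + y * y
    if d2 <= 1.0:
        return (x, y, math.sqrt(1.0 - d2))   # point lies on the sphere
    d = math.sqrt(d2)
    return (x / d, y / d, 0.0)               # outside: clamp to the edge circle

def drag_to_quaternion(p0, p1):
    """Quaternion (w, x, y, z) taking sphere point p0 to p1:
    w = p0 . p1, (x, y, z) = p0 x p1. The result depends only on the
    endpoints, not the drag path, which is why there is no hysteresis."""
    w = sum(a * b for a, b in zip(p0, p1))
    x = p0[1] * p1[2] - p0[2] * p1[1]
    y = p0[2] * p1[0] - p0[0] * p1[2]
    z = p0[0] * p1[1] - p0[1] * p1[0]
    return (w, x, y, z)

# A drag from the sphere's center point toward +x yields a rotation about y:
q = drag_to_quaternion(to_sphere(0.0, 0.0), to_sphere(0.5, 0.0))
```

Because w = cos(theta) for sphere points separated by angle theta, the resulting quaternion rotates by 2*theta — the arc the mouse traces is half the rotation applied, matching the annotation's description.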
Shoemake, K. (1992). Matrix Animation and Polar Decomposition. Graphics Interface '92: 258-264.
Shoemake, K. (July 1985). Animating Rotations with Quaternion Curves. Computer Graphics 19(3): 245-254.
A very good reference for learning about quaternions.
Shoemake, K. (May 1994). Quaternions.
Tutorial material on quaternions, including C code for implementing common quaternion operations.
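The basic operations such code needs — quaternion multiplication and rotating a vector — can be written in a few lines. A minimal sketch of my own, not Shoemake's C code:

```python
import math

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw * bw - ax * bx - ay * by - az * bz,
            aw * bx + ax * bw + ay * bz - az * by,
            aw * by - ax * bz + ay * bw + az * bx,
            aw * bz + ax * by - ay * bx + az * bw)

def qrotate(q, v):
    """Rotate 3-vector v by unit quaternion q via q * (0, v) * conj(q)."""
    qc = (q[0], -q[1], -q[2], -q[3])
    _, x, y, z = qmul(qmul(q, (0.0, *v)), qc)
    return (x, y, z)

# A 90-degree rotation about z maps the x axis onto the y axis:
s = math.sin(math.pi / 4)
q90z = (math.cos(math.pi / 4), 0.0, 0.0, s)
print(qrotate(q90z, (1.0, 0.0, 0.0)))  # ≈ (0.0, 1.0, 0.0)
```

Note the (w, x, y, z) component order is a convention; some libraries put the scalar last.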
Smith, T. J. and K. U. Smith (1987). Feedback-Control Mechanisms of Human Behavior. Handbook of Human Factors. G. Salvendy. New York, John Wiley & Sons: 251-293.
Snibbe, S., K. Herndon, et al. (July 1992). Using Deformations to Explore 3D Widget Design. Computer Graphics (Proc. ACM SIGGRAPH '92) 26(2): 351-352.
So, R. H. Y. and M. J. Griffin Effects of Time Delays on Head Tracking Performance and the Benefits of Lag Compensation by Image Deflection.
Spaceball Technologies, Inc.
Spectra Symbol, Inc.
Membrane switch technology, membrane & mechanical slide potentiometers, x-y touchpads, custom designs
Stassen, H. G. and G. J. F. Smets (June 27-29, 1995). Telemanipulation and Telepresence. 6th IFAC/IFIP/IFORS/IEA Symposium on Analysis, Design, and Evaluation of Man-machine Systems: 13-23.
A survey of 3D manipulation and perception work from the teleoperation point of view. Touches on: 3D perception (Softenon children: without manipulation, children don't develop 3D perception), theories of perception, television (adapting to teleoperation tasks), telemanipulation: handedness, field of view & depth, time delay, implementation. Interesting work in handedness has been done in the rehabilitation field, esp. in design of arm prostheses.
CrystalEyes stereoscopic glasses
Stoakley, R., M. Conway, et al. Virtual Reality on a WIM: Interactive Worlds in Miniature. CHI'95: 265-272.
Describes the Worlds in Miniature interface metaphor. Augments an immersive head tracked display with a hand held miniature copy of the virtual environment; there is a 1:1 relationship between life-size objects in the virtual world and miniature objects on the hand-held miniature world.
Strauss, P. S. and R. Carey (July 1992). An Object-Oriented 3D Graphics Toolkit. Computer Graphics (Proc. ACM SIGGRAPH '92) 26(2): 341-349.
Sturman, D., D. Zeltzer, et al. (1989). Hands-On Interaction with Virtual Environments. Proc. ACM Symposium on User Interface Software and Technology: 19-24.
Suetens, P., D. Vandermeulen, et al. (January 1988). A 3-D Display System with Stereoscopic, Movement Parallax and Real-time Rotation Capabilities. Proc. SPIE-Medical Imaging II: Image Data Management and Display. Newport Beach, CA. 914: 855-861.
Reports a technique for displaying dynamic images on a 2D monitor by tracking the user's head. Many details of the implementation are not given. The authors also suggest adding hand-guided rotations to allow the user to move completely around objects.
Surles, M. Interactive Modeling Enhanced with Constraints and Physics with Applications in Molecular Modeling . Computer Graphics (Proc. 1992 Symposium on Interactive 3D Graphics).
Sutherland, I. E. (1965). The Ultimate Display. Proc. IFIP Congress. 65: 505-508.
Sutherland, I. E. (1968). A Head-mounted Three Dimensional Display. Proc. the Fall Joint Computer Conference: 757-764.
The seminal VR paper -- discusses technical details of Sutherland's original see-through display system
Sutherland, I. E. (1974). Three-Dimensional Data Input by Tablet. Proceedings of the IEEE. 62: 453-471.
Takemura, H., A. Tomono, et al. (June 1988). An Evaluation of 3-D Object Pointing Using a Field Sequential Stereoscopic Display . Proc. Graphics Interface '88 (Edmonton, Alberta): 112-118.
Describes user experiments (six subjects) to measure performance in 3-D object pointing with stereoscopy.
Tani, M., K. Yamaashi, et al. (1992). Object-Oriented Video: Interaction with Real-World Objects through Live Video. Proceedings of ACM CHI'92 Conference on Human Factors in Computing Systems: 593-598.
Graphics and live video are widely employed in remotely-controlled systems like industrial plants. Interaction with live video is, however, more limited compared with graphics as users cannot interact with objects being observed in the former. Object-Oriented Video techniques are described allowing object-oriented interactions, including the use of real-world objects in live video as reference cues, direct manipulation of them, and graphic overlays based on them, which enable users to work in a real spatial context conveyed by the video. Users thereby understand intuitively what they are operating and see the result of their operation.
Tarlton, M. A. and P. N. Tarlton (March, 1992). A Framework for Dynamic Visual Applications. 1992 Symposium on Interactive 3D Graphics: 161-164.
describes the Mirage system, a precursor to Inventor
Taylor, R. M. I., W. Robinett, et al. The Nanomanipulator: A Virtual-Reality Interface for a Scanning Tunneling Microscope. Computer Graphics (SIGGRAPH 93 Proceedings): 127-134.
Tharp, G., A. Liu, et al. (1992). Timing Considerations of Helmet Mounted Display Performance. SPIE Conference on Human Vision, Visual Processing, and Digital Display . San Jose, CA.
Thorton, R. (1979). The number wheel: a tablet-based valuator for interactive three-dimensional positioning. Computer Graphics 13(2): 102-107.
Todor, J. I. and T. Doane (1978). Handedness and hemispheric asymmetry in the control of movements. Journal of Motor Behavior 10: 295-300.
Shows a left-hand advantage for a Fitts tapping task with large amplitude and target width.
Tognazzini, B. (1993). Principles, Techniques, and Ethics of Stage Magic and Their Application to Human Interface Design . Proceedings of ACM INTERCHI'93 Conference on Human Factors in Computing Systems: 355-361.
Torborg, J. and J. Kajiya Talisman: Commodity Realtime 3D Graphics for the PC. Computer Graphics (Proc. SIGGRAPH'96).
Touch, C. .
Tufte, E. R. The Visual Display of Quantitative Information.
Tufte, E. R. Envisioning Information.
Turner, R., F. Balaguer, et al. (1991). Physically-based interactive camera motion using 3d input devices. Scientific Visualization of Physical Phenomena: Proceedings of CG International Tokyo .
van Rossum, G. (1993). An Introduction to Python for UNIX/C Programmers. .
I'm not sure where this has been published.
Vannier, M. W., J. L. March, et al. (July 1983). Three-dimensional Computer Graphics for Craniofacial Surgical Planning and Evaluation. Computer Graphics 17(3): 263-273.
Describes system for 3D reconstruction of CT slices (automatic edge detection in slices). System "improved quantitative post-operative evaluation in more than 200 cases." The system stresses efficient time and storage, so that the software can be added to virtually any CT scanner. Uses a very simple (but fast) segmentation algorithm. Uses CAD-like package to plan surgical procedures; linear/volumetric measurements can be derived.
Venolia, D. (1993). Facile 3D Direct Manipulation. INTERCHI'93 Conference on Human Factors in Computing Systems: 31-36.
Verplank, B. Manipulation in Virtual Environments. .
Position Statement for CHI'96 Workshop on Manipulation in Virtual Environments. Talks about some projects at Interval Research in haptic feedback (using Phantom arms from SensAble Devices).
Visage, Inc.
Touchmate : senses touch through a force plate. Can sense touch on monitors, or any other object that you can manage to mount to the force plate. For monitors, allows touch sensitivity on the physical bezel of the monitor, and not just on the screen.
von Wright, J., P. Gebhard, et al. (1975). A Developmental Study of the Recall of Spatial Location. Journal of Experimental Child Psychology 20: 181-190.
Wang, C., L. Koved, et al. Design for Interactive Performance in a Virtual Laboratory. Computer Graphics 24(2): 39-40.
Describes IBM's VR research (the "Rubber Rocks" system). "With current state of technology, the glove and tracking devices can generate much more data than the graphics update process can utilize." Basic problem is to match rate of incoming data with update rate of graphics. Attempts to predict the future position of the hand. "During rapid acceleration or deceleration, the predicted position tends to overshoot the actual position and converges in two to three update cycles."
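The simplest form of such prediction is constant-velocity extrapolation from the last two samples; the overshoot-then-converge behavior quoted above falls out of it directly. A sketch of the general idea, not IBM's code:

```python
def predict(samples, lookahead=1.0):
    """Extrapolate the next position from the two most recent (time, pos)
    samples, assuming constant velocity over `lookahead` update intervals."""
    (t0, p0), (t1, p1) = samples[-2], samples[-1]
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * lookahead * (t1 - t0)

# A hand moving steadily, then stopping abruptly at position 20:
track = [(0, 0.0), (1, 10.0), (2, 20.0), (3, 20.0)]
print(predict(track[:3]))  # predicts 30.0, but the hand stops at 20.0 (overshoot)
print(predict(track))      # one sample later the prediction converges → 20.0
```

Real trackers typically use higher-order or Kalman-style predictors, but the overshoot during rapid deceleration is inherent to any extrapolation scheme.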
Ware, C. (1990). Using Hand Position for Virtual Object Placement. Visual Computer 6 (5): 245-253.
This paper describes two experiments which investigate the use of six degree of freedom digitizers (Polhemus) to manipulate 3D virtual environments. Specifically, the experiments test the speed and accuracy of placing an object in space with the correct orientation. Motions always had a total magnitude of 9.5 cm. Four subjects participated in the study. In the first experiment subjects were told to position the object (both position and orientation) as accurately as possible. Four conditions were tested: z translation enabled, z disabled, stereo, no stereo. Enabling z translations slowed accurate placement: 25% with stereo, 53% without stereo. Overall, the placement times with stereopsis were 39% faster. In the second experiment, subjects were told to make the placement as quickly as possible. Times to position, orient, or (simultaneously) position and orient were tested. Ware found that subjects were able to make effective use of all six degrees of freedom (that is, time for simultaneous positioning & orientation was less than the time for separate positioning and orientation). Disabling the z translations hindered rapid placement. Stereopsis still helped. Subjects did not report fatigue with the Polhemus. Ware states this is because it was used as a relative positioning device.
Ware, C., K. Arthur, et al. (1993). Fish Tank Virtual Reality. Proceedings of ACM INTERCHI'93 Conference on Human Factors in Computing Systems: 37-41.
Ware, C. and C. Baxter (1989). Bat Brushes: on the Uses of Six Position and Orientation Parameters in a Paint Program. Proc. ACM CHI'89 Conference on Human Factors in Computing Systems: 155-160.
Ware, C. and D. R. Jessome (November 1988). Using the Bat: A Six-Dimensional Mouse for Object Placement. IEEE Computer Graphics and Applications.
Reports on experiments with a Polhemus tracker. Summary of some interesting points: It is essentially impossible to achieve precise positioning using a 1:1 control ratio when the arm/hand is unsupported. Rotations of the Polhemus produce inadvertent translations. Interaction techniques which require the user to precisely control both sets of parameters simultaneously are "generally confusing." Uses "ratcheting" for large translations or rotations: a button on the bat acts as a clutch allowing or disallowing movement.
Ware, C. and S. Osborne Exploration and Virtual Camera Control in Virtual Three Dimensional Environments. Proc. 1990 Symposium on Interactive 3D Graphics 24(2): 175-183.
Discusses basic interaction paradigms for 3D data. Metaphors: eyeball in hand, scene in hand, flying vehicle control. Studies where each is appropriate.
Ware, C. and L. Slipp (Sep. 1991). Exploring virtual environments using velocity control: A comparison of three devices. Proc. Human Factors Society (HFS), 35th Ann. Mtg. (San Francisco).
Compares Polhemus, Spaceball, and mouse-based interfaces. Spaceball yielded worst performance. Some users complained of fatigue after prolonged use of the Polhemus, but it still yielded the best results.
Weimer, D. and S. K. Ganapathy (1989). A Synthetic Visual Environment with Hand Gesturing and Voice Input. Proc. ACM CHI'89 Conference on Human Factors in Computing Systems.
Talks about glove + voice input. Their focus is on development of synthetic environment interaction techniques, as a vehicle for experimenting with more natural 3D interfaces. A table top is used as a workspace, giving a place to rest the hands, and also providing a sort of "natural" tactile feedback when "buttons" are pressed on a menu in the synthetic space. A standard monitor is used for display. Speech input was added to the interface for three reasons: (1) people tend to use gestures to augment speech, (2) spoken vocabulary has a more standard interpretation than gestures, (3) hand gesturing and speech complement one another. Voice is used for navigating through commands, while hand gestures provide "shape" information. "There was a dramatic improvement in the interface after speech recognition was added." A thumb gesture is used as a clutching mechanism to avoid uncomfortable hand positions. The driving application is a 3D modeling system for free-form surfaces.
Weiser, M. (Sept. 1991). The Computer for the 21st Century. Scientific American 265(3): 94-104.
Wellner, P. (1991). The DigitalDesk Calculator: Tangible Manipulation on a Desk Top Display. Proc. ACM SIGGRAPH Symposium on User Interface Software and Technology: 107-115.
Wellner, P. (1993). Interacting with Paper on the DigitalDesk. Communications of the ACM 36(7): 87-97.
An overhead projector projects paper onto the desk surface; a camera system tracks where your hand is positioned. A good paper.
Wells, M. J. and M. J. Griffin (March/April 1987). A Review and Investigation of Aiming and Tracking Performance with Head-Mounted Sights. IEEE Transactions on Systems, Man, and Cybernetics SMC-17(2): 210-221.
Wing, A. M. (1982). Timing and co-ordination of repetitive bimanual movements. Quarterly Journal of Experimental Psychology 34A: 339-348.
Looks at the timing mechanism subjects use to make repetitive bimanual movements of unequal difficulty.
Wolff, P. H., I. Hurwitz, et al. (1977). Serial organization of motor skills in left- and right-handed adults. Neuropsychologia 15: 539-546.
Studies a finger-tapping task for both hands. Rhythmic patterns were tapped with greater precision by the right hand than by the left.
Yamaashi, K., J. Cooperstock, et al. Beating the Limitations of Camera-Monitor Mediated Telepresence with Extra Eyes. Proc. ACM CHI'96 Conference on Human Factors in Computing Systems.
Yoo, T. S., U. Neumann, et al. (July 1992). Direct Visualization of Volume Data. IEEE Computer Graphics and Applications.
This paper describes an interactive environment for volume visualization. The authors claim that existing diagnostic paradigms require some manipulation of cutting plane surfaces. The paper mostly discusses software architectures for taking advantage of Pixel-Planes 5, and the various evolutions of the volume rendering software for that machine. It also discusses syntactic and semantic (user-guided) data segmentation.
Zeleznik, R., K. Herndon, et al. SKETCH: An Interface for Sketching 3D Scenes. Proc. SIGGRAPH'96: 163-170.
Zeleznik, R. C., D. B. Conner, et al. (1991). An Object-Oriented Framework for the Integration of Interactive Animation Techniques. Computer Graphics (Proc. ACM SIGGRAPH '91) 25(4): 105-111.
Zeltzer, D., S. Pieper, et al. (June 1989). An Integrated Graphical Simulation Platform. Proc. Graphics Interface '89: 266-274.
Zhai, S. (1993). Investigation of Feel for 6DOF Inputs: Isometric and Elastic Rate Control for Manipulation in 3D Environments. Proceedings of the Human Factors and Ergonomics Society 37th Annual Meeting.
Zhai, S. (1995). Human Performance in Six Degree of Freedom Input Control. Ph.D. thesis, University of Toronto.
Zhai, S. (Sep 1993). Human Performance Evaluation of Manipulation Schemes in Virtual Environments. Proc. IEEE Virtual Reality Annual International Symposium (VRAIS).
Overview of Zhai's taxonomy and his experimental results.
Zhai, S., W. Buxton, et al. The Partial Occlusion Effect: Utilizing Semi-transparency for Human Computer Interaction. ACM Transactions on Computer-Human Interaction (to appear).
Currently available at http://vered.rose.utoronto.ca/people/shumin_dir/SILK/silk.html
Zhai, S., W. Buxton, et al. (1994). The "Silk Cursor": Investigating Transparency for 3D Target Acquisition. Proceedings of ACM CHI'94 Conference on Human Factors in Computing Systems. 1: 459-464.
This study investigates dynamic 3D target acquisition, focusing on the relative effect of specific perceptual cues. A novel technique is introduced, along with an experiment evaluating its effectiveness. There are two aspects to the new technique: first, in contrast to normal practice, the tracking symbol is a volume rather than a point; second, the surface of this volume is semi-transparent, thereby affording occlusion cues during target acquisition. The experiment shows that the volume/occlusion cues were effective in both monocular and stereoscopic conditions. For tasks where stereoscopic presentation is unavailable or infeasible, the new technique offers an effective alternative.
Zhai, S. and P. Milgram Input Techniques for HCI in 3D Environments. ACM CHI'94 (Poster Session): 2 pp.
Zhai, S. and P. Milgram (1993). Human Performance Evaluation of Isometric and Elastic Rate Controllers in a 6DoF Tracking Task. Proc. SPIE Telemanipulator Technology. SPIE vol. 2057.
Zhai, S., P. Milgram, et al. The Effects of Using Fine Muscle Groups in Multiple Degree-of-Freedom Input. Proc. ACM CHI'96 Conference on Human Factors in Computing Systems: 308-315.
Zimmerman, T., J. Lanier, et al. A Hand Gesture Interface Device. Proc. ACM CHI+GI'87 Conference on Human Factors in Computing Systems and Graphics Interface: 189-192.
Zimmerman, T., J. R. Smith, et al. Applying Electric Field Sensing to Human-Computer Interfaces. Proc. ACM CHI'95 Conference on Human Factors in Computing Systems: 280-287.