From: owner-3dui@hitl.washington.edu on behalf of Jeff Pierce [jpierce@cs.cmu.edu]
Sent: Wednesday, February 14, 2001 12:59 PM
To: 3d-ui@hitl.washington.edu
Subject: RE: input and output hardware, Result: Holographic teleconferencing

Yup, UNC is working on this as part of their Office of the Future initiative (Henry Fuchs is the lead on this). UNC is part of the NTI, which is why you saw their work in that context. Takeo Kanade has been working on a similar system at CMU; his Virtualized Reality project was the basis of CBS's implementation of EyeVision for the Super Bowl.

Jeff

At 09:46 AM 2/14/01, Matthew Pilgrim wrote:

I attended a lecture at University College London (UCL) where the speaker mentioned just such an application. Apparently a big problem is flat, featureless surfaces; this can be overcome by using a mixture of ambient and imperceptible structured light, which allows the cameras to extract the data. The project described was part of the National Teleimmersion Initiative (see http://www.advanced.org/teleimmersion.html) and was designed to adequately test the capacity of Internet2 (see http://www.internet2.edu/).

I hope this is useful.

Regards,
Matthew Pilgrim
Arup Research & Development

-----Original Message-----
From: shogunx [mailto:shogunx@operamail.com]
Sent: 14 February 2001 08:08
To: Joseph LaViola
Cc: 3d-ui@hitl.washington.edu
Subject: Re: input and output hardware

Hello,

I haven't yet had the time or ready resources to fully develop this idea, but I'll throw it out anyway:

Input devices: an array of synchronized cameras/infrared sensors linked to camera locations in a 3D scene, providing real-time user input in the form of a 3D mesh, which could then be procedurally textured from the camera input bitmaps. Drop the polys a bit and it's suitable for network transport (a rough sketch of that step is appended at the end of this message). Output could be achieved via voxel technology.

Result: holographic teleconferencing... as a sample app.

Enjoy,
Scott

On Tue, 13 Feb 2001, Joseph LaViola wrote:

> Hi there.
>
> I am going to be giving a course with Doug Bowman, Ivan Poupyrev, and
> Mark Mine at Siggraph 2001. I am going to be reviewing I/O hardware
> for 3D user interfaces. I wanted to get a feel for what people on this
> list are using in their work. Are any of you using any new or interesting
> input or output devices? Do you know of anyone who has built any new
> input and/or output devices?
>
> If so, send me a quick note with any information (papers, URLs, etc.).
>
> Thanks,
>
> Joe

--
Time and space dripped dense together as crystalline visions coalesced into matter before me, folding lucid blue, shining, then inverted sleekly into a polymer hallucination of totality. Soft faces rose from the surface of pooled reality, it now only a minute portion of perception, a slim sliver of selective imagery, to speak great questions of their existence.
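
[Appended sketch] Below is a minimal, illustrative Python sketch of the "drop the polys a bit and send it" step Scott describes: take a dense triangle mesh reconstructed from the camera array, reduce its polygon count with a crude vertex-clustering decimation, and pack the result into a compact binary blob for network transport. The function names (decimate_by_clustering, pack_mesh) and the cell_size parameter are hypothetical illustrations, not part of any system mentioned in this thread.

# Illustrative sketch (hypothetical names): decimate a reconstructed triangle
# mesh by vertex clustering, then pack it into a binary blob for transport.

import struct
from collections import defaultdict


def decimate_by_clustering(vertices, triangles, cell_size=0.05):
    """Collapse vertices that fall into the same grid cell to a single
    representative (the cell centroid), then drop triangles that degenerate.
    A crude stand-in for 'dropping the polys a bit' before transmission."""
    cell_of = {}                 # vertex index -> grid cell key
    cluster = defaultdict(list)  # grid cell key -> member vertex indices
    for i, (x, y, z) in enumerate(vertices):
        key = (int(x // cell_size), int(y // cell_size), int(z // cell_size))
        cell_of[i] = key
        cluster[key].append(i)

    new_index = {}     # grid cell key -> index in the decimated vertex list
    new_vertices = []
    for key, members in cluster.items():
        cx = sum(vertices[m][0] for m in members) / len(members)
        cy = sum(vertices[m][1] for m in members) / len(members)
        cz = sum(vertices[m][2] for m in members) / len(members)
        new_index[key] = len(new_vertices)
        new_vertices.append((cx, cy, cz))

    new_triangles = []
    for a, b, c in triangles:
        ra, rb, rc = new_index[cell_of[a]], new_index[cell_of[b]], new_index[cell_of[c]]
        if len({ra, rb, rc}) == 3:   # keep only non-degenerate triangles
            new_triangles.append((ra, rb, rc))
    return new_vertices, new_triangles


def pack_mesh(vertices, triangles):
    """Flatten the mesh into a compact little-endian blob for the wire."""
    header = struct.pack("<II", len(vertices), len(triangles))
    vdata = b"".join(struct.pack("<fff", *v) for v in vertices)
    tdata = b"".join(struct.pack("<III", *t) for t in triangles)
    return header + vdata + tdata


if __name__ == "__main__":
    # Toy "reconstructed" patch: a unit quad split into two triangles.
    verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
    tris = [(0, 1, 2), (0, 2, 3)]
    v2, t2 = decimate_by_clustering(verts, tris, cell_size=0.5)
    blob = pack_mesh(v2, t2)
    print(len(v2), "vertices,", len(t2), "triangles,", len(blob), "bytes")

Running the toy example just prints the vertex/triangle counts and the payload size; a real pipeline would presumably use a proper decimation scheme (e.g. edge collapse) and stream per-frame updates rather than one-off blobs.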