TELEPERCEPTION - technology
yuri(RR): harald mayer, gerd trautner, alexander barth



The information on this site was created in a non-commercial project and is for private, non-commercial use only. For commercial use ask: yuri@yuri.at

technical help: Gerd.Trautner@mond.at


Visual section:
There are different ways to simulate stereoscopy for the human eye. In all cases, images of a scene are created with two cameras positioned at eye distance and viewing angle, and these are combined by some technique into one signal or one image.

The i-glasses manufactured by the company virtual-io operate in "interlaced mode", i.e. they split the composite PAL (or NTSC) signal into even and odd fields, where one field is always shown to the left eye and the other field to the right eye, each on its own LCD.
In the visual section, the following steps have to be implemented:
- position two CCD cameras at eye distance and at a selected angle
- mix the output signals of the cameras into one interlaced video signal
- transmit this signal

From this we arrived at the following problems:
- if you want to mix two composite signals field by field into one, the signals must be absolutely synchronous: when one camera has finished a field, the second camera must have finished its field, too.
- flicker-free switching between the individual signals requires short switching times
- it would be an advantage to know which field is being sent at any moment; otherwise the field of the right camera could, for example, end up being shown to the left eye.

Implementation of the visual section:
Synchronising two cameras can essentially be done in two ways:
- buy expensive, heavy camcorders that have a synchronisation input
- set up a creative, individual and cheap solution and synchronise the cameras over their own internal oscillators. This also allows the use of light CCD cameras. Our idea is to deactivate the oscillator of one camera and to feed it with the oscillator frequency of the other camera. If the cameras are identically constructed, one can assume that the two cameras record the same field absolutely synchronously. With a sync separator (we chose the LM1881N from National Semiconductors) one obtains the vertical synchronisation signal (i.e. the field synchronisation signal) and the odd/even signal from a composite signal. Then one switches between the two signals via a video switch (TEA2014 from Thomson) to mix the two composite signals into one.
Circuit as JPG

The radio transmission of the video data takes place in arrangement with Time'sUp within the framework of Closing the loop.


Audio section:


Tracker data acquisition:
The tracker of the i-glasses operates relative to the earth's magnetic field and supplies the three absolute angles between the tracker coordinate system and that of the earth's magnetic field. The transfer is serial, and the angles can alternatively be transferred either as ASCII or as binary data.
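As an illustration, here is a minimal sketch of how such an ASCII reading could be polled with the Java Communication API (javax.comm); the port name, baud rate and the whitespace-separated angle format are assumptions, not the documented tracker protocol.

import javax.comm.CommPortIdentifier;
import javax.comm.SerialPort;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.StringTokenizer;

// Sketch: poll the i-glasses tracker over the serial port (ASCII mode assumed).
public class TrackerReader {
    public static void main(String[] args) throws Exception {
        // Port name and baud rate are assumptions - adjust to the actual setup.
        CommPortIdentifier id = CommPortIdentifier.getPortIdentifier("COM1");
        SerialPort port = (SerialPort) id.open("TrackerReader", 2000);
        port.setSerialPortParams(9600, SerialPort.DATABITS_8,
                SerialPort.STOPBITS_1, SerialPort.PARITY_NONE);

        BufferedReader in = new BufferedReader(
                new InputStreamReader(port.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            // Assumed format: three whitespace-separated angles (yaw pitch roll).
            StringTokenizer t = new StringTokenizer(line);
            if (t.countTokens() < 3) continue;
            float yaw   = Float.valueOf(t.nextToken()).floatValue();
            float pitch = Float.valueOf(t.nextToken()).floatValue();
            float roll  = Float.valueOf(t.nextToken()).floatValue();
            System.out.println(yaw + " " + pitch + " " + roll);
        }
    }
}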
For us, visualising the armchair in VRML and integrating the tracker data via Java seemed quite suitable and visually appealing. After working our way into the subjects of Java and the serial interface as well as Java and VRML, it was clear to us that we had entered absolute research territory, because there are only alpha versions of the classes and only limited functionality in the serial interface implementation in Java (javax.*).
The implementation now looks as follows:
- with the help of the javax classes (Communication API), the stand-alone program polls the tracker data and sends it via UDP to a UDP server, which passes the data on to a client applet, which processes it and passes the data on to the VRML model. The connection goes via UDP because the serial interface does not initialise in Netscape Communicator with JDK 1.1.5 alpha (necessary for the Communication API), while the interface to VRML only works through a Java applet. A minimal sketch of this relay follows the source links below.

SOURCE of the stand-alone program
SOURCE of the UDP server
SOURCE of the UDP server Thread
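
A minimal sketch of the UDP relay described above, assuming each tracker reading is sent as one ASCII line per datagram and that the server simply forwards every packet unchanged to the applet's host; the host names and the port numbers 5000/5001 are made up for illustration.

import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

// Sketch of the UDP relay: the stand-alone program sends each tracker reading
// as one datagram; the server forwards it unchanged to the applet's host.
// Host names and the port numbers 5000/5001 are assumptions.
public class UDPRelay {

    // Sender side (called from the stand-alone tracker program).
    static void send(DatagramSocket socket, String angles) throws Exception {
        byte[] data = angles.getBytes();
        DatagramPacket packet = new DatagramPacket(data, data.length,
                InetAddress.getByName("localhost"), 5000);
        socket.send(packet);
    }

    // Server side: receive from the tracker program, forward to the applet.
    public static void main(String[] args) throws Exception {
        DatagramSocket server = new DatagramSocket(5000);
        byte[] buffer = new byte[256];
        while (true) {
            DatagramPacket in = new DatagramPacket(buffer, buffer.length);
            server.receive(in);
            DatagramPacket out = new DatagramPacket(in.getData(), in.getLength(),
                    InetAddress.getByName("localhost"), 5001);
            server.send(out);
        }
    }
}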

- the tracker data are parsed and fed in real time into the VRML model via the Cosmo Player 2.0 classes. This works by directly modifying individual nodes in the VRML file; a sketch follows the source links below.

SOURCE of Client applet
VRML file
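
As a sketch of this direct node modification, assuming the VRML file contains a Transform with the DEF name TRACKER_XFORM and using the External Authoring Interface classes (vrml.external.*) that ship with Cosmo Player; the node name and the yaw-to-rotation mapping are illustrative only, not taken from the actual client applet.

import java.applet.Applet;
import vrml.external.Browser;
import vrml.external.Node;
import vrml.external.field.EventInSFRotation;

// Sketch: the client applet pushes a received tracker angle into the VRML
// scene through the Cosmo Player External Authoring Interface.
// The DEF name "TRACKER_XFORM" and the yaw-to-rotation mapping are assumptions.
public class TrackerApplet extends Applet {
    private EventInSFRotation setRotation;

    public void start() {
        // Get a handle on the embedded Cosmo Player and the armchair's Transform.
        Browser browser = Browser.getBrowser(this);
        Node armchair = browser.getNode("TRACKER_XFORM");
        setRotation = (EventInSFRotation) armchair.getEventIn("set_rotation");
    }

    // Called whenever a new yaw angle (in radians) arrives over UDP.
    public void updateYaw(float yaw) {
        // Rotate the node around the vertical (y) axis by the yaw angle.
        float[] rotation = { 0.0f, 1.0f, 0.0f, yaw };
        setRotation.setValue(rotation);
    }
}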