GDC – Thursday – Abrash VR

Quake was the Metaverse coming to life in 1996, but VR then stalled. Now, in 2013, VR might be on the edge of truly coming into existence.

VR versus AR (augmented reality). Both are becoming feasible now because of commodity hardware: flat-panel displays, wireless, cameras, gyroscopes, microprojectors. Still needed: tracking hardware, waveguides, computer vision.

Oculus Rift development kit: wide field of view, lightweight, affordable. It will take many years to refine VR, and AR is even harder. A 1K-by-1K display spread across a 100-degree field of view means only a few pixels across a phone-sized patch of the visual field.
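A quick back-of-envelope sketch of that pixel-density point. The 1K display and 100-degree field come from the talk; the 20-degree "phone-sized patch" figure is an illustrative assumption:

```python
# From the talk: ~1000 horizontal pixels spread over ~100 degrees.
display_px = 1000
fov_deg = 100
px_per_deg = display_px / fov_deg          # 10 px per degree

# Assumption for illustration: a phone held at reading distance
# subtends roughly 20 degrees of the visual field.
phone_fov_deg = 20
px_in_phone_patch = px_per_deg * phone_fov_deg

print(px_per_deg)          # 10.0 px/degree
print(px_in_phone_patch)   # 200.0 px across a phone-sized patch
```

Ten pixels per degree is an order of magnitude below what a modern phone screen delivers over the same patch of vision, which is why the image looks coarse.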

Three hard problems: tracking, latency, and making perceptions realistic enough to be indistinguishable from reality.

VR displays move relative to both the real world and the eyes; your head can move very fast, and your eyes can counter-rotate very fast: 20 degrees in less than 100 milliseconds.

Tracking: recovering head position and orientation (pose). Our perceptual system is tuned to notice anomalies, so tracking has to be super-accurate: on the order of 1 millimeter at 2 meters from the sensor.
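To get a feel for how tight that spec is, the 1 mm / 2 m figure can be converted to an angular requirement (a back-of-envelope calculation, not a number from the talk):

```python
import math

# 1 mm of positional accuracy at 2 m from the sensor.
error_m = 0.001
range_m = 2.0

# Angle subtended by a 1 mm error at 2 m.
angle_rad = math.atan2(error_m, range_m)
angle_deg = math.degrees(angle_rad)

print(f"{angle_deg:.4f} degrees")  # ~0.0286 degrees of angular accuracy
```

Roughly three hundredths of a degree, which is well beyond what a cheap sensor delivers out of the box.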

Rift – inertial measurement unit (IMU). Inexpensive and lightweight, but it drifts and supports only rotation, not translation.
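A toy sketch of why an IMU-only tracker drifts: integrating a rate gyro that has even a small constant bias accumulates orientation error over time. The sample rate and bias below are illustrative assumptions, not Rift specifications:

```python
# Assumed figures for illustration only.
dt = 0.001            # 1 kHz gyro sample rate
bias_dps = 0.5        # constant gyro bias, degrees per second
true_rate_dps = 0.0   # the head is actually stationary

angle_deg = 0.0
for _ in range(60_000):                  # integrate for 60 seconds
    measured = true_rate_dps + bias_dps  # every sample carries the bias
    angle_deg += measured * dt           # naive dead-reckoning integration

print(angle_deg)  # 30.0 degrees of drift after one minute
```

Real IMU filters correct drift with gravity and magnetometer references, but nothing in the IMU itself can recover translation.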

Latency – the delay between head motion and the virtual-world update reaching the eyes. Head movement magnifies anomalies, destroying the illusion of reality and creating vertigo and nausea. Google “Carmack Abrash latency”.
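A hedged sketch of where motion-to-photon latency accumulates in a naive pipeline. The per-stage timings are illustrative assumptions, not measured figures from the talk:

```python
# Assumed stage timings for a naive 60 Hz pipeline (milliseconds).
budget_ms = {
    "sensor sampling + fusion": 2,
    "game simulation":          16,  # one 60 Hz frame
    "rendering":                16,  # another 60 Hz frame
    "display scan-out":         16,  # one more frame to reach the panel
}

total = sum(budget_ms.values())
print(total, "ms motion-to-photon")  # 50 ms
```

Fifty milliseconds is far above the roughly 20 ms often cited in the Carmack/Abrash latency discussions as the threshold for VR to feel solid, which is why every stage of the pipeline has to be attacked.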

Space-time diagrams on the Abrash blog: pixel-based movement means stepped motion (temporal sampling), and color fringing appears when the eyes track pixel movement. The cinematography term for this is judder.

We need something like 1000 to 2000 frames per second. Judder can also be eliminated with zero persistence; zero persistence works for images that the eye is tracking, but backgrounds have bad problems.
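The frame-rate requirement can be motivated with a simple retinal-smear calculation for a sample-and-hold display: while the eye sweeps smoothly, the pixels hold still for a whole frame, smearing the image across the retina. The eye speed below is an assumed illustrative figure; the 10 px/degree follows from the 1K / 100-degree display mentioned earlier:

```python
# Assumed smooth-pursuit eye speed, degrees per second.
eye_speed_dps = 60
frame_rate_hz = 60
px_per_deg = 10            # 1000 px over a 100-degree field

# Retinal slip accumulated during one held frame.
smear_deg = eye_speed_dps / frame_rate_hz   # 1 degree per frame
smear_px = smear_deg * px_per_deg
print(smear_px)  # 10.0 pixels of smear at 60 Hz

# Raising the frame rate shrinks the smear proportionally:
for fps in (60, 250, 1000):
    print(fps, "Hz ->", eye_speed_dps / fps * px_per_deg, "px")
```

Getting the smear down to around a pixel at these eye speeds is what pushes the requirement toward four-digit frame rates, unless persistence is cut instead.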

http://blogs.valvesoftware.com/abrash

 
