Eye Tracking
In the context of autostereoscopic technology, eye tracking (or face tracking) is the process of determining a viewer's eye position relative to the display. It is typically achieved with one or two cameras that capture the viewer's face, after which computer-vision algorithms pinpoint the location of each eye.
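Once the eye centres are found in the camera image, recovering their 3D position reduces to inverting a pinhole camera model. The sketch below is a minimal illustration under assumed values: a single camera, a known average interpupillary distance used to infer depth, and hypothetical focal-length and principal-point parameters (no specific tracking SDK is implied).

```python
IPD_MM = 63.0  # assumed average interpupillary distance, millimetres

def estimate_eye_positions(left_px, right_px, focal_px, cx, cy):
    """Approximate (x, y, z) of each eye in camera space, in mm.

    left_px / right_px: (u, v) pixel coordinates of the detected eyes.
    focal_px: camera focal length expressed in pixels (assumed known).
    cx, cy: principal point (image centre) in pixels.
    """
    # Distance between the two detected eyes in the image plane (pixels).
    du = right_px[0] - left_px[0]
    dv = right_px[1] - left_px[1]
    pixel_dist = (du * du + dv * dv) ** 0.5

    # Similar triangles: pixel_dist / focal_px == IPD_MM / depth.
    z = focal_px * IPD_MM / pixel_dist

    def back_project(u, v):
        # Invert the pinhole projection at the recovered depth.
        return ((u - cx) * z / focal_px, (v - cy) * z / focal_px, z)

    return back_project(*left_px), back_project(*right_px)
```

For example, with a 1000-pixel focal length and the eyes detected 100 pixels apart, the viewer is estimated to sit roughly 630 mm from the camera.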
Use for 3D Displays: Expanding the "Sweet Spot"
The primary application of eye tracking is to overcome a major limitation of glasses-free 3D: the narrow sweet spot (or viewing zone). In early devices, this zone was static, forcing the viewer to remain in a specific position to perceive a good 3D effect.
Modern systems combine slanted lenticular lenses with dynamic reassignment of the interleaved subpixel pattern, which lets the viewing zone be shifted electronically. When paired with an eye-tracking system, the display can steer the sweet spot in real time to follow the viewer's movements. This ensures the left-eye image is always directed to the left eye and the right-eye image to the right eye, regardless of the viewer's position.
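The steering step can be sketched as a small geometric decision: from the eye's position, compute its angle off the screen normal and pick the nearest repeating view slot behind the lens sheet. The view count, view pitch, and symmetric-fan model below are illustrative assumptions, not the parameters of any particular display.

```python
import math

def steer_view(eye_x_mm, eye_z_mm, num_views=9, view_pitch_deg=2.5):
    """Pick the interleaved view slot nearest one eye.

    Hypothetical model: the lens sheet fans out `num_views` repeating
    views, each `view_pitch_deg` wide, centred on the screen normal.
    `eye_x_mm` is the eye's lateral offset, `eye_z_mm` its distance.
    """
    # Angle of the eye off the screen normal, in degrees.
    angle = math.degrees(math.atan2(eye_x_mm, eye_z_mm))
    centre = (num_views - 1) / 2.0
    slot = round(centre + angle / view_pitch_deg)
    return slot % num_views  # the view fan repeats across lens pitches
```

In use, the renderer would call this once per eye each frame (with the two eye positions from the tracker) and route the left- and right-eye images into the returned slots.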
The perceptual result is a dramatically expanded sweet spot, giving the viewer freedom of movement and fundamentally removing one of the biggest drawbacks of traditional autostereoscopic displays.
Enabling Realistic 3D Rendering
A more advanced application of eye tracking is the creation of a truly realistic, "window-like" 3D experience. In a real-time rendering engine, the precise eye-position data is used to calculate the exact camera frustums for the left and right eyes.
By feeding this data into the virtual cameras, the engine renders perspective-correct images for the viewer's specific vantage point. Virtual objects then exhibit motion parallax and perspective shifts exactly as real objects would, creating a convincing 1:1 illusion of depth.
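The per-eye frustum calculation is a standard off-axis (asymmetric) projection: the screen edges are projected onto the near plane as seen from the eye. The sketch below returns glFrustum-style bounds; the screen-centred millimetre coordinates and example dimensions are assumptions for illustration.

```python
def off_axis_frustum(eye, screen_w, screen_h, near, far):
    """Asymmetric frustum bounds for one eye, glFrustum-style.

    `eye` is (x, y, z) in screen-centred coordinates (mm), with z the
    eye's distance in front of the display. Returns (left, right,
    bottom, top, near, far) for an off-axis projection matrix.
    """
    ex, ey, ez = eye
    scale = near / ez  # project the screen edges onto the near plane
    left   = (-screen_w / 2.0 - ex) * scale
    right  = ( screen_w / 2.0 - ex) * scale
    bottom = (-screen_h / 2.0 - ey) * scale
    top    = ( screen_h / 2.0 - ey) * scale
    return left, right, bottom, top, near, far
```

Calling this twice per frame, with the tracked left- and right-eye positions (roughly half an interpupillary distance apart), yields the two asymmetric frustums described above; as the viewer moves, the frustums skew accordingly and the on-screen perspective follows.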
While this requires significant graphical processing power for real-time rendering, the effect is transformative: the display ceases to be a screen and behaves as a genuine window into a 3D world.