- A non-parametric camera-based calibration for optical see-through glasses is proposed.
- During the calibration process every single pixel is mapped with a viewing ray.
- From this mapping, a distortion correction map and an optimized frustum are calculated.
- The calibration process is completely automated using translation stages and cameras.
- The previous triangulation-based method is demonstrated to fail on glasses with complex optics.
Precisely calibrated optical see-through glasses are the key for many augmented reality applications. Existing works divide the system consisting of a user wearing glasses into a display part and an eye part. The former is calibrated once, whereas the eye is adapted continuously. Previous methods either model the optical system using perspective projection or triangulate the pixels' virtual positions. Parametric models are often inadequate in the edge areas of the displays and are based on only a few non-equidistant point correspondences recorded by the user. The triangulation approach improves on this by calibrating every pixel individually with camera-based point correspondences. Whereas the previous generation of see-through glasses used simple optics like half-transparent mirrors, the latest generation uses more complex optical elements like gratings or free-form light-guides. Minor changes of the viewpoint can cause a viewing ray to pass through the neighboring grating, which makes a triangulation of the pixel's virtual position impossible. Another work splits the distortion calibration into a real-world distortion part and a display distortion part, since the viewing rays pass through different optical elements.

This work proposes a new method which calibrates the display pixels individually based on the pixels' viewing rays instead of triangulating their viewpoint-dependent virtual positions. Furthermore, the proposed method overcomes the need for two separate distortion calibrations. The real-world distortion is implicitly taken into account by calibrating the cameras, which are placed behind the displays of the glasses while looking through the optics. The cameras together with three translation stages enable an automatic and therefore interaction-free calibration. All cameras are calibrated using non-parametric calibration techniques.
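The core idea of mapping each display pixel to a viewing ray can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes each pixel has been observed by the calibrated camera at two known translation-stage depths, so the viewing ray is simply the line through the two observed 3D points. The function name and array layout are hypothetical.

```python
import numpy as np

def fit_viewing_rays(points_near, points_far):
    """Fit one viewing ray per display pixel from two 3D observations.

    points_near, points_far: (H, W, 3) arrays holding the 3D position at
    which each display pixel was observed at two translation-stage depths
    (an assumed two-plane setup). Returns ray origins (the near points)
    and unit direction vectors, one per pixel.
    """
    directions = points_far - points_near
    # Normalize each per-pixel direction to unit length.
    directions = directions / np.linalg.norm(directions, axis=-1, keepdims=True)
    return points_near, directions
```

A per-pixel distortion correction map could then be derived by comparing these measured rays against the rays of an ideal pinhole frustum.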
An optimal frustum and distortion maps for different viewpoints are calculated in order to render and correct a scene with high accuracy in real-time. It is shown that a triangulation-based approach only succeeds when the optics of the glasses are not too complex. The results of the proposed method show mean angular errors of only 0.88 arcmin, or 0.13 mm at a distance of 500 mm, which is more accurate than previous approaches.
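As a sanity check on the reported numbers, the small-angle conversion below confirms that 0.88 arcmin subtends roughly 0.13 mm at a distance of 500 mm. The helper function is illustrative, not from the paper.

```python
import math

def arcmin_to_mm(angle_arcmin, distance_mm):
    """Lateral error (mm) subtended by an angular error (arcmin) at a distance."""
    return math.tan(math.radians(angle_arcmin / 60.0)) * distance_mm

# 0.88 arcmin at 500 mm -> about 0.128 mm, matching the reported 0.13 mm.
error_mm = arcmin_to_mm(0.88, 500.0)
```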
Journal: Computers & Graphics - Volume 64, May 2017, Pages 51-61