International Journal of Advances in Computer Science and Its Applications
Author(s): MAKOTO SATO, NAPHAT RATTANATHAWORNKITI, TEERASIT KASETKASEM, THITIPORN CHANWIMALUANG
This paper proposes a new desktop virtual reality system that uses only a single camera to track a viewer and to calculate the distance between the viewer’s eyes and the camera in order to render images with the correct perspective. The system estimates the eye-to-camera distance by measuring the distance between the irises of the left and right eyes. An initial calibration step records the viewer’s inter-iris distance, which then serves as the reference for computing the distance between the eyes and the screen. However, because the system relies on only the two eyes, it cannot cope with rotation about the axes parallel to the image plane of the camera. The FaceAPI library is therefore employed as the orientation estimator, since it uses sophisticated image-processing algorithms to detect a human face and to measure its orientation and distance with respect to the camera. Because human faces vary in size, the face-to-camera distance estimated by the FaceAPI library alone is usually inaccurate. Nevertheless, the orientation information from the FaceAPI can be used to correct the measured inter-iris distance of the viewer. Our experiment shows that the distance estimated from the corrected inter-iris separation is more accurate than that obtained from the FaceAPI alone.
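The core idea described in the abstract can be sketched as follows. This is a minimal illustration under two assumptions not spelled out in the abstract: that the calibration step records a known eye-to-camera distance together with the inter-iris separation in pixels (so a pinhole-camera model gives distance as inversely proportional to the observed separation), and that head rotation is compensated by dividing the observed separation by the cosine of the yaw angle reported by the orientation estimator. The function names and parameters are hypothetical, not from the paper.

```python
import math

def corrected_iris_separation(d_px: float, yaw_rad: float) -> float:
    """Undo the foreshortening of the projected inter-iris distance
    caused by head rotation about the vertical axis.

    yaw_rad is assumed to come from an external head-orientation
    estimator (e.g. the FaceAPI library mentioned in the paper).
    This cosine correction is an illustrative assumption.
    """
    return d_px / math.cos(yaw_rad)

def eye_to_camera_distance(d_px: float, yaw_rad: float,
                           d0_px: float, z0: float) -> float:
    """Pinhole-model distance estimate.

    d0_px : inter-iris separation in pixels measured at calibration
    z0    : known eye-to-camera distance at calibration (any unit)

    Under a pinhole camera, the projected separation scales inversely
    with distance, so z = z0 * d0_px / d_corrected.
    """
    return z0 * d0_px / corrected_iris_separation(d_px, yaw_rad)

# Calibrated at 60 cm with a 90 px inter-iris separation; the viewer
# now shows 45 px with no head rotation, i.e. twice as far away.
print(eye_to_camera_distance(45.0, 0.0, 90.0, 60.0))   # 120.0 cm
```

With a nonzero yaw the correction matters: at a 60-degree yaw the same 45 px separation corresponds to an effective 90 px frontal separation, so the estimate returns the calibration distance of 60 cm rather than 120 cm.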