Mühendislik Fakültesi / Faculty of Engineering

Permanent URI for this collection: https://hdl.handle.net/11727/1401

Search Results

Now showing 1 - 2 of 2
  • Item
    Eye Gaze Location Detection Based On Iris Tracking with Web Camera
    (2018) Yildiz, Metin; Yorulmaz, Muhammet; C-7863-2018
    In recent years, there has been increasing interest in human-computer interaction systems that use web cameras as input devices. This study introduces a system that distinguishes eye gaze locations on the screen from the position of the iris. The aim is to distinguish more gaze locations than previous webcam-based work. A k-nearest neighbors classifier was used to detect the gaze location from a feature vector of iris center coordinates. In the first experiments with 10 subjects, 17 different gaze locations on the screen were classified with an average accuracy of 97.64%. Only the two vertically adjacent points near the center of the screen were detected incorrectly; excluding these two points is expected to improve classification performance in future studies. (A minimal code sketch of this iris-center-to-gaze classification step is given after the list below.)
  • Item
    A Novel Gaze Input System Based on Iris Tracking With Webcam Mounted Eyeglasses
    (2021) Yorulmaz, Muhammet; C-7863-2018
    Because eye-tracking systems based on pupillary corneal reflection are expensive, efforts to develop webcam-based eye tracking as an affordable alternative for disabled people have increased in recent years. However, because of camera specifications and placement, changes in ambient light, and changes in user position, such systems have not yet been able to determine the gaze point precisely; previous webcam-based human-computer interaction studies could detect only 8 gaze directions or up to 10 gaze regions. In this study, a novel gaze input system is proposed to make the best use of the limited performance of webcam-based eye tracking and to offer an economical alternative for disabled people. To reduce the impact of head movements, the webcam was mounted on an ordinary eyeglasses frame and positioned in front of the eye. To estimate the gaze regions, a feature-based method (the Hough transform) was used, exploiting the circular shape of the iris and the contrast between the iris and the sclera. The center coordinates of the iris in the webcam image were fed to a k-nearest neighbor classifier. A series of experiments with 20 subjects was performed to determine the performance of the system and to investigate the effect of ambient light on its accuracy. The 23 regions gazed at by the subjects were identified with an average accuracy of 99.54%; when the ambient light level was halved, the accuracy dropped to 94.74%. Overall, the proposed prototype recognizes a larger number of on-screen regions more accurately than previous webcam-based systems, although its performance decreases when ambient light is reduced.
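
The second item above localizes the iris with a circular Hough transform applied to the image from the eyeglass-mounted webcam. The sketch below shows one way such a step can look using OpenCV's cv2.HoughCircles on a synthetic eye-like image; the image and every threshold value are illustrative assumptions, not parameters reported in the paper.

```python
# Minimal sketch of iris-center localization with a circular Hough transform.
# The synthetic "eye" image and all thresholds below are illustrative assumptions.
import cv2
import numpy as np

# Synthetic stand-in for a cropped grayscale eye image from the eyeglass-mounted
# webcam: a bright sclera-like background with a dark iris-like disc.
eye = np.full((240, 320), 200, dtype=np.uint8)
cv2.circle(eye, (170, 120), 35, 40, -1)  # dark filled circle as the "iris"

# Smooth to suppress noise before edge detection inside the Hough transform.
blurred = cv2.medianBlur(eye, 5)

# Detect circular shapes; the iris/sclera contrast drives the edge response.
circles = cv2.HoughCircles(
    blurred,
    cv2.HOUGH_GRADIENT,
    dp=1,          # accumulator resolution equal to the image resolution
    minDist=100,   # expect a single iris per eye crop
    param1=100,    # Canny high threshold
    param2=20,     # accumulator threshold: lower values find more circles
    minRadius=15,
    maxRadius=60,
)

if circles is not None:
    x, y, r = circles[0, 0]
    print(f"iris center ~ ({x:.0f}, {y:.0f}), radius ~ {r:.0f} px")
    # (x, y) is the kind of feature pair that would feed the gaze classifier.
else:
    print("no iris-like circle found")
```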
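Both items above map iris-center coordinates to on-screen gaze locations with a k-nearest neighbors classifier; the first item reports 17 locations classified at 97.64% average accuracy. The sketch below, referenced from the first item, illustrates that classification step with scikit-learn on synthetic data; the (x, y) feature layout, the choice k = 5, and the simulated 17-location calibration set are assumptions made for illustration, not details taken from the papers.

```python
# Minimal sketch of the KNN gaze-location classification step described above.
# Assumptions (not from the papers): features are the (x, y) iris-center pixel
# coordinates of one eye, k = 5, and gaze targets form a synthetic 17-point set.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic calibration data: for each gaze location, simulate iris-center
# coordinates scattered around a location-specific mean (a stand-in for
# measurements collected while a subject fixates each on-screen target).
n_locations = 17
samples_per_location = 40
location_means = rng.uniform(low=[200, 150], high=[440, 330], size=(n_locations, 2))

X = np.vstack([
    mean + rng.normal(scale=3.0, size=(samples_per_location, 2))
    for mean in location_means
])
y = np.repeat(np.arange(n_locations), samples_per_location)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# k-nearest neighbors classifier mapping iris-center coordinates to a gaze label.
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

print(f"held-out accuracy: {knn.score(X_test, y_test):.4f}")
print("predicted gaze location for iris center (310.0, 240.0):",
      knn.predict([[310.0, 240.0]])[0])
```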