Object acquisition and selection using automatic scanning and eye blinks in an HCI system

  • Original Paper
  • Published:
Journal on Multimodal User Interfaces

Abstract

This paper presents an object acquisition and selection approach for human-computer interaction (HCI) systems. In this approach, objects placed on the computer screen are scanned automatically, and the user performs a voluntary eye blink to select the object of interest when the focus reaches it. Here, scanning means moving the focus over the on-screen objects one by one, and the scanning time is the time taken to move the focus from one object to the next. The user is not required to perform any physical movement other than blinking; the only moving body parts are the eyelids. A low-cost webcam and MATLAB with the Computer Vision Toolbox are required to implement the proposed approach. The performance of the proposed approach has been compared with that of the Camera Mouse for the selection of text and graphic objects; the Camera Mouse uses facial-feature tracking for mouse-cursor control and dwell time for object selection. Three experiments were performed to evaluate the proposed method, with ten healthy users participating voluntarily. The proposed method performed significantly better than the Camera Mouse when text objects were selected in an HTML file. For the selection of graphic objects placed on the computer screen, where page scrolling is not required, no significant difference was found between the two systems. The proposed method has also been evaluated for performing mouse-analogous operations using eye blinks, and its performance has been compared with state-of-the-art methods.
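
The abstract describes the scanning-and-blink mechanism only at a high level. The following is a minimal MATLAB sketch of how such a loop could be arranged, assuming the MATLAB Support Package for USB Webcams (webcam/snapshot) and the Computer Vision Toolbox cascade eye-pair detector; the blink test used here (the eye-pair detector failing for several consecutive frames) and all thresholds are illustrative assumptions, not the authors' actual detection algorithm.

```matlab
% Minimal sketch of automatic scanning with blink-based selection.
% Assumptions: webcam/snapshot from the MATLAB Support Package for USB Webcams,
% vision.CascadeObjectDetector from the Computer Vision Toolbox, and a blink
% inferred when the eye-pair detector finds no eyes for several consecutive frames.

cam         = webcam();                                    % first available webcam
eyeDetector = vision.CascadeObjectDetector('EyePairBig');  % pre-trained eye-pair model

objects        = {'Item A', 'Item B', 'Item C'};           % on-screen objects to scan over
scanTime       = 1.5;                                      % seconds of focus per object (assumed)
blinkThreshold = 5;                                        % closed-eye frames counted as a voluntary blink (assumed)
blinkFrames    = 0;

idx = 1;
fprintf('Focus on: %s\n', objects{idx});
focusStart = tic;

while true
    frame = snapshot(cam);
    bbox  = step(eyeDetector, rgb2gray(frame));            % empty when no eye pair is detected

    if isempty(bbox)
        blinkFrames = blinkFrames + 1;                     % eyes not found: possibly closed
    else
        blinkFrames = 0;                                   % eyes visible: reset the counter
    end

    if blinkFrames >= blinkThreshold                       % sustained closure -> voluntary blink
        fprintf('Selected: %s\n', objects{idx});
        break
    end

    if toc(focusStart) > scanTime                          % scanning: advance focus to the next object
        idx = mod(idx, numel(objects)) + 1;
        fprintf('Focus on: %s\n', objects{idx});
        focusStart = tic;
    end
end

clear cam                                                  % release the webcam
```

In a sketch like this, scanTime trades selection speed against error rate (a shorter scan step is faster but leaves less time to react), while blinkThreshold separates voluntary blinks from involuntary ones by requiring the eyes to stay closed for longer; the paper's actual parameter values and blink detector are not given in the abstract.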

Author information

Corresponding author

Correspondence to Hari Singh.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Singh, H., Singh, J. Object acquisition and selection using automatic scanning and eye blinks in an HCI system. J Multimodal User Interfaces 13, 405–417 (2019). https://doi.org/10.1007/s12193-019-00303-0
