
EEG-EOG based Virtual Keyboard: Toward Hybrid Brain Computer Interface

  • Review
  • Published in Neuroinformatics 17, 323–341 (2019)

Abstract

Over the past twenty years, research on the electroencephalogram (EEG) has been reinvigorated by the development of Brain Computer Interfaces (BCIs), which aim to help severely disabled people live more independently. Current BCIs remain more theoretical than practical and face numerous challenges. A new line of research proposes combining EEG with other simple and efficient bioelectric inputs, such as the electro-oculogram (EOG) generated by eye movements, to build more practical and robust hybrid Brain Computer Interface (hBCI) or Brain/Neuronal Computer Interface (BNCI) systems. Toward this goal, existing work on EOG-based Human Computer Interaction (HCI) must be organized and surveyed in order to clarify the potential benefits of combining the two input modalities and to inspire new designs that maximize those benefits. Our aim is to support and inspire the design of new hBCI systems based on both EEG and EOG signals. We first survey current EOG-based HCI systems, with a particular focus on EOG-based communication through virtual keyboards. We then survey current EEG-EOG virtual keyboards, highlighting the design protocols they employ. We conclude with a discussion of the potential advantages of combining both systems and with recommendations intended to inform future design decisions for EEG-EOG hBCI systems. Finally, we propose a general architecture for a new EEG-EOG hBCI system. The proposed hybrid system reverses the traditional view of eye-movement activity in the EEG as an artifact to be removed: instead, EOG traces are extracted from the EEG and treated as an additional input modality that shares control according to the chosen design protocol.
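The architecture is described here only at a conceptual level. As a rough illustration of the idea, not the authors' implementation, the Python sketch below low-pass filters frontal EEG channels to recover an EOG-like trace, decodes a crude horizontal gaze command from it, and fuses that command with an EEG-decoded key selection under a simple sequential shared-control rule. The 250 Hz sampling rate, the F7/F8-style bipolar pair, the 50 µV threshold, and the fusion rule are all illustrative assumptions.

import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz

def extract_eog_trace(frontal_eeg, fs=FS):
    # Slow eye-movement potentials dominate below ~10 Hz, so a low-pass
    # filter over frontal channels yields an EOG-like trace.
    b, a = butter(4, 10.0 / (fs / 2.0), btype="low")
    return filtfilt(b, a, frontal_eeg, axis=-1)

def decode_gaze_command(eog_trace, threshold=50e-6):
    # Horizontal gaze is approximated by a bipolar pair (e.g. F7 minus F8);
    # the sign of the largest deflection gives a crude left/right command.
    deflection = eog_trace[0] - eog_trace[1]
    peak = deflection[np.argmax(np.abs(deflection))]
    if peak > threshold:
        return "LEFT"
    if peak < -threshold:
        return "RIGHT"
    return "NONE"

def shared_control(eeg_selection, gaze_command):
    # Toy sequential protocol: eye movements steer the virtual-keyboard
    # cursor; when the eyes are still, the EEG-decoded key is confirmed.
    if gaze_command != "NONE":
        return {"action": "move", "direction": gaze_command}
    return {"action": "select", "key": eeg_selection}

# Synthetic one-second epoch from two frontal channels with a simulated saccade.
rng = np.random.default_rng(0)
epoch = rng.normal(scale=20e-6, size=(2, FS))
epoch[0, 100:200] += 120e-6
eog = extract_eog_trace(epoch)
print(shared_control(eeg_selection="A", gaze_command=decode_gaze_command(eog)))

Under these assumptions the sketch prints a "move LEFT" command, showing how the EOG trace recovered from the EEG can take over cursor control while the EEG classifier is reserved for key confirmation.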



Author information


Correspondence to Mai S. Mabrouk.


About this article


Cite this article

Hosni, S.M., Shedeed, H.A., Mabrouk, M.S. et al. EEG-EOG based Virtual Keyboard: Toward Hybrid Brain Computer Interface. Neuroinform 17, 323–341 (2019). https://doi.org/10.1007/s12021-018-9402-0
