Disease Prediction using Machine Learning, Deep Learning and Data Analytics

The Fusion of Human-Computer Interaction and Artificial Intelligence Leads to the Emergence of Brain Computer Interaction

Author(s): M. Kiruthiga Devi*

Pp: 131-145 (15)

DOI: 10.2174/9789815179125124010014


Abstract

A personal computer is typically operated through input devices such as a keyboard, mouse, or joystick, which serve as the interface between the computer and the human. Physically challenged users, however, are often unable to operate such systems; Brain-Computer Interface (BCI) technology has therefore advanced so that external applications can be controlled without physical movement, assisting these users and addressing the limitations of conventional Human-Computer Interaction (HCI). Technological advances in cognitive neuroscience and brain imaging now make it possible to communicate directly with the human brain rather than through such an interface. Instead of deriving signals from muscle movements, these systems use brain activity to control computers or communication devices. Researchers in the HCI field explore ways for machines to exploit as many sensory sources as possible. Furthermore, researchers have begun to consider implicit forms of data: input that is not explicitly produced to instruct a machine to perform a task. Systems can adapt dynamically to this data in order to assist the user with the task at hand. Here we discuss the components of a Brain-Computer Interface, its characteristics, and its challenges. Researchers are also attempting to replace conventional classifiers with convolutional neural networks (CNNs), which offer a promising advantage in classification. Because BCI applications can link EEG signals from the brain seamlessly to mechanical systems, BCI is a rapidly growing technology with applications in fields such as Artificial Intelligence and Computational Intelligence.
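The abstract does not specify a particular CNN architecture for EEG classification, so the following is only a minimal sketch of how a 1D convolutional network might stand in for a conventional classifier on windowed EEG data. The class name EEGConvNet and the channel, sample, and class counts are illustrative assumptions, not details taken from the chapter.

```python
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    """Minimal 1D CNN mapping raw EEG windows to class scores (illustrative only)."""
    def __init__(self, n_channels=22, n_samples=256, n_classes=4):
        super().__init__()
        self.features = nn.Sequential(
            # Temporal convolution across the multichannel EEG window
            nn.Conv1d(n_channels, 16, kernel_size=7, padding=3),
            nn.BatchNorm1d(16),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        # Two pooling stages of 4 reduce the time axis by a factor of 16
        self.classifier = nn.Linear(32 * (n_samples // 16), n_classes)

    def forward(self, x):  # x: (batch, channels, samples)
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Usage with synthetic data: a batch of 8 EEG windows, 22 channels x 256 samples
model = EEGConvNet()
logits = model(torch.randn(8, 22, 256))
print(logits.shape)  # torch.Size([8, 4])
```

In practice, the window length, channel count, and number of output classes would be set by the recording protocol and the BCI task (e.g., motor imagery classes), and the network would be trained with a standard cross-entropy objective.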
