HUMAN COMPUTER INTERFACE THROUGH BRAIN-HAPTICS
CH. Sreekanth, B.Tech (y5it043)
[email protected]

1. Abstract: The brain plays a major role in human life. It instructs all parts of the body to do their work, emitting electrical signals that pass to different organs such as the hands and fingers. Nerves are the carriers, or conductors, of these signals. The signals can be picked up with electrodes and analyzed to determine what they represent. Here we propose a system that recognizes these signals from the brain, decodes them, performs the operation initiated by the brain, and generates a feedback force. It is intended to help people who are partially physically handicapped or paralyzed and cannot do everything for themselves because their brain is not in coordination with their body, enabling them to do those things by themselves. The system picks up such signals, amplifies them, converts them to digital form, analyzes them using a neural network, and generates actuating signals for external hardware.

2. Key Words: Haptics, brain-computer interface (BCI), force feedback, electroencephalogram (EEG).

3. Introduction: A human-computer interaction device that implements the concept "think and make it happen without any physical effort" is called a brain-computer interface [3]. A brain-computer interface is a staple of science fiction writing. In its earliest incarnations no mechanism was thought necessary, as the technology seemed so far-fetched that no explanation was likely. As more and more has become known about the brain,
however, the possibility has become more real and the science fiction more sophisticated. It is called cognitive engineering. The last six years have witnessed a rapidly growing body of research and technique development involving detecting human brain responses and putting these techniques to appropriate use to help people with debilitating diseases or who are disabled in some way: the so-called brain-computer interface. The main difference between BCI techniques and those studied in more common human-computer interface (HCI) tasks is that they do not rely on any sort of muscular response but only on detectable signals representing responsive or intentional brain activity. This places increased emphasis on detection mechanisms for finding out if and when the desired responses occurred. Some preliminary work [2] is being done on sensing neurons on silicon transistors and on growing neurons into neural networks on top of computer chips.

4. Brain mapping and decoding: In this part, electrodes are immersed in the cortex region of the brain, through which the electrical activity of the brain can be detected over a communication channel called the electroencephalogram. These signals are analog in nature and must be converted into digital signals to interface with the computer. The signals generated by the brain are very feeble. Very sensitive electrodes placed on different regions of the scalp pick up these signals, which are then amplified by a precision amplifier to the level required by the ADC. Figure 1 shows the arrangement of electrodes on the human scalp.

Figure 1: User interfacing with computer

Figure 2 shows a block diagram of the interfacing network with the computer. The signals are received from electrodes immersed in the cortex region of the brain. Electrodes are
specially immersed in the cortex region because the neurons in the motor cortex become desynchronized when we plan movements. This phenomenon is called movement-related desynchronization (MRD). These signals are tiny, rarely bigger than a few tens of microvolts, and are often buried beneath other signals. A fast ADC converts them to digital format for processing in the computer.
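As a rough illustration of this acquisition chain, the sketch below simulates a microvolt-level scalp signal, applies an amplifier gain, and quantizes the result as an ADC would. All numbers (sampling rate, gain, ADC resolution, reference voltage) are assumed for illustration and are not taken from this paper.

```python
import numpy as np

# Illustrative parameters (assumptions, not values from the paper)
FS = 256            # sampling rate in Hz
GAIN = 50_000       # instrumentation-amplifier gain
ADC_BITS = 10       # ADC resolution
V_REF = 5.0         # ADC full-scale voltage

def acquire_eeg(duration_s=1.0):
    """Simulate a scalp EEG trace (in volts): a ~10 Hz rhythm of a few tens
    of microvolts buried in noise."""
    t = np.arange(0, duration_s, 1.0 / FS)
    return 30e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

def amplify_and_digitize(eeg_v):
    """Amplify the feeble signal and quantize it as the ADC would."""
    amplified = eeg_v * GAIN + V_REF / 2           # gain plus mid-rail offset
    amplified = np.clip(amplified, 0, V_REF)
    return np.round(amplified / V_REF * (2**ADC_BITS - 1)).astype(int)

codes = amplify_and_digitize(acquire_eeg())
print(codes[:10])   # digital samples ready to be sent on to the computer
```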
Figure 2: Brain mapping

Therefore there is a need to use an advanced pattern-recognition technique, a multilayer back-propagation neural network, which can be trained offline to recognize the signal patterns. After the brain signals have been recognized by the neural network, they are fed to the computer through USB communication.

5. USB communication: What is needed for a research system is a more generic way to move command data rapidly in and out of the host. Serial input/output is an obvious candidate, since hardware connections are available on most computers and many microprocessors, and drivers are common. The early RS-232 protocol, however, could not manage the 500-1000 Hz update rates that a haptic display requires. The Universal Serial Bus (USB) offers a viable answer. So-called "slow" USB 1.1 can be used to achieve our requirements while transferring 8 bytes in each direction, and the increasingly available USB 2.0 can poll about 8x faster. Conventional microcontrollers such as Microchip's PIC product line currently have USB 1.1 support in some models, easing the development burden, and PIC USB 2.0 support is anticipated shortly.
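A quick back-of-the-envelope check, using only the figures quoted above (8-byte packets, roughly 1 kHz polling for USB 1.1 and about 8x faster for USB 2.0), shows why USB can keep up with a haptic update loop where RS-232 could not. The sketch is illustrative arithmetic, not a USB-specification calculation.

```python
# Check that the polling rates quoted above can keep up with a
# 500-1000 Hz haptic update loop. Figures are taken from the text;
# they are illustrative, not derived from the USB specification.

HAPTIC_RATE_HZ = 1000          # worst-case update rate the display needs
PACKET_BYTES = 8               # payload per direction, as stated above

buses = {
    "USB 1.1 (1 ms frames)": 1000,          # one packet per frame -> 1 kHz
    "USB 2.0 (~8x faster polling)": 8000,
}

for name, poll_hz in buses.items():
    throughput = poll_hz * PACKET_BYTES      # bytes per second per direction
    ok = poll_hz >= HAPTIC_RATE_HZ
    print(f"{name}: {poll_hz} polls/s, {throughput} B/s -> "
          f"{'meets' if ok else 'misses'} the {HAPTIC_RATE_HZ} Hz requirement")
```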
6. Neural Networks: A neural network is a massively parallel distributed processor made up of simple processing units, which has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:
1. Knowledge is acquired by the network from its environment through a learning process.
2. Inter-neuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
Figure 3 shows a block diagram of the human brain as a signal-flow system: stimulus signals from the sensory organs pass through receptors to the neural net and on to the effectors, which generate the response. A learning algorithm modifies the synaptic weights of the network in an orderly fashion to perform the learning process and acquire knowledge.

Figure 3: Human Brain

6.1 Model of a Neuron: Neural networks are based on combinations of elementary information processors called neurons, which are the basis for designing artificial neural networks. Each neuron takes a number of inputs v1, v2, ..., vn and generates a single output x. Each input is associated with a weight, and the output is a function of the weighted sum of the inputs. The output function may be discrete or continuous, depending on the network in use. Figure 4 shows a neuron with inputs v1-v4, a summing junction, and output x.

Figure 4: Neuron
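A minimal sketch of this neuron model follows; the inputs, weights and the particular choice of sigmoid versus threshold output function are illustrative assumptions, not values from the paper.

```python
import numpy as np

def neuron(v, w, b=0.0, activation="sigmoid"):
    """Single neuron: output x is a function of the weighted sum of inputs v."""
    s = np.dot(w, v) + b
    if activation == "sigmoid":            # continuous output function
        return 1.0 / (1.0 + np.exp(-s))
    return 1.0 if s >= 0 else 0.0          # discrete (threshold) output function

v = np.array([0.2, 0.7, 0.1, 0.4])         # inputs v1..v4, as in Figure 4
w = np.array([0.5, -0.3, 0.8, 0.1])        # one weight per input (arbitrary values)
print(neuron(v, w))                        # single output x
```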
Back-propagation proceeds by comparing the output of the network to the expected output and computing an error measure based on the sum of squared differences. This error is then minimized by gradient descent, altering the weights of the network. Denoting a member of the training set by s_i, the actual outputs by y_j^i, and the desired outputs by w_j^i, the error (the sum of squared differences over the entire training set) is given by

E = \sum_{i=1}^{n} \sum_{j=1}^{n} \left( y_j^i - w_j^i \right)^2 \qquad \text{(Eqn. 2)}
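A minimal sketch of minimizing this sum-of-squares error by gradient descent in a small two-layer network is shown below. The training data, layer sizes and learning rate are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set: 4 input patterns with 2 desired outputs each
# (illustrative values only; "W_target" plays the role of w in Eqn. 2).
S = rng.random((4, 3))            # training inputs s_i
W_target = rng.random((4, 2))     # desired outputs w_i

# One hidden layer of 5 sigmoid units.
W1, W2 = rng.normal(size=(3, 5)), rng.normal(size=(5, 2))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(2000):
    # Forward pass: weights fixed, activity propagates layer by layer.
    H = sigmoid(S @ W1)
    Y = sigmoid(H @ W2)

    # Error signal = actual response minus desired response (summand of Eqn. 2).
    err = Y - W_target
    E = np.sum(err ** 2)

    # Backward pass: propagate the error and adjust weights by gradient descent.
    dY = 2 * err * Y * (1 - Y)
    dH = (dY @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ dY
    W1 -= lr * S.T @ dH

print(f"final sum-of-squares error: {E:.6f}")
```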
The back-propagation algorithm is based on the error-correction learning rule. Error-correction learning consists of two passes through the different layers of the network: a forward pass and a backward pass. In the forward pass, an activity pattern is applied to the sensory nodes of the network, and its effect propagates through the network, layer by layer; finally, a set of outputs is produced as the actual response of the network. During the forward pass the synaptic weights of the network are all fixed. During the backward pass, on the other hand, the synaptic weights are adjusted in accordance with an error-correction rule: the actual response of the network is subtracted from the desired response to produce an error signal, which is then propagated backward through the network, against the direction of the synaptic connections. Back-propagation neural networks have been used to classify patterns in adaptive pattern recognition systems such as OCR.

7. Computer interface with haptic device and force feedback system: The signal fed to the computer over USB is analyzed, and the computer is programmed to interface with the haptic device. The interfacing of the computer with the haptic device is shown in Figure 8. The signals from the brain are fed to the ADC, which converts them from analog to digital for the computer; the communication between the computer and the ADC is over USB. In the computer, the controlling software and the neural network give commands to the haptic device for actuating output, and the haptic device gives feedback to the user in the form of tactile sensing.
Figure 8: Computer interface with haptic device.
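The sketch below illustrates one possible shape of this control loop on the computer side. The HapticDevice class, the classify() placeholder and the command-to-force mapping are hypothetical stand-ins for illustration only; they do not correspond to any real device driver or to the classifier described above.

```python
import time

class HapticDevice:
    """Hypothetical stand-in for a USB haptic device driver (not a real API)."""
    def send_command(self, force):
        print(f"actuate with force {force:.2f} N")
    def read_feedback(self):
        return 0.1            # pretend tactile/force reading

def classify(eeg_codes):
    """Placeholder for the trained neural-network classifier described above."""
    return "grasp" if sum(eeg_codes) % 2 else "release"

COMMAND_FORCE = {"grasp": 1.5, "release": 0.0}    # illustrative mapping

device = HapticDevice()
for _ in range(3):                                # a few iterations of the loop
    eeg_codes = [512, 498, 530, 505]              # digital samples arriving over USB
    intent = classify(eeg_codes)
    device.send_command(COMMAND_FORCE[intent])    # actuating output
    feedback = device.read_feedback()             # tactile sensing back to the user
    time.sleep(0.001)                             # ~1 kHz loop, per the USB discussion
```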
8. System representation: The complete system is depicted in Figure 9. The signals from the brain, obtained from the electrodes embedded in the brain and those placed on the scalp, are fed to the amplifier. They are amplified to the required level and fed to the feature extractor (a sample-and-hold circuit), which selects the required signals and feeds them into the ADC. The signals from the ADC are fed to the controlling devices, and the application program in the computer accepts them through USB. The generated signals are then fed to the haptic device connected to the computer.

Figure 9: The complete system with computer and haptic device.
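As a compact summary of this chain, the sketch below composes the stages of Figure 9 in order. Every function body is a trivial placeholder assumed for illustration; the gains, thresholds and sample values are not taken from the paper.

```python
# Stage-by-stage sketch of the chain in Figure 9 (placeholder implementations).

def amplify(raw_uv):               # precision amplifier
    return [x * 50_000 for x in raw_uv]

def extract_features(amplified):   # feature extractor (sample-and-hold)
    return amplified[::4]          # e.g. keep every 4th sample

def digitize(features):            # ADC (10-bit, 5 V reference assumed)
    return [int(min(max(x, 0), 5) / 5 * 1023) for x in features]

def classify(codes):               # trained neural network on the computer
    return "move_left" if sum(codes) > len(codes) * 512 else "move_right"

def actuate(intent):               # haptic device output
    print("haptic command:", intent)

raw_uv = [20e-6, 35e-6, -10e-6, 25e-6, 40e-6, 5e-6, -15e-6, 30e-6]
actuate(classify(digitize(extract_features(amplify(raw_uv)))))
```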
9. Limitations:
1. The place in the cortex where signals are generated differs from person to person, as each person is capable of reasoning differently and with different intensities.
2. The user must be trained well in all aspects in order to work mentally, which is a difficult job.
3. As electrodes are immersed in the brain, there is a chance of adverse effects, which can cause severe problems for the user.
4. The limitation of speed always hinders such systems from being efficient in their work.
5. This system is helpful only to partially paralyzed people and not to completely disabled ones, as there is no efficient way of providing feedback.
10. Conclusion: This system is capable of providing the support required for disabled people to do their desired work. A person, in this regard, can work as efficiently as other people.

11. References:
1. www.robots.ox.ac.uk
2. Toby Howard, "Brain Computer Interface", Personal Computer World magazine, February 1999.
3. Andrew Wright, "Brain Computer Interface", nac_ccc.si.cc.willams.edu.
4. www.haptic-e.org
5. www.forcefeedback.com
6. G. M. Lingaraju and Sandeep C. Senan, "Virtual reality and haptic", International Conference ICSCI-2004, Feb 12-15, 2004, Hyderabad.
7. www.sensable.com/community/index.htm
8. Workshop on haptic technology by Sandeep C. Senan, Sripad Rao and Shashank B., JNNCE, Shimoga.
9. Shreyas I. Shrigandha and Prabhanjana P., "6th Sense Haptic Device", TECHZONE 2005, JNNCE, Shimoga.