Brain Gate System
PAPER PRESENTATION
ON
BRAINGATE SYSTEM

Presented by

Narasimha Charyulu P.V., 07781A0457, 3rd B.Tech ECE, Email: [email protected]
Kamalesh Rathod, 07781A0430, 3rd B.Tech ECE, Email: [email protected]
A technique called neurofeedback uses sensors attached to the scalp to translate brain waves into information a person can learn from. The sensors register the different frequencies of the signals produced in the brain. Changes in these brain wave patterns indicate whether someone is concentrating or suppressing impulses, and whether he is relaxed or tense.
NEUROPROSTHETIC DEVICE:
A neuroprosthetic device known as Braingate converts brain activity into computer commands. A sensor is implanted on the brain, and electrodes are hooked up to wires that travel to a pedestal on the scalp. From there, a fiber optic cable carries the brain activity data to a nearby computer.
PRINCIPLE:
"The principle of operation of the BrainGate Neural Interface System is that, with intact brain function, neural signals are generated even though they cannot be sent to the arms, hands and legs. These signals are interpreted by the System, and a cursor is shown to the user on a computer screen, providing an alternate 'BrainGate pathway'. The user can use that cursor to control the computer, just as a mouse is used."
BrainGate is a brain implant system developed by the bio-tech company Cyberkinetics in 2003 in conjunction with the Department of Neuroscience at Brown University. The device was designed to help those who have lost control of their limbs or other bodily functions, such as patients with amyotrophic lateral sclerosis (ALS) or spinal cord injury. The computer chip, which is implanted into the patient, converts the user's neural activity into computer commands.
NEURO CHIP:
Currently the chip uses 100 hair-thin electrodes that 'hear' neurons firing in specific areas of the brain, for example, the area that controls arm movement. This activity is translated into electrical signals, which are then sent to and decoded by a program that can move either a robotic arm or a computer cursor. According to Cyberkinetics' website, three patients have been implanted with the BrainGate system. The company has confirmed that one patient (Matt Nagle) has a spinal cord injury, whilst another has advanced ALS.
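The decoding step described above can be sketched very simply: per-electrode spike counts in a time bin are combined with learned weights to produce a cursor velocity. This is a minimal illustrative linear decoder; the electrode count, weights, and function names are invented for illustration, not taken from the BrainGate software.

```python
# Hypothetical sketch: a linear decoder mapping one time-bin of
# per-electrode spike counts to a 2-D cursor velocity. In a real system
# the weights are fit during a calibration session; here they are made up.

def decode_velocity(spike_counts, weights_x, weights_y):
    """Map per-electrode spike counts to a (vx, vy) cursor velocity."""
    vx = sum(c * w for c, w in zip(spike_counts, weights_x))
    vy = sum(c * w for c, w in zip(spike_counts, weights_y))
    return vx, vy

# Example with 4 electrodes and made-up calibration weights.
wx = [0.5, -0.2, 0.1, 0.0]
wy = [0.0, 0.3, -0.1, 0.4]
print(decode_velocity([2, 1, 0, 3], wx, wy))
```

Each new bin of spike counts yields a fresh velocity, so running this in a loop moves the cursor continuously.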
In addition to real-time analysis of neuron patterns to relay movement, the Braingate array is also capable of recording electrical data for later analysis. A potential use of this feature would be for a neurologist to study seizure patterns in a patient with epilepsy. Braingate is currently recruiting patients with a range of neuromuscular and neurodegenerative conditions for pilot clinical trials in the United States.
WORKING:
Operating a BCI system is not simply a matter of listening to the user's EEG, as if one could tap into it and hear what is happening. The user usually generates some sort of mental activity pattern that is then detected and classified.
PREPROCESSING:
The raw EEG signal requires some preprocessing before feature extraction. This preprocessing includes removing unnecessary frequency bands, averaging the current brain activity level, transforming the measured scalp potentials to cortex potentials, and denoising.

Frequency bands of the EEG:

Band        Frequency [Hz]   Amplitude [µV]   Location
Alpha (α)   8-12             10-150           Occipital/Parietal regions
µ-rhythm    9-11             varies           Precentral/Postcentral regions
Beta (β)    14-30            25               Typically frontal regions
Theta (θ)   4-7              varies           Varies
Delta (δ)   <3               varies           Varies
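Removing unnecessary frequency bands amounts to measuring how much signal energy falls inside a band of interest. A minimal sketch, using a naive DFT from the standard library only (no real EEG toolbox), shows alpha-band energy dominating for a synthetic 10 Hz signal; the sampling rate and band edges follow the table above, everything else is invented for illustration.

```python
import math

def band_power(samples, fs, f_lo, f_hi):
    """Naive DFT band power: sum |X[k]|^2 over bins whose frequency
    lies in [f_lo, f_hi]. O(n^2), fine for a short demo window."""
    n = len(samples)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(x * math.cos(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            im = -sum(x * math.sin(2 * math.pi * k * i / n) for i, x in enumerate(samples))
            power += re * re + im * im
    return power

# Synthetic "alpha" activity: a pure 10 Hz sine sampled at 128 Hz for 1 s.
fs = 128
sig = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]
print(band_power(sig, fs, 8, 12) > band_power(sig, fs, 14, 30))  # True
```

A production system would use an FFT and proper band-pass filters instead, but the band-energy idea is the same.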
DETECTION:
Detecting the input from the user and then translating it into an action can be considered the key part of any BCI system. Detection means trying to identify these mental tasks from the EEG signal. It can be done in the time domain, e.g. by comparing amplitudes of the EEG, or in the frequency domain. This usually involves digital signal processing to sample and band-pass filter the signal, then calculating time- or frequency-domain features, and finally classifying them. Classification algorithms range from simple comparison of amplitudes to linear and non-linear equations and artificial neural networks. Through constant feedback from the user to the system and vice versa, both partners gradually learn more about each other and improve the overall performance.
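The classification step described above can be sketched with one of the simplest possible schemes: compare an extracted feature vector against per-class mean vectors learned during training (a nearest-centroid classifier). The class labels and feature values below are hypothetical, chosen only to illustrate the idea.

```python
# Illustrative sketch (labels and numbers invented): classify a feature
# vector extracted from an EEG window by finding the closest per-class
# mean feature vector (nearest-centroid classification).

def nearest_centroid(features, centroids):
    """Return the label whose centroid is closest in squared Euclidean distance."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

# Hypothetical centroids: e.g. [mu-band power over left cortex, over right].
centroids = {
    "move_left":  [0.9, 0.1],
    "move_right": [0.1, 0.9],
}
print(nearest_centroid([0.8, 0.2], centroids))  # move_left
```

More capable classifiers (linear discriminants, neural networks) replace the distance rule, but the feature-in, label-out interface stays the same.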
CONTROL:
The final part consists of applying the will of the user to the chosen application. The user selects an action by controlling his brain activity, which is then detected and classified into the corresponding action. Feedback is provided to the user by audio-visual means, e.g. when typing with a virtual keyboard, the letter appears in the message box.
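The control-and-feedback cycle above can be sketched as a small loop: each classified mental command nudges a cursor, and the updated position is the feedback shown to the user. The command labels and step sizes here are hypothetical.

```python
# Minimal sketch of the control/feedback loop: classified commands move a
# cursor, and each new position is reported back to the user as feedback.

def control_loop(commands, start=(0, 0)):
    """Apply a sequence of classified commands to a cursor position."""
    moves = {"up": (0, 1), "down": (0, -1), "left": (-1, 0), "right": (1, 0)}
    x, y = start
    for cmd in commands:
        dx, dy = moves.get(cmd, (0, 0))  # unrecognized commands do nothing
        x, y = x + dx, y + dy
        print(f"cursor at ({x}, {y})")   # the visual feedback step
    return x, y

print(control_loop(["up", "up", "right"]))  # (1, 2)
```

In a real system the loop runs continuously, with the classifier output feeding in at a fixed rate rather than from a pre-built list.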
TRAINING:
Training is the part where the user adapts to the BCI system. Training begins with very simple exercises in which the user is familiarized with the mental activity used to relay information to the computer. Motivation, frustration, fatigue, etc. also apply here, and their effects should be taken into consideration when planning the training procedures.
BIO FEEDBACK:
Biofeedback is information about a bodily process which is returned to the source that created it, so that the source can understand it and exert control over it. In BCI systems this biofeedback is usually provided visually, e.g. the user sees the cursor moving up or down, or a letter being selected from the alphabet.
NAGLE'S STATEMENT:
"I can't put it into words. It's just... I use my brain. I just thought it. I said, 'Cursor, go up to the top right.' And it did, and now I can control it all over the screen. It will give me a sense of independence."
OTHER APPLICATIONS:
Figure: Rats implanted with BCIs in Theodore Berger's experiments.

Several laboratories have managed to record signals from monkey and rat cerebral cortexes in order to operate BCIs that carry out movement. Monkeys have navigated computer cursors on screen and commanded robotic arms to perform simple tasks simply by thinking about the task, without any motor output. Other research on cats has decoded visual signals.

Figure: Garrett Stanley's recordings of cat vision using a BCI implanted in the lateral geniculate nucleus (top row: original image; bottom row: recording).

In 1999, researchers led by Garrett Stanley at Harvard University decoded neuronal firings to reproduce images seen by cats. The team used an array of electrodes embedded in the thalamus (which integrates all of the brain's sensory input) of sharp-eyed cats. The researchers targeted 177 brain cells in the thalamus's lateral geniculate nucleus, which decodes signals from the retina. The cats were shown eight short movies, and their neuron firings were recorded. Using mathematical filters, the researchers decoded the signals to generate movies of what the cats saw, reconstructing recognizable scenes and moving objects.
In the 1980s, Apostolos Georgopoulos at Johns Hopkins University found a mathematical relationship between the electrical responses of single motor-cortex neurons in rhesus macaque monkeys and the direction in which the monkeys moved their arms (based on a cosine function). He also found that dispersed groups of neurons in different areas of the brain collectively controlled motor commands, but was only able to record the firings of neurons in one area at a time because of technical limitations imposed by his equipment.[4] There has been rapid development in BCIs since the mid-1990s.[5] Several groups have been able to capture complex brain motor centre signals using recordings from neural ensembles (groups of neurons) and use these to control external devices, including research groups led by Richard Andersen, John Donoghue, Phillip Kennedy, Miguel Nicolelis, and Andrew Schwartz.
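Georgopoulos's cosine relationship can be written down directly: a neuron's firing rate varies as the cosine of the angle between the movement direction and the neuron's preferred direction. The sketch below illustrates the tuning-curve shape; the baseline and modulation rates are invented numbers, not measured values.

```python
import math

# Sketch of cosine tuning: a motor-cortex neuron fires fastest when the
# movement direction matches its preferred direction, and slowest for the
# opposite direction. Baseline and modulation values here are invented.

def firing_rate(theta, preferred, baseline=20.0, modulation=15.0):
    """Predicted firing rate (spikes/s) for movement at angle `theta` (radians)."""
    return baseline + modulation * math.cos(theta - preferred)

pd = math.pi / 2                      # a neuron that prefers upward movement
print(firing_rate(math.pi / 2, pd))   # peak rate for upward movement
print(firing_rate(-math.pi / 2, pd))  # minimum rate for downward movement
```

Summing many such neurons' preferred-direction vectors, each weighted by its firing rate, yields the "population vector" used to decode intended movement direction.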
Figure: Diagram of the BCI developed by Miguel Nicolelis and colleagues for use on rhesus monkeys.
Later experiments by Nicolelis using rhesus monkeys succeeded in closing the feedback loop and reproduced monkey reaching and grasping movements in a robot arm. With their deeply cleft and furrowed brains, rhesus monkeys are considered to be better models for human neurophysiology than owl monkeys. The monkeys were trained to reach for and grasp objects on a computer screen by manipulating a joystick, while corresponding movements by a robot arm were hidden. The monkeys were later shown the robot directly and learned to control it by viewing its movements. The BCI used velocity predictions to control reaching movements and simultaneously predicted hand gripping force. Other labs that develop BCIs and algorithms that decode neuron signals include those of John Donoghue at Brown University, Andrew Schwartz at the University of Pittsburgh, and Richard Andersen at Caltech. These researchers were able to produce working BCIs even though they recorded signals from far fewer neurons than Nicolelis (15-30 neurons versus 50-200 neurons). Donoghue's group reported training rhesus monkeys to use a BCI to track visual targets on a computer screen with or without the assistance of a joystick (closed-loop BCI).[10]
Schwartz's group created a BCI for three-dimensional tracking in virtual reality and