mand the movement of a mobile robot according to signals captured from the user’s brain. These signals are acquired and interpreted by the Emotiv EPOC device, a 14-electrode sensor that captures electroencephalographic (EEG) signals with high resolution, which in turn are sent to a computer for processing. A brain-computer interface (BCI) was developed based on the Emotiv software and SDK in order to command the mobile robot from a distance. Functionality tests were performed with the sensor to discriminate the shift intentions of a user group, as well as with a fuzzy controller to hold the direction in case of concentration loss. In conclusion, it was possible to obtain an efficient system for commanding robot movements.
Abstract—This paper presents a novel brain-computer interface (BCI) design employing visual evoked potential (VEP) modulations in a paradigm involving no dependency on peripheral muscles or nerves. The system utilizes electrophysiological correlates of visual spatial attention mechanisms, the self-regulation of which is naturally developed through continuous application in everyday life. An interface involving real-time biofeedback is described, demonstrating reduced training time in comparison to existing BCIs based on self-regulation paradigms. Subjects were cued to covertly attend to a sequence of letters superimposed on a flicker stimulus in one visual field while ignoring a similar stimulus of a different flicker frequency in the opposite visual field. Classification of left/right spatial attention is achieved by extracting steady-state visual evoked potentials (SSVEPs) elicited by the stimuli. Six out of eleven physically and neurologically healthy subjects demonstrate reliable control in binary decision-making, achieving at least 75% correct selections in at least one of only five sessions, each of approximately 12-min duration. The highest-performing subject achieved over 90% correct selections in each of four sessions. This independent BCI may provide a new method of real-time interaction for those with little or no peripheral control, with the added advantage of requiring only brief training.
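Left/right classification from SSVEPs, as in the paradigm above, is commonly done by comparing spectral power at the two flicker frequencies. Below is a minimal sketch; the function name, sampling rate, and the ±0.5 Hz bandwidth are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def classify_ssvep(eeg, fs, f_left, f_right):
    """Classify covert attention (left vs. right) by comparing spectral
    power at the two flicker frequencies.

    eeg     : 1-D array of occipital EEG samples
    fs      : sampling rate in Hz
    f_left  : flicker frequency of the left-field stimulus (Hz)
    f_right : flicker frequency of the right-field stimulus (Hz)
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def band_power(f0, width=0.5):
        # Sum power in a narrow band around the flicker frequency.
        mask = (freqs >= f0 - width) & (freqs <= f0 + width)
        return spectrum[mask].sum()

    return "left" if band_power(f_left) > band_power(f_right) else "right"
```

In practice the attended stimulus elicits a larger SSVEP at its flicker frequency over occipital electrodes, so the larger of the two band powers indicates the attended visual field.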
necessary for the communication or control task. The activity resulting from this process is often called motor output or efferent output. Efferent means conveying impulses from the central to the peripheral nervous system and further to an effector (muscle). Afferent, in contrast, describes communication in the other direction, from the sensory receptors to the central nervous system. For motion control, the motor (efferent) pathway is essential. The sensory (afferent) pathway is particularly important for learning motor skills and dexterous tasks, such as typing or playing a musical instrument. A BCI offers an alternative to natural communication and control. A BCI is an artificial system that bypasses the body’s normal efferent pathways, that is, the neuromuscular output channels. Instead of depending on peripheral nerves and muscles, a BCI directly measures brain activity associated with the user’s intent and translates the recorded brain activity into corresponding control signals for BCI applications. This translation involves signal processing and pattern recognition, which is typically done by a computer. Since the measured activity originates directly from the brain and not from the peripheral systems or muscles, the system is called a Brain–Computer Interface.
Amyotrophic lateral sclerosis, or ALS, is a degenerative disease of the motor neurons which results in complete paralysis of the patient. We are developing a wheelchair system that aids people suffering from ALS as well as SCPs. The system must be usable with minimal infrastructure modification. It must be safe and relatively low-cost, and must provide optimal interaction between the user and the wheelchair within the constraints of the brain-computer interface. Our control strategy involves controlling the wheelchair in a closed environment such as an office, home, or hospital. We have therefore proposed a new idea to aid such people by using their minds to control such devices with greater ease.
well as myoclonic seizures using a phase clustering index and reached a sensitivity of 85%. Their algorithm was not implemented and evaluated in an on-line setting, in contrast to the algorithm presented in the current paper. The inclusion of additional sleep criteria was intended to decrease the number of false alarms, but it also reduced the sensitivity of the prediction of SWDs from 88% to 45%. It needs to be pointed out that the same algorithm quickly detected all the remaining, unpredicted SWDs. In terms of SWD detection, this algorithm therefore still retains a sensitivity of 100% for SWD prediction and early detection combined. The inclusion of sleep criteria also reduced the number of false detections by 83%. We felt that, especially in the light of stimulation safety (with the aim to interfere as often as necessary but as little as possible), such an inclusion is useful. The putative negative effects of interference by high-frequency electrical stimulation of the brain were controlled by using behavioural activity as a read-out parameter, which did not change significantly between the baseline and stimulation sessions. In the long run, however, additional measures such as combined video-EEG analysis, a thorough histological inspection of the stimulation site, and analysis of sleep and wake states over longer (>24 h) stimulation sessions remain necessary for an accurate assessment of stimulation safety.
Chapter 3 Robotic Vehicle Control
measurement, noise cannot be avoided, but using digital signal processing the signal quality can be improved. However, simple filters are not sufficient in biomedical applications, especially in EEG signal processing. Often the noise cannot be removed using fixed filters, because doing so risks rejecting the original signal generated by the human brain. Thus adaptive algorithms should be used for noise removal. Later in this chapter, in Section 3.4.2, we will see that the SSVEP signal contains multiple harmonics; thus, besides adaptive algorithms, we can also use comb filters to enhance the SSVEP signals.
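A feedback comb filter has resonant peaks at a fundamental frequency and all of its harmonics, which is why it suits a multi-harmonic SSVEP. The sketch below is a minimal illustration, assuming a hypothetical stimulation frequency f0 that divides the sampling rate evenly; the feedback gain alpha is an illustrative choice, not a value from this chapter.

```python
import numpy as np

def comb_enhance(x, fs, f0, alpha=0.9):
    """Feedback comb filter y[n] = x[n] + alpha * y[n - D], with delay
    D = fs / f0 samples, boosting f0 and all of its harmonics.

    x     : 1-D array of EEG samples
    fs    : sampling rate in Hz
    f0    : SSVEP fundamental (stimulation) frequency in Hz
    alpha : feedback gain in (0, 1); closer to 1 gives sharper peaks
    """
    D = int(round(fs / f0))
    y = np.zeros(len(x), dtype=float)
    y[:D] = x[:D]                      # no feedback available yet
    for n in range(D, len(x)):
        y[n] = x[n] + alpha * y[n - D]
    return y
```

Frequencies midway between harmonics fall at the filter's notches and are attenuated, so the SSVEP components are enhanced relative to broadband noise.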
However, by far the most popular mental strategy for BCIs is motor imagery, that is, the imagined movement of a limb, usually an arm or both feet. The preparation of movement elicits a response in the EEG very similar to the response resulting from actual limb movement. As illustrated in Figure 2.6, this response can be observed as an ERD and ERS in the SMR. ERD and ERS patterns follow a homuncular organization: activity invoked by right-hand movement imagery is most prominent over C3, while activity invoked by left-hand movement imagery is most prominent over C4. Patterns which originate from the same cortical area are hard to discriminate. Coordination of both feet is handled in the same cortical area, so it is hard to discriminate between the left and right foot. Movements of each hand, in contrast, originate from opposite hemispheres and are therefore better suited as two distinct control signals for BCIs.
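Because right-hand imagery desynchronizes (reduces) mu-band power over C3 and left-hand imagery does the same over C4, a minimal left/right discriminator can simply compare band power at the two electrodes. The function names and band edges below are illustrative assumptions, not details from this chapter.

```python
import numpy as np

def mu_band_power(x, fs, band=(8.0, 12.0)):
    """Power in the mu/SMR band, estimated with a plain FFT."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum()

def classify_hand_imagery(c3, c4, fs):
    """ERD appears contralateral to the imagined hand, so the electrode
    with the LOWER mu power indicates the imagined side."""
    return "right" if mu_band_power(c3, fs) < mu_band_power(c4, fs) else "left"
```

Real BCIs refine this idea with subject-specific band selection and spatial filters (e.g. CSP), but the lateralized power comparison is the core of it.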
The SPINAL or vertebral column is a vital part of the body whose major functions are to protect the spinal cord, the nerve roots, and internal organs. Spinal cord injury occurs when damage to the spinal cord blocks communication between the brain and the body. When the spinal cord is injured, a person's sensory, motor and reflex messages are affected and may not function as usual. The higher the level of injury, the more dysfunction can occur, which may result in partial or complete paralysis of the body, including complete paralysis of the arms and legs. Persons with the highest level of spinal cord injury (SCI) can only control muscles from the neck and above. To gain independent mobility, a powered electric wheelchair with an alternative, hands-free interface is crucial, since a normal joystick is no longer viable. Such an interface can be developed by utilizing information generated from the eyes, tongue, voice or brainwaves. Nikhil Shinde and Kiran George proposed a new design, a "Brain-Controlled Driving Aid for Electric Wheelchairs". Their paper describes a BCI-based electric wheelchair driving aid that utilizes the mental concentration (EEG signals) and eye blinks (EMG signals) of the user. The design incorporates a safety controller with peripheral safety sensors that override the user command and stop the wheelchair when an obstacle is detected in its path. The experimental results show that the average success rate for detecting the blinks needed to change the direction of the wheelchair was 85%. Subjects took approximately 50 seconds to drive a predefined path of approximately 20 feet. This kind of wheelchair can be categorized as an intelligent wheelchair, as it operates based on a computer interface. Imran Ali Mirza and Nikhil Sharma designed another system, a "Mind-Controlled Wheelchair using an EEG Headset and Arduino Microcontroller".
An attempt has been made to propose a thought-controlled wheelchair, which captures signals from the brain and eyes and processes them to control the wheelchair. The electroencephalography (EEG) technique deploys an electrode cap placed on the user's scalp for the acquisition of the EEG signals, which are captured and translated into movement commands by the Arduino microcontroller, which in turn moves the wheelchair. The information collected from the action of the eyes, tongue, voice or brainwaves will then be
The principle of operation is quite simple. Two dry sensors are used to detect and filter the EEG signals. The sensor tip detects electrical signals from the forehead. At the same time, the sensor picks up ambient noise generated by human muscle, computers, light bulbs, electrical sockets and other electrical devices. The second sensor, an ear clip, serves as ground and reference, which allows the ThinkGear chip to filter out the electrical noise. The device measures the raw signal, the power spectrum (alpha, beta, delta, gamma, theta), attention level, meditation level and blink detection. The raw EEG data are received at a rate of 512 Hz; the other measured values are produced once per second. Therefore, the raw EEG data are the main source of information on EEG signals when using the MindWave MW001.
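The power-spectrum values the headset reports (alpha, beta, delta, gamma, theta) can be approximated from the 512 Hz raw signal with a plain FFT. The band edges below are conventional textbook values, not taken from the MindWave specification.

```python
import numpy as np

# Conventional EEG frequency bands in Hz (illustrative, not the
# headset's internal definition).
BANDS = {"delta": (0.5, 4.0), "theta": (4.0, 8.0), "alpha": (8.0, 13.0),
         "beta": (13.0, 30.0), "gamma": (30.0, 50.0)}

def band_powers(raw, fs=512):
    """Split a window of raw EEG samples into the classic band powers."""
    spectrum = np.abs(np.fft.rfft(raw)) ** 2
    freqs = np.fft.rfftfreq(len(raw), d=1.0 / fs)
    return {name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in BANDS.items()}
```

One second of raw data at 512 Hz yields 512 samples, giving 1 Hz frequency resolution, which matches the once-per-second update rate of the derived values.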
Once the BCI has predicted the user’s mental task, it sends the corresponding command to the computer, which performs the corresponding action. The user observes this response as feedback, completing the BCI cycle as shown in Figure 2.1. Possible applications include brain-controlled motorized wheelchairs, remotely-controlled assistive robots that can navigate a building [14, 15], improving rehabilitation methods, and even prosthetic limbs that respond to neural signals like a biological limb. Additionally, BCIs could provide an intuitive control method for able-bodied users teleoperating a robot in a remote location. This would potentially provide faster and more intuitive control, either alone or as an enhancement to traditional interfaces such as joysticks, voice control, or typing commands into a computer terminal.
Biosignals can be recorded using different techniques, based on which BCIs can be classified as: i) Invasive: the invasive method involves a surgical operation in which a microelectrode grid is implanted to record intracranial signals, either from the cortical surface or from inner brain tissues. Invasive BCIs carry more information than non-invasive ones, but they are high-risk and are not used in real-world applications; and ii) Non-invasive: this method does not require a surgical operation. fMRI and fNIRS are non-invasive methods; they measure the hemodynamic response, the delivery of blood to neuronal tissues, but they have low temporal resolution and are therefore not suitable for real-time control of MCUAVs. EEG and MEG are also non-invasive methods, measuring the brain's electric and magnetic fields respectively, and are the best-suited methods.
an environment which is complex and aids sound analysis. In a recently developed brain-computer interface system at the University of Bremen, the makers of the project built a human-machine interface (HMI) semi-autonomous robot named FRIEND II, which was executed and compiled under the MASSiVE control architecture. Learning from these projects, this paper presents a brain-computer interface system in which we apply a machine learning technique, the support vector machine, to the visual patterns of the signals to navigate a robot using an Arduino Duemilanove microcontroller. In order to extract signals for the brain-computer interface, we use an Emotiv headset which captures neuro-signals. After fetching signals from the brain, we denoise them, filtering out the noise and unwanted interruptions from the input signals. We then apply the support vector machine algorithm to classify the signals into intents of our choice in order to navigate a robot. We used an Arduino Duemilanove microcontroller to transfer the output signals in order to control the robot using imagined movements.
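The classification step described above can be sketched with scikit-learn's SVC. The two-dimensional feature vectors and intent labels below are hypothetical placeholders for whatever features are actually extracted from the denoised Emotiv signals.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical band-power feature vectors, one row per denoised EEG
# window; the two imagined-movement intents are placeholder labels.
rng = np.random.default_rng(0)
X_forward = rng.normal(loc=[1.0, 0.0], scale=0.1, size=(20, 2))
X_left = rng.normal(loc=[0.0, 1.0], scale=0.1, size=(20, 2))
X = np.vstack([X_forward, X_left])
y = ["forward"] * 20 + ["left"] * 20

clf = SVC(kernel="linear").fit(X, y)     # train on labelled intents
command = clf.predict([[0.9, 0.1]])[0]   # classify a new EEG window
```

The predicted label would then be mapped to a movement command and sent to the Arduino over serial.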
In brain-computer interface research aimed at directly controlling computers, temporal resolution is of utmost importance, since users have to adapt their brain activity based on immediate feedback provided by the system. For instance, it would be difficult to control a cursor without interactive input rates. Hence, even though the low spatial resolution of these signals leads to a low information transfer rate and poor localization of brain activity, most researchers currently adopt EEG because of the high temporal resolution it offers. However, in more recent attempts to use brain-sensing technologies to passively measure user state, good functional localization is crucial for modeling the users' cognitive activities as accurately as possible. The two technologies are nicely complementary, and researchers must carefully select the right tool for their particular work.
Abstract: A brain-computer interface (BCI) is a direct communication pathway between the brain and an external device. The concept of BCI was coined in the 1970s at the University of California, Los Angeles (UCLA). Growth in neuroscience and brain imaging technologies allows us to interface directly with the human brain. This is made possible with the help of sensors that can monitor the physical processes that occur within the brain. In this technology, the communication system does not depend on the brain's normal output pathways of peripheral nerves and muscles. Instead, the user manipulates their brain activity to produce signals that can be employed to control a computer. Practical use of BCI depends on the development of appropriate applications and attention to the needs and desires of individual users.
A brain-computer interface (BCI) is an interface established between a brain and a device that receives signals from the brain to command some external activity, with the help of a computer or other electro-mechanical device. It translates neuronal information into command signals capable of controlling external software or hardware. This interface facilitates a direct communication pathway between the brain and the target object. A BCI intercepts signals from neurons and uses a proper combination of hardware and software to translate the signals into commands, thereby enabling a disabled person to perform desired tasks or control a mechanized wheelchair or prosthetic limb through thoughts alone. At present, brain-interface devices need deliberate conscious thought, but efforts are under way to develop future applications, such as prosthetic control, that work effortlessly. The development of electrodes or sensors for fetching brain signals is one of the biggest challenges in BCI technology; it is also expected that surgical methods for implanting these electrodes will become minimally invasive. Traditionally, in BCI the brain accepts an implanted mechanical device and controls it by producing the desired signals. Current research in this area is focused on the development of non-invasive BCI. The brain is practically "disconnected" from its target (such as a limb or the facial musculature) in spinal cord injury, brainstem stroke, and a host of neuromuscular disorders, thereby preventing mobility and movement. BCIs are often used as assisted-living devices for persons with sensory or motor impairments (Wolpaw et al, 2002; Daly et al, 2008). They are a particularly helpful aid for individuals who suffer from severe motor disabilities.
Researchers have been working with BCIs based on electroencephalography (EEG) with the goal of helping persons with motor impairments such as spinal-cord injury (SCI) or amyotrophic lateral sclerosis, as well as stroke survivors (Daly et al, 2008).
An asynchronous, very-low-information-transfer-rate BCI was used to generate control signals for a wheelchair equipped with path planning, collision/obstacle avoidance, voluntary stopping, and the ability to reach nine destinations. The design was derived from a support vector machine and a linear classifier for ERD. A BCI user-control scheme generating left and right turn movements, and timer-based forward movements, from motor imagery was implemented. A P300-based BCI system was implemented to select a preferred position from a list of predefined locations as a driving signal to guide a wheelchair in a known environment; the ability to terminate the movement is derived from an ERD signal or a fast P300 signal. A self-reliant system was developed by combining a P300 BCI and a navigation system; this system exhibited the ability to drive a wheelchair in an unknown environment. Another work proposed a Bayesian-network-based brain-controlled robot consisting of a navigation system capable of identifying the most feasible solution and a decision-making system driven by error-related EEG signals. A safe wheelchair navigation mechanism was implemented using a steady-state visual evoked potential based BCI to traverse in four directions.
For many years people have speculated that electroencephalographic activity or other electrophysiological measures of brain function might provide a new non-muscular channel for sending messages and commands to the external world – a brain–computer interface (BCI). Over the past 15 years, productive BCI research programs have arisen. Encouraged by new understanding of brain function, by the advent of powerful low-cost computer equipment, and by growing recognition of the needs and potentials of people with disabilities, these programs concentrate on developing new augmentative communication and control technology for those with severe neuromuscular disorders, such as amyotrophic lateral sclerosis, brainstem stroke, and spinal cord injury. The immediate goal is to provide these users, who may be completely paralyzed, or ‘locked in’, with basic communication capabilities so that they can express their wishes to caregivers or even operate word processing programs or neuroprostheses. Present-day BCIs determine the intent of the user from a variety of different electrophysiological signals. These signals include slow cortical potentials, P300 potentials, and mu or beta rhythms recorded from the scalp, and cortical neuronal activity recorded by implanted electrodes. They are translated in real-time into commands that operate a computer display or other device. Successful operation requires that the user encode commands in these signals and that the BCI derive the commands from the signals. Thus, the user and the BCI system need to adapt to each other both initially and continually so as to ensure stable performance. Current BCIs have maximum information transfer rates up to 10–25 bits/min. This limited capacity can be valuable for people whose severe disabilities prevent them from using conventional augmentative communication methods. At the same time, many possible applications of BCI technology, such as neuroprosthesis control, may require higher information transfer
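The 10–25 bits/min figures quoted here follow the standard Wolpaw information transfer rate, B = log2 N + P log2 P + (1−P) log2((1−P)/(N−1)) bits per selection, multiplied by the selection rate, where N is the number of possible targets and P the selection accuracy. A small sketch:

```python
import math

def wolpaw_itr(n, p, selections_per_min):
    """Wolpaw information transfer rate in bits per minute.

    n : number of possible targets per selection
    p : probability that a selection is correct (accuracy)
    """
    bits = math.log2(n)
    if 0.0 < p < 1.0:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * selections_per_min
```

A perfect binary selector at 10 selections/min yields 10 bits/min; accuracy at chance level (p = 1/N) yields 0, which is why raw selection rate alone overstates a BCI's capacity.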
A brain-computer interface (BCI) is a collaboration between the brain and a device that enables signals from the brain to direct some external activity, for example control of a cursor or a prosthetic limb. The interface enables a direct communication pathway between the brain and the object to be controlled. BCIs can be classified as invasive and non-invasive. In invasive BCIs, the electrical activity of the brain is recorded from inside the head with one or more microelectrodes, which can record the activity of a single neuron. Non-invasive BCIs depend on the EEG measured from the scalp. BCIs can be utilized for communication, to access a computer, or to control devices such as a wheelchair or prosthetic arm, among other applications. Essentially, anything that can be controlled by a computer could potentially be controlled by a BCI. BCIs are being examined as rehabilitation devices to help individuals regain motor skills lost to stroke, and additionally as prosthetic devices to replace or compensate for motor skills that will not return.
Our BCI mouse uses a flashing-stimuli protocol with some similarities to the P300-based BCI mice described in the previous section. More specifically, we used visual displays showing 8 circles (with a diameter of 1.5 cm) arranged around a circle at the centre of the display, as in Figure 1 (far left). Each circle represents a direction of movement for the mouse cursor. Circles temporarily changed colour – from grey to either red or green – for a fraction of a second. We will call this a flash. The aim was to obtain mouse control by mentally focusing on the flashes of the stimulus representing the desired direction and mentally naming the colour of the flash. Flashes lasted for 100 ms and the inter-stimulus interval was 0 ms. Stimuli flashed in clockwise order. This meant that the interval between two target events for the protocol was 800 ms. We used a black background, grey neutral stimuli and either red or green flashing stimuli.
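The 800 ms target interval follows directly from the protocol parameters: with 8 stimuli flashed in strict clockwise order, the attended stimulus flashes again only after every other stimulus has flashed once.

```python
# Timing of the flashing-stimuli protocol described above.
n_stimuli = 8        # circles arranged around the display centre
flash_ms = 100       # duration of each flash
isi_ms = 0           # inter-stimulus interval

# One full clockwise cycle brings the attended (target) stimulus
# round again, so this is the interval between two target events.
target_interval_ms = n_stimuli * (flash_ms + isi_ms)
```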
BCI has many applications, especially for disabled persons. It reads the waves produced by the brain and translates these signals into actions and commands that can control a computer. A brain-computer interface (BCI) can enable such physically challenged people to achieve greater independence by making technology accessible. BCI technology provides an alternative communication channel between the human brain and a computer, one that does not depend on the brain's normal output channels of peripheral nerves and muscles. The three conditions most commonly cited in the BCI literature as causes of locked-in syndrome are ALS, high spinal cord injury and brainstem stroke.