About this Article
Written by: Evan Amato
Written on: May 8th, 2014
Tags: biomedical engineering, health & medicine
About the Author
Evan is a junior biomedical engineer from Menlo Park, California. He enjoys writing, staying up-to-date on cutting-edge biotechnology, and long walks on the beach. He is enthralled by the potential and possibilities our brains conceal.

Volume XVI Issue II > "Use the Force, Luke"
Since the introduction of EEG technology in 1924, the reading and processing of neural signals has reached a remarkable level of sophistication. This has allowed the invention and development of the brain-machine interface (BMI), which provides a direct connection from the brain to the surrounding world. In many cases, BMIs allow users to control or communicate with their environment when they physically cannot, whether because of spinal cord injury or a debilitating neuromuscular disorder. Promising applications to aid disabled persons include BMI-integrated prosthetics and, looking further down the road, the concept of the Smarthouse.
“Use the force, Luke”
Your alarm goes off in the morning and you are helped to your feet, as you were yesterday and as you will be tomorrow. You might have ALS, otherwise known as Lou Gehrig’s disease. You might have cerebral palsy, or multiple sclerosis. You might have suffered a spinal cord injury, or a traumatic brain injury. Your muscle function is severely impaired, and you have trouble getting to the bathroom, turning on the television, using your telephone. You have a neuromuscular disorder, and you depend on daily assistance to complete tasks as seemingly simple as getting out of bed each morning. However, you are able to help yourself by means of technology: the use of a brain-machine interface (BMI).

A Brief History of the BMI

In 1924, Hans Berger, a German psychiatrist, became the first to record human brain signals, noting – among other phenomena – the patterns observable during mental effort. He later published a paper entitled "Über das Elektrenkephalogramm des Menschen" (in English, "On the Electroencephalogram of Man"), thus coining the term electroencephalography [1].
Known as an EEG, the electroencephalogram gave Berger and neuroscientists around the world access to elusive, formerly mysterious neural signals. This in turn revealed an exciting possibility: what can we do with these signals?
William Grey Walter pioneered the pursuit of an answer with his slide projector experiment in 1963. A neurophysiologist, roboticist, and native Missourian, Walter used Berger's EEG technology to record the brain activity of his patients while instructing them to advance the slides on a projector using a given remote control [2]. Walter did not, however, tell his patients that the remote was a fake: the projector was actually connected to an EEG machine and responded to the neural signals of the test subjects. Each subject, albeit unknowingly, successfully advanced the slide projector with thought alone. This trial marked the first use of a brain-machine interface [3].
Partnered with modern computing capability and biological understanding of the brain, BMI technology has turned the stuff of George Lucas’ wild imagination into a very promising reality—the ability to directly influence the environment with the brain. A thorough understanding of BMIs and how they are used, however, necessitates an understanding of the biology they so successfully mimic.

From Biology to Technology

The human body controls or communicates with its environment through a series of steps, beginning with a single thought of intent. This spark of intention activates various centers in the brain, which send signals via the peripheral nervous system to the involved muscles. These muscles perform an action, completing what is called the efferent pathway. Inversely, the afferent pathway follows information as it travels from sensory receptors back to the brain [4].
A brain-machine interface, otherwise called a brain-computer interface, mind-machine interface, or direct neural interface, serves as an alternative to the efferent pathway. Figure 1 depicts the basic biological system as well as this alternate channel. As its names suggest, a BMI provides a direct connection from the brain to a computer, machine, or other device that can interact with the user’s surroundings. In the common case that a user has a debilitating neuromuscular disorder, the BMI allows users to interact with their environments even when their bodies cannot.

How a Brain-Machine Interface Works

A BMI usually comprises two components: the first reads, interprets, and translates brain activity; the second provides the user with feedback – confirmation that the task has been completed – in real time. These elements roughly parallel the efferent and afferent pathways mentioned above, and both are essential to successful function. A BMI's complexity, however, stems from the first component.
Most modern BMIs utilize Hans Berger's invention, the electroencephalogram, to read neural signals because EEGs are relatively inexpensive and non-invasive. Most appealing, however, is the EEG's high temporal resolution, which means it can identify signal changes within a very precise window of time [5]. After an EEG records the brain signals, a computer must interpret and translate them. This computer bears a complex load, as the signals are the product of connections among billions of neurons. Processing generally proceeds in three stages: preprocessing, feature extraction, and feature selection and classification [4].
Figure 2 summarizes the flow of signal processing, beginning with preprocessing. This step focuses on filtration, and thus maximization of the signal-to-noise ratio (SNR). Eye twitching, muscle movement, and electrical error can all contribute to signal noise, which simple frequency filters remove. As its name suggests, preprocessing primarily prepares the signals for later analysis. Preparation continues with feature extraction, which involves increasingly specialized filters to extract signals with certain frequencies, amplitudes, and points of origin within the brain. Finally, feature selection and classification incorporates computer memory to recognize the extracted signals and catalog them accordingly. The computer knows, for instance, that a 23-Hz signal originating from the appropriate region of the brain indicates intended movement of the right hand. With extensive preprogramming, a computer can identify a number of signals, accurately differentiating among signals to move a hand, a foot, or a toe. This is, of course, an enormous oversimplification of the identification process, which in reality involves mathematically advanced pattern recognition algorithms [4][5].
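The three stages can be sketched in code. The toy example below is purely illustrative (the sampling rate, frequency bands, and labels are invented for the demonstration, and real BMIs use far more sophisticated classifiers): a band-pass filter stands in for preprocessing, per-band power computed from the FFT stands in for feature extraction, and picking the band with the most power stands in for classification.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # assumed sampling rate in Hz

def preprocess(signal, low=8.0, high=30.0):
    """Stage 1: band-pass filter to suppress noise outside the band of interest."""
    b, a = butter(4, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, signal)

def extract_features(signal, bands):
    """Stage 2: compute the signal power inside each candidate frequency band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    power = np.abs(np.fft.rfft(signal)) ** 2
    return [power[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]

def classify(features, labels):
    """Stage 3: catalog the signal as the label whose band carries the most power."""
    return labels[int(np.argmax(features))]

# Synthetic "EEG": a 23-Hz rhythm buried in broadband noise.
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1.0 / FS)
raw = np.sin(2 * np.pi * 23 * t) + 0.8 * rng.standard_normal(t.size)

bands = [(10, 14), (14, 20), (20, 26)]          # hypothetical bands
labels = ["foot", "tongue", "right hand"]        # hypothetical label mapping
features = extract_features(preprocess(raw), bands)
print(classify(features, labels))  # the 20-26 Hz band dominates -> "right hand"
```

Real systems replace the final step with trained pattern-recognition algorithms, but the pipeline shape – clean, extract, classify – is the same.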

Motor Imagery

When considering signal recognition, it is important to distinguish it from literally reading thoughts. A brain-machine interface is not a robotized psychic, nor a clairvoyant creation of lights and clockwork. No matter how hard you imagine a number between 1 and 10, a BMI cannot guess it.
Instead, a BMI interprets the electrical output of the brain, which requires pattern recognition and thus a level of congruity among neural signals. This congruity is achieved through mental strategy. Most often, the user's mental strategy is to imagine moving a body part, a technique referred to as motor imagery. The imagery produces an oscillation in the neural signal called a sensorimotor rhythm (the name given to fluctuations originating from sensory and motor regions of the brain) almost identical to that of the corresponding physical movement. Common imagery that produces reliable and readable oscillations includes movement of the hands, feet, and tongue [4].
Let us return to the failed guessing game with a solution. First, you preprogram the BMI with a mental strategy for numbers 1 through 10, each with different motor imagery. Movement of the left foot corresponds to 1; movement of the left arm corresponds to 2; movement of the tongue to the roof of your mouth corresponds to 3; and so on, with clenching the right fist assigned to 7. Now, when you imagine clenching your right fist, the BMI can correctly guess the number 7.
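The guessing game boils down to a lookup table: once the BMI has recognized which imagery the user performed, mapping it to a number is trivial. A minimal sketch (the imagery names and number assignments are the hypothetical ones from the paragraph above):

```python
# Hypothetical preprogrammed mapping from recognized motor imagery to numbers.
IMAGERY_TO_NUMBER = {
    "left foot": 1,
    "left arm": 2,
    "tongue": 3,
    # ...remaining assignments elided...
    "right fist": 7,
}

def guess_number(recognized_imagery):
    """Return the number agreed upon for the imagery the BMI recognized."""
    return IMAGERY_TO_NUMBER.get(recognized_imagery)  # None if unassigned

print(guess_number("right fist"))  # -> 7
```

The hard part, of course, is everything before the lookup: reliably classifying which imagery was performed from noisy EEG data.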


Modern Applications

BMIs have reached a remarkable level of sophistication. Far more often than guessing numbers, brain-machine interfaces translate signals into commands that guide devices to interact with the environment. Many of these applications are already on the market, and many more soon will be.
BMIs are being researched for use in military training, computer gaming, virtual reality, and robotics. Engineers at the University of Minnesota, for instance, have developed a thought-controlled flying machine: a quadcopter. Using an EEG electrode cap to detect motor imagery, Wi-Fi to transmit the captured signals, and a mounted camera to provide user feedback, the team can fly their copter in impressively complicated flight patterns [6]. Arguably the most successful area of application in recent years, however, is the field of prosthetic limbs. BMIs read the signals in the nerve endings of an amputated limb, translate them to direct a prosthetic replacement, and complete the afferent feedback pathway that sends signals back to the brain. This technology allows remarkably increased control of prosthetic limbs, restoring close to pre-injury ability [7]. Figure 3 displays an excellent example of a successful prosthetic using BMI technology.
BMIs have vast potential to help people with disabling paralysis or neuromuscular disease, who often struggle with transportation, communication, and general self-sufficiency because of limited muscle control. On that innovative horizon, there is arguably no idea more exciting than the Smarthouse.

The Smarthouse

The Smarthouse is an integrated system of brain-machine interfaces designed to let a disabled individual navigate the home and operate basic household tools and appliances without the physical ability to do so. Such a system would give paraplegics, quadriplegics, and the elderly alike a greatly heightened level of independence. This automated home involves three major categories of application: communication, movement, and environmental control [4]. Communication allows users to surf the web or use the telephone, while movement encompasses everything from the aforementioned prostheses to BMI-integrated wheelchairs and stair lifts. Finally, environmental control is the namesake of the concept and ties everything together: the doors, windows, lights, and sound system of a Smarthouse respond to thought, or are "smart."
BMI integration competes with technologies such as eye-tracking and speech recognition. These alternate methods allow disabled people to operate wheelchairs, type on a computer, and communicate with those around them. However, eye tracking is expensive and prone to interference, and speech recognition is only useful to users who can speak freely, which is not the case for people with severe neuromuscular diseases like ALS and cerebral palsy. This is not to say that BMI technology, in terms of Smarthouse integration, does not have its own problems. The apparatus used to read brain signals, usually an electrode cap, can be unattractive and cumbersome to apply; users would not want to wear it all the time, yet they would also not want to frequently remove and replace it. In signal processing, there are further issues of limited bandwidth, reliability, and speed. The single most important ingredient for a successful Smarthouse, though, is the development of a singular interface. For such large-scale application to be reasonable, a user must be able to control any device in the home with a single brain-machine interface. In other words, we must establish a universal BMI system [4].
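One way to picture a universal BMI system is as a single command layer to which any household device can be registered. The sketch below is purely conceptual (the class, command names, and device actions are all invented for illustration): one interface receives classified brain signals and dispatches them to whichever device was bound to that command.

```python
# Conceptual sketch of a "universal" BMI command layer: one interface,
# many registered household devices. All names here are hypothetical.
class UniversalBMI:
    def __init__(self):
        self.bindings = {}  # recognized command name -> action to run

    def register(self, command, action):
        """Bind a classified brain signal (by name) to a household action."""
        self.bindings[command] = action

    def dispatch(self, command):
        """Execute the action bound to a recognized command, if any."""
        action = self.bindings.get(command)
        return action() if action else "unrecognized command"

home = UniversalBMI()
home.register("lights on", lambda: "lights: on")
home.register("open curtains", lambda: "curtains: open")
print(home.dispatch("lights on"))  # -> "lights: on"
```

The engineering challenge the article describes is precisely that no such shared layer exists yet: each real device tends to ship with its own incompatible interface.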
Exciting progress on the Smarthouse concept can be found around the world. g.tec, an Austrian medical engineering company, has prototyped a smart home virtual reality, which is successfully navigable with BMI technology after a few hours of training [8]. Thought-Wired, from Auckland, New Zealand, has created a BMI that can toggle between several commands to use a telephone and turn on lights around a home [9]. A group called BrainAble attempts to address the issue of a universal system by combining different signal reading techniques with various mechanisms for physical control [10]. Notwithstanding this promising evolution, the Smarthouse concept is still just that—a concept. It will likely be decades before the BMI technology necessary for a Smarthouse is fully operational and sees widespread use.

Looking to the Future

Brain-machine interface technology has advanced tremendously in the past several years. From more efficient processing techniques to flying quadcopters to functional prosthetics, the level of technological advancement today would have been considered something from a Hollywood sci-fi movie no more than 40 years ago. To predict the future of BMIs, perhaps we should look to science fiction.
Your alarm goes off in the morning and you maneuver your wheelchair to your bedside, where it stays as you help yourself into the seat. You roll into the kitchen and open the window curtains, get yourself a glass of water, and turn on the television. You have a neuromuscular disorder, and your muscle function is severely impaired. Yet you are able to do all of this without lifting a finger. Your house is fully automated, and it responds to your thoughts via complex brain-machine technology. You struggled to get out of bed each morning for years before the Smarthouse, but now you can do all of this by yourself—unless, of course, you count the C-3PO-esque assistant that may very well be making you breakfast.


    • [1] L. F. Haas. (2003, January). "Hans Berger (1873-1941), Richard Caton (1842-1926), and electroencephalography," Journal of Neurology, Neurosurgery, and Psychiatry, vol. 74 [Online]. Available: content/74/1/9.full.html.
    • [2] A. E. LeBouthillier. (2010). W. Grey Walter and his Turtle Robots [Online]. Available: content/w-grey-walter-and-his-turtle-robots.
    • [3] S. Hays, J. Robert, et al. "The Age of Neuroelectronics," in Nanotechnology, the Brain, and the Future, 2013, ch. 7, p. 130.
    • [4] B. Graimann, B. Allison, et al. "Brain-Computer Interfaces: A Gentle Introduction," in Brain-Computer Interfaces, 2010, ch. 1, pp. 1-27.
    • [5] R. Hornero, R. Corralejo, et al. (2008). Brain Computer Interface (BCI) systems applied to cognitive training and home automation control to offset the effects of ageing [Online]. Available: /lychnos/en_en/articles/Brain-Computer-Interface
    • [6] K. LaFleur, K. Cassady, et al. "Quadcopter control in three-dimensional space using a noninvasive motor imagery-based brain–computer interface," Journal of Neural Engineering, vol. 10, June 2013.
    • [7] M. Pandika. (2013, October). Researchers Developing Brain-Controlled Prosthetic Devices [Online]. Available: http://www.usatoday.com/story/news/health/2013/10/30/brain-control-prosthetics/3316343/
    • [8] (2009, May). Virtual Smart Home Controlled By Your Thoughts [Online]. ScienceDaily. Available: /releases/2009/05/090511091733.htm.
    • [9] J. Ford. (2011, May). Thought-Wired Allows Disabled to Control Home Appliances With Mind Alone [Online]. Available: http://singularityhu…ught-wired-allows-disabled-to-control-home-appliances-with-mind-alone/.
    • [10] F. Miralles. (2013, May). European Brain Research: successes and new challenges [Online]. Available: /research/conferences/2013/brain-month/pdf/s3_2_felip_miralles_connecting_the_disabled_to_their_physical_and_social_world.pdf.