Artificial Intelligence: I know that the moment I mention this topic, your thoughts go straight to the movie The Terminator: machines built to look, act, and sound like humans, intent on taking over the world. Don’t worry, a robot takeover probably won’t happen any time soon.
What is closer than we might realize is the integration of Artificial Intelligence (AI) into our everyday world.
The concept has been developing for years, but it has never been closer to reality than it is today; Artificial Intelligence is finally living up to its potential. So it’s worth being clear about what we actually mean when we talk about Artificial Intelligence.
As silly as it may sound, ‘artificial’ may not be as clearly understood as it seems, and ‘intelligence’ means something slightly different to each of us. The human brain, as the home of natural intelligence, is of course the birthplace of artificial intelligence. The brain hums with electrical activity that translates into motor movement, spoken and nonverbal communication, situational processing, and, at the most basic level, our ‘fight or flight’ response.
A Story of a Monkey and AI Changing Lives
Dr. Miguel Nicolelis, Professor of Neuroscience at Duke University (Durham, NC), became fascinated with ‘brain storms’: the electrical activity produced as neurons fire among the billions of cells in the human brain. Dr. Nicolelis’ research recorded this neural activity and decoded the firing patterns as a sort of alphabet for communicating with extensions and devices outside the body.
From 2000 to 2001, Dr. Nicolelis and his team developed a brain-machine interface with a multi-channel sensor system to receive the brain’s electrical signals. The system then processed those signals as part of a real-time analysis of brain activity, looking specifically for signals connected to motor movement: raising an arm, shifting a foot, flexing the fingers, or even standing up from a seated position.
Any motor movement information was then sent through a telemetry processor to a 3D artificial limb, such as a robotic arm. But the question remained: how well did the translation of motor movement work, from the brain’s electrical impulses to the robotic arm? Dr. Nicolelis’ team started experimenting with a rhesus monkey named Aurora in early 2003. The research team monitored and recorded Aurora’s brain activity while she played a simple computer game with a joystick. When Aurora completed a basic challenge in the game, she received an automated drip of Brazilian orange juice as her reward.
Aurora’s ‘brain storms’ were fed through the brain-machine interface to a computer and a robotic arm, which could then begin learning the impulses behind her movements in the game. When Dr. Nicolelis’ research team switched her to the brain-machine interface after thirty days, Aurora was able to simply think of the direction to move the cursor in the computer game, and the robotic arm responded based on her brain activity alone.
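To make the decoding step in these experiments concrete, here is a minimal, hypothetical sketch of the kind of translation a brain-machine interface performs: many channels of recorded firing rates are mapped, in this case by a simple linear model, to a velocity command for a robotic arm. The channel count, weights, and function names below are my own illustrative assumptions, not Dr. Nicolelis’ actual software.

```python
import numpy as np

# Hypothetical sketch of a brain-machine interface decoding loop.
# All numbers (channel count, weights, firing rates) are placeholders.

N_CHANNELS = 96   # assumed size of a multi-channel electrode array
DT = 0.1          # decode one 100 ms window of activity per step

rng = np.random.default_rng(0)

# In a real system these weights would be fit to data recorded while
# the subject moves a joystick (as Aurora did); here they are random.
W = rng.normal(scale=0.01, size=(3, N_CHANNELS))  # firing rates -> (vx, vy, vz)

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Translate one window of per-channel firing rates (spikes/s)
    into a 3D velocity command for the robotic arm."""
    return W @ firing_rates

# Simulated real-time loop: read the latest firing rates, decode a
# velocity, and integrate it into an arm position.
arm_position = np.zeros(3)
for _ in range(50):
    rates = rng.poisson(lam=10.0, size=N_CHANNELS).astype(float)  # fake data
    arm_position += decode_velocity(rates) * DT

print("final arm position:", arm_position)
```

A linear read-out like this is only the simplest possible stand-in for the real-time analysis the team performed, but it captures the core idea: neural activity in, movement command out.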
Two Monkeys Changing The World
In a similar effort, researchers from the University of Pittsburgh and Carnegie Mellon University worked with two macaque monkeys to establish brain control over a robotic arm. The scientists identified 100 motor neurons, had a computer analyze their electrical activity, and translated that activity into electronic commands to move the robotic arm. The arms were mounted flush with the macaques’ left shoulders, and the computer initially helped the monkeys move the robotic arm to establish motion control.
As the monkeys learned to adopt the movements, the research team noticed adaptations of movement that could not have been anticipated in virtual environments. The results show the brain’s remarkable ability to adopt, adjust to, and use a prosthetic appendage based solely on motor activity fired in a specific area of the cortex. Dr. John F. Kalaska, a neuroscientist at the University of Montreal, noted after seeing the macaques’ progress that, “[Brain-activated prosthetic limb adoption] would allow patients with severe motor deficits to interact and communicate with the world not only by the moment-to-moment control of the motion of robotic devices, but also in a more natural and intuitive manner that reflects their overall goals, needs and preferences.”
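The detail that the computer initially helped the monkeys move the robotic arm describes a form of shared control. The short sketch below shows one common way such assistance can be blended with the decoded brain signal and then gradually withdrawn as training progresses; the blending schedule and names are assumptions for illustration, not the Pittsburgh and Carnegie Mellon teams’ published method.

```python
import numpy as np

def blended_command(decoded: np.ndarray,
                    assist: np.ndarray,
                    training_progress: float) -> np.ndarray:
    """Blend a brain-decoded movement command with a computer-generated
    assist. training_progress runs from 0.0 (start of training, mostly
    machine assistance) to 1.0 (full brain control). The linear schedule
    is an illustrative assumption."""
    alpha = float(np.clip(training_progress, 0.0, 1.0))
    return alpha * decoded + (1.0 - alpha) * assist

# Early in training, the assist toward the target dominates;
# later, the decoded signal does.
decoded = np.array([0.2, -0.1, 0.0])   # command from the neural decoder
assist = np.array([0.5, 0.5, 0.0])     # ideal move toward the food target
print(blended_command(decoded, assist, training_progress=0.1))
print(blended_command(decoded, assist, training_progress=0.9))
```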
So, if the brain is capable of producing the right kind of data to control one prosthetic limb, could it control more than one at a time? Dr. Nicolelis and his research team posed that very question while continuing to study Aurora’s ‘brain storms’. By this point, Aurora was controlling a single robotic arm located over 7,000 miles away at Kyoto University in Japan’s Kyoto Prefecture. The control signal between Aurora’s brain and the robotic arm at Kyoto University registered as 20 milliseconds faster than the signals traveling between her brain and the other muscles in her own body.
The Duke University research team added a second monkey to the experiment… and a second robotic arm for both monkeys. Implants in the monkeys’ brains tracked and translated between 374 and 497 motor-controlling neurons to send the appropriate signals to the robotic arms, and the two rhesus monkeys successfully controlled both arms at the same time using a new and improved bimanual brain-machine interface. The results are promising because, of course, the ultimate goal isn’t just to allow perfectly functional monkeys to control robotic arms. The hope is to give paraplegics and amputees the brain-controlled capabilities to enjoy life without limits.
What Does This Mean For Humans?
To put it simply, this research shows that our brains have the ability to form new pathways. How does this translate into our daily lives? It means that, while you are at work, your brain could one day be controlling a robot at home cleaning your house. It takes the concept of multitasking to a whole new level.
Think about the possible implications!
But how can artificial intelligence be applied to help quadriplegics? What if the brain’s motor commands can no longer reach the muscles at all? The same advances in brain-machine interface technology are now allowing monkeys to control a robotic wheelchair simply by thinking. Dr. Nicolelis and his team monitored the brain activity of two rhesus monkeys that were first trained to maneuver a wheelchair just by watching it move. The monkeys then transitioned to using their brains’ neuron signals to navigate a two-meter path across the room and retrieve grapes from a dispenser. The experiment required the careful insertion of intracranial implants, which register the monkeys’ neural activity far more precisely than external sensors and so allow far finer motor control of the wheelchair.
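As a rough illustration of what navigating a two-meter path ‘just by thinking’ involves computationally, the sketch below decodes a forward speed and a turning rate from simulated firing rates and integrates them into the wheelchair’s position and heading. The channel count, weights, and simulated data are placeholders, not the lab’s published decoder.

```python
import numpy as np

# Illustrative sketch: decode a forward speed and turning rate from
# multi-channel firing rates, then integrate the wheelchair's pose.
# All values are made up for the example.

rng = np.random.default_rng(1)
N_CHANNELS = 128
W = rng.normal(scale=0.005, size=(2, N_CHANNELS))  # rates -> (speed, turn rate)

x, y, heading = 0.0, 0.0, 0.0
DT = 0.1  # seconds per decoding step

for _ in range(200):
    rates = rng.poisson(lam=8.0, size=N_CHANNELS).astype(float)  # fake data
    speed, turn_rate = W @ rates
    heading += turn_rate * DT
    x += speed * np.cos(heading) * DT
    y += speed * np.sin(heading) * DT

print(f"wheelchair pose: x={x:.2f} m, y={y:.2f} m, heading={heading:.2f} rad")
```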
The data gathered while the two monkeys telematically controlled the wheelchair is the same kind of data that may one day be used to improve the lives of severely disabled people. People suffering from Amyotrophic Lateral Sclerosis (ALS), Parkinson’s disease, or any number of other conditions that impair motor control now have hope of regaining command of their own mobility. Dr. Nicolelis and his team have since begun applying the discoveries and data-tracking capabilities of their brain-machine interfaces in experiments with human patients.
This article is an excerpt from NEO founder Jesse Morris’ new book Data and the World of Today: The Reality of Today that will Impact your Business Tomorrow. Purchase your copy via Amazon.com.