THIS ARTICLE/PRESS RELEASE IS PAID FOR AND PRESENTED BY NTNU Norwegian University of Science and Technology

A brain implant could provide electrical stimuli to the centre that processes the artificial sensory impression. No, it won’t look like this.

Getting a wireless network under the skin to talk to the brain

How can we get an artificial hand or foot to communicate with the brain? NTNU researchers want to use the fat layer just under our skin.

Imagine that you had to have your arm amputated and now you have to manage with an artificial hand. You can move it around, push things, press a light switch.

But you can’t use your fingers to feel things, to sense whether what you are touching is hot or cold, or if you are grasping something too hard or not enough.

Now researchers at NTNU are working to develop solutions where the brain will be able to capture sensory impressions from a prosthesis, process them and use them to control movements, almost as if it were a normal hand. The NTNU approach won’t require embedded batteries and wires.

Biological signal processing: Ali Khaleghi, Sandra Yuste Murioz and Ilangko Balasingham at NTNU’s Department of Electronic Systems. They have now started testing artificial nerve signals on monkeys.

Creates sensory impressions

“An artificial arm has no sense of feeling to help control the arm. You can use your sight, but you don’t have the sense of touch. We’re trying to develop a solution that integrates a number of sensors on the arm and hand and sends signals to the part of the brain that processes this type of sensory impression,” says Ilangko Balasingham, a professor at NTNU’s Department of Electronic Systems.

The idea is to use as much of the body’s own systems as possible – and as few artificial parts as possible.

“The point is to communicate with the brain without having to embed any wires in the body. Wires increase the risk of infections. They can get in the way or break. They pose a problem,” says Balasingham.

Instead, information from the sensors will be sent in the form of microwaves – radio waves in the 400 to 2500 MHz range of the electromagnetic spectrum – through the fat layer we have under the skin.

“Our subcutaneous fat lends itself to this since it has less attenuation of electrical signals,” says Balasingham.

Balasingham says this works because of a property called the dielectric constant. Fat has a value of about 5, whereas air has a value of 1 – and the key point is how little fat attenuates radio signals compared with other tissue.

“Fat has some attenuation at 5, compared to muscles, which have a value of 58 and a much higher attenuation rate. This means that if we can use the fat layer to transmit radio signals, the signal won’t lose much strength, so it can be transmitted over longer distances,” he said.
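The contrast Balasingham describes can be put in rough numbers. As a sketch (not from the article), the plane-wave attenuation of a lossy tissue can be estimated from its relative permittivity and electrical conductivity; the tissue values below are typical published figures at 2.45 GHz, within the band mentioned, and are assumptions made here for illustration:

```python
import math

EPS0 = 8.854e-12       # vacuum permittivity, F/m
MU0 = 4e-7 * math.pi   # vacuum permeability, H/m

def attenuation_db_per_m(freq_hz, eps_r, sigma):
    """Plane-wave attenuation constant in a lossy dielectric:
    alpha = w*sqrt(mu*eps/2) * sqrt(sqrt(1 + (sigma/(w*eps))^2) - 1)  [Np/m],
    converted to dB/m (1 Np = 8.686 dB)."""
    w = 2 * math.pi * freq_hz
    eps = eps_r * EPS0
    loss = sigma / (w * eps)  # loss tangent
    alpha_np = w * math.sqrt(MU0 * eps / 2) * math.sqrt(math.sqrt(1 + loss**2) - 1)
    return alpha_np * 8.686

f = 2.45e9  # 2.45 GHz
# (relative permittivity, conductivity in S/m): typical published values
tissues = {"fat": (5.3, 0.10), "muscle": (52.7, 1.74)}
for name, (eps_r, sigma) in tissues.items():
    print(f"{name:6s}: {attenuation_db_per_m(f, eps_r, sigma):6.0f} dB/m")
```

With these numbers, fat attenuates on the order of tens of decibels per metre while muscle attenuates several hundred – which is why the subcutaneous fat layer is the attractive channel.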

Brain implant

The signal is received in the brain by an implant that provides electrical stimuli to the relevant centre for processing the ‘sensory impression’.

“Over time, the brain becomes trained to understand what the signal means. One particular signal might mean ‘warm’ while another one signifies ‘cold’. We currently have good knowledge of how the brain processes this kind of information, and we know how the brain is able to learn to interpret new signals,” says Balasingham.

Once the brain processes the artificial sensory impression, another implant will pick up the brain’s ‘output signal’ and send it back to the prosthesis so that it can carry out the movements the brain requests. Researchers are currently testing the system on monkeys.

“The monkeys will now use this system for half a year, enabling us to acquire data on how the brain learns to use the signals,” says Balasingham.

Using brain signals

How does this communication between the human brain and the electronic components actually work?

Balasingham uses the YouTube ‘Monkey Mind Pong’ video as an example. The video shows a research project funded by Tesla billionaire Elon Musk, in which researchers teach a monkey to play a simple video game using a joystick.

The monkey has had implants inserted into its brain that register how the neurons fire while the hand moves the joystick. A computer detects the activity of the brain. The artificial intelligence (AI) system uses the information to predict which movements the brain will ask the hand to perform.

Eventually, the joystick is disconnected from the computer and the monkey continues to play, now with direct wireless communication between the implant in the brain and the video game.
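The prediction step described above can be illustrated with a deliberately simplified sketch – my toy illustration, not Neuralink’s actual algorithm. It simulates neurons whose firing rates depend linearly on the intended joystick velocity, fits a linear decoder by least squares, and then recovers a new intention from firing rates alone:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: each "neuron" fires at a rate that depends linearly on the
# intended 2-D joystick velocity, plus a little noise.
n_neurons, n_samples = 20, 500
true_tuning = rng.normal(size=(n_neurons, 2))   # each neuron's preferred direction
velocity = rng.normal(size=(n_samples, 2))      # intended movements (training data)
rates = velocity @ true_tuning.T + 0.1 * rng.normal(size=(n_samples, n_neurons))

# Fit a linear decoder: firing rates -> intended velocity (least squares)
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a held-out intention from the firing rates alone
new_velocity = np.array([[1.0, -0.5]])
new_rates = new_velocity @ true_tuning.T
decoded = new_rates @ decoder   # close to the intended [1.0, -0.5]
print(decoded)
```

Real decoders work with spike trains rather than clean rates and are far more sophisticated, but the principle is the same: once the mapping is learned, the movement can be predicted without the joystick in the loop.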

“Once the brain has learned to interpret the signals, it’s enough for the monkey to think that it is moving the joystick. If you have a normal hand, the brain knows how to interpret the nerve signals. If you’ve lost an arm and need to start using a prosthesis, the brain has to relearn how to interpret the signals,” Balasingham says.

Medical electronics: Sandra Yuste Murioz is a doctoral candidate at NTNU’s Department of Electronic Systems. She is part of the research group that is developing wireless communication between the brain and prostheses.

No big challenge for the brain

The NTNU researchers say the fact that the nerve signals being sent to the brain are artificial should not be a big challenge for the human brain.

In the long run, they envision that brain implants will be able to perceive nerve signals and send stimuli to all parts of our nervous system. In this way, a break in the neural pathway between the brain and the rest of the body would not necessarily mean losing the use of the affected parts of the body.

“Every year, between 250 000 and 500 000 spinal cord injuries are registered worldwide. By designing wireless, high-speed two-way communication, we’ll enable the human brain to communicate directly with equipment and computers. We charge the system wirelessly while simultaneously transferring data to the implant. Using this system, which is always on, could allow people to continuously receive sensory impressions and stimulate nerve cells anywhere in the nervous system, without outside medical help,” says Balasingham.


Read the Norwegian version of this article at
