
The Kolkata Scientists That Want to Deal With Blindness Using Machine Learning


As you feel a series of embossed symbols, a computer can discern their meaning by reading the electric signals shooting through your brain. Say hello to the machine that could step in for one of your senses.

An EEG being taken. Credit: ulrichw/pixabay

If you are blindfolded and asked to distinguish between an apple and an orange, you can touch the fruits and tell them apart. This is because the human brain relies on the sense of touch, among other senses, to understand the physical world around us. Now, a group of scientists from Kolkata has leveraged this ability to train computers to decode what you are reading – in this case, by touch.

The scientists have created a proof-of-principle brain-computer interface (BCI) that can identify 3D text as it is being touched by healthy individuals. As you feel a series of embossed letters, numbers or symbols, a computer can discern the text just by reading the electric signals shooting through your brain.

This requires the brain to be connected to the computer – a.k.a. the basic premise of a BCI. In the linked state, the computer can read brain signals and even let the brain carry out simple virtual tasks like moving a cursor across the screen by merely thinking about it. The goal of such scientific advances is to create devices that will respond directly to the brain instead of having signals relayed through your body. Obviously, such advances can lead to efficient prosthetics.

Rohit Bose, the lead author of this study, has been working with BCIs for a long time. But after exploring their use with other senses like vision and smell, he realised that the sense of touch hadn’t been explored much beyond object recognition.

This field holds a lot of potential, particularly for visually challenged people. “There is an entire area of nonverbal communication that relies on touch,” Bose told The Wire. Although Bose’s work doesn’t aim to improve nonverbal communication, it tries to integrate one such existing system with burgeoning technology. Braille, the script used by blind people, is a collection of protruding dots that are felt by a hand moving over them.

Researchers had been baffled by a question on this front: could brain signals evoked by touching text be isolated from the rest of the brain’s activity so that they could be interpreted by a BCI? If so, could an advanced solution for recognising objects be built for blind people?

To this end, Bose and his colleagues from the TCS Innovation Lab, Kolkata, and Jadavpur University decided to examine brain activity while healthy, blindfolded people touched and recognised 3D letters, numbers and symbols.

The experiment

Brain cells communicate with each other through electric signals that can be detected by sticking electrodes on the scalp. The variations in the signals from different regions are then recorded as a series of waves and spikes called an electroencephalogram (EEG).

This technique is among the earliest diagnostic tools used for studying the brain. The first physical readout of the brain’s electrical activity was produced by a German psychiatrist named Hans Berger in 1924. He placed two electrodes, connected to a galvanometer, on the brain of a boy undergoing neurosurgery and traced the electrical activity as deflections of the galvanometer’s needle. This gave him his first successful EEG recording from a human. Later, Berger refined his method to make it less invasive and began generating EEGs from electrodes attached to the top of the head. He recorded a lot of data with his son in this manner.

For much of the early 21st century, EEGs were measured using caps fitted with electrodes that were, in turn, attached to a computer. But with advances in technology, engineers have created wireless headsets that can record electrical signals from different areas of the brain and send the information over WiFi to a remote computer.

Today, these headsets are used to provide virtual gaming experiences. The headsets identify a user’s brain signals for basic movements like raising an arm or moving it in some direction. This allows the gamer to manoeuvre her virtual avatar through actual movement rather than a series of clicks.

Bose and co. used a similar headset to record the EEGs of 15 healthy volunteers (seven men and eight women) while they touched text painted on a sheet of paper. “The thick acrylic paint adds a certain depth to the text, sort of a 3D character, that can be felt by moving your hand over it,” Bose said.

All participants were made to touch different combinations of four letters, four numbers and four symbols each, chosen at random. They were also asked to call out the text they were feeling, to ensure they had recognised the letter, number or symbol correctly.

This exercise was repeated on five different days to even out differences in brain activity. Once the signals were acquired, scientists pored over the data looking for unique patterns associated with feeling and recognition. They focussed on signals originating in a specific region called the parietal lobe because this part is concerned with sensing touch.

Brain waves recorded by an EEG machine are categorised as alpha, beta, theta and delta based on their frequency. Bose’s team found that the theta band, which corresponds to electrical signals in the range of 4-5 Hz, was rich in patterns associated with touch.
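As an illustration of the kind of filtering this involves, here is a minimal sketch of isolating theta-range activity from a single EEG channel. The sampling rate, filter design and exact band edges below are assumptions made for the example, not details taken from the paper.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def theta_band(raw_eeg, fs=128.0, low=4.0, high=5.0, order=4):
    """Band-pass filter one EEG channel to keep only theta-range activity.
    The sampling rate, band edges and filter order are illustrative."""
    sos = butter(order, [low, high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, raw_eeg)  # zero-phase filtering avoids time shifts

# Example: one second of simulated single-channel EEG at 128 Hz
theta = theta_band(np.random.randn(128))
```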

So for each object presented to the participants, the scientists extracted unique features from the theta band using statistical methods. Patterns isolated from 70% of the correctly identified cases were fed into an algorithm that could learn to associate the motifs with the objects. Next, they tested the algorithm’s accuracy by presenting the features from the remaining 30% of cases and asking it to predict the text being felt.
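A rough sketch of that offline stage, assuming each trial yields one feature vector, could look like the following. The classifier and the synthetic data are placeholders; the article does not specify which learning algorithm the authors actually used.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 12))      # placeholder: one feature vector per trial
y = rng.integers(0, 12, size=600)   # placeholder labels: 4 letters + 4 numbers + 4 symbols

# 70% of the trials train the model, the remaining 30% test it
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = SVC(kernel="rbf").fit(X_train, y_train)
# With real EEG features this is where a figure like 78% would come from;
# on random placeholder data the score is only chance level.
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```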

The accuracy of this setup was about 78% – fairly high, with the computer getting it right more than seven times out of ten.

“We were very happy with the results but the technique will be useful only if the process is fast enough for real-time applications,” says Bose.

So they invited the volunteers again and repeated their experiment online. In this version, the participants were asked to touch the same 3D text. The information was processed in real time using the pre-trained algorithm.

After the participants touched the text, the computer was asked to identify the character being felt based on the EEG signal. The processing took well under a second when the participants felt only numbers, only letters or only symbols. But when the experiment was repeated with different combinations one after another, both the accuracy of the prediction and the processing time took a hit.

Accuracy declined to 65% and prediction took up to two seconds. “This is expected because the workload increases suddenly. But what’s remarkable is that the result could be displayed with a lag of just two seconds,” Bose said.
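A bare-bones version of that online stage could look like the sketch below. It assumes some feature-extraction helper and the classifier trained in the earlier sketch; the streaming function is a stand-in, not the headset’s actual API.

```python
import time
import numpy as np

def classify_online(next_segment, extract_features, clf, n_trials=5):
    """Classify each incoming EEG segment with a pre-trained model and
    report how long the prediction itself took."""
    for _ in range(n_trials):
        segment = next_segment()                   # one trial's worth of samples
        start = time.perf_counter()
        features = np.asarray(extract_features(segment)).reshape(1, -1)
        label = clf.predict(features)[0]
        elapsed = time.perf_counter() - start
        print(f"predicted class {label} in {elapsed:.3f} s")

# Usage (illustrative): feed simulated segments and a toy feature function
# into the `clf` from the earlier sketch.
# classify_online(lambda: np.random.randn(128),
#                 lambda s: np.concatenate(([s.mean(), s.std()], np.zeros(10))),
#                 clf)
```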

Improvements needed

If you tapped your finger on a table right now, the activity would immediately show up in your EEG. But because the brain is also controlling several other functions at any given time, extracting the features associated with the taps alone is difficult.

Statistics can help. Several statistical operations can identify such features, ranging from the simple to the complex. But there’s a catch: the complex operations that are very good at recognising the activity often take a long time to process the information. As a result, any proposed BCI faces a tradeoff between accuracy and processing speed – and if the processing times are too long, the BCI becomes less usable.

“To ensure a robust interpretation of tactile senses, we have extracted the features using a combination of three very simple statistical tools,” according to Bose. Two of them – the Hurst exponent and adaptive autoregressive parameters – identify patterns in a time-dependent manner. The third tool, called power spectral density, measures how the signal’s power is distributed across different frequencies.
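To make the three ingredients concrete, here is a simplified sketch of computing each kind of feature from one EEG segment. A fixed-order Yule-Walker fit stands in for the adaptive autoregressive parameters and a crude rescaled-range estimate stands in for the Hurst exponent; both are illustrative substitutes rather than the authors’ exact procedures, and the band edges and sampling rate are assumed.

```python
import numpy as np
from scipy.signal import welch

def hurst_rs(x):
    """Crude rescaled-range (R/S) estimate of the Hurst exponent."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())
    r, s = y.max() - y.min(), x.std()
    return float(np.log(r / s) / np.log(len(x))) if s > 0 else 0.5

def ar_coefficients(x, order=4):
    """Fixed-order AR coefficients via the Yule-Walker equations."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]
    R = np.array([[acf[abs(i - j)] for j in range(order)] for i in range(order)])
    return np.linalg.solve(R, acf[1:order + 1])

def extract_features(segment, fs=128.0):
    """One feature vector: Hurst estimate, AR coefficients, theta-band power."""
    freqs, psd = welch(segment, fs=fs, nperseg=min(len(segment), 128))
    theta_power = psd[(freqs >= 4) & (freqs <= 5)].mean()   # band edges assumed
    return np.concatenate(([hurst_rs(segment)], ar_coefficients(segment), [theta_power]))

features = extract_features(np.random.randn(256))
```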

The scientists found that different letters, when being felt, evoked signals at different frequencies in the EEG. So they included the power spectral density to bring in frequency-domain information and further help distinguish between the characters being touched. This resulted in a superior overall information transfer rate – of about 0.7 bits/s.
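For context, a BCI’s information transfer rate is commonly computed with the Wolpaw formula, which combines the number of possible characters, the recognition accuracy and the time taken per decision. The figures plugged in below are illustrative, not the paper’s exact values.

```python
import math

def wolpaw_itr(n_classes, accuracy, seconds_per_decision):
    """Information transfer rate in bits per second (Wolpaw formula)."""
    bits = math.log2(n_classes)
    if 0 < accuracy < 1:
        bits += (accuracy * math.log2(accuracy)
                 + (1 - accuracy) * math.log2((1 - accuracy) / (n_classes - 1)))
    return bits / seconds_per_decision

# e.g. 12 possible characters, 78% accuracy, ~2 s per decision
print(round(wolpaw_itr(12, 0.78, 2.0), 2))   # roughly 1 bit/s under these assumptions
```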

But other experts believe that the results are not as significant as they may seem – nor as compelling as the study suggests.

Anton Nijholt, a professor at the University of Twente, The Netherlands, finds the paper and the research very interesting but feels that the text recognition rates are disappointing. “Unfortunately the paper has many claims that are not realistic. For instance, the recognition rates are not convincing to allow practical applications. In that context, the results are [still] limited,” he told The Wire.

His other gripe is that Bose and his team used too few objects in the study, leaving no indication of how these results could be generalised.

So, Nijholt says, to say that the technology works would be wishful thinking.

Bose thinks that the concerns raised by Nijholt are reasonable, particularly that the recognition rates are low – “but in machine learning, realtime performance is usually poorer than offline analysis.” Because this can affect practical applications, Bose is now looking for ways to improve this factor.

The researchers also don’t intend to extrapolate their results to other 3D characters. “We studied a limited set of text only because it was the first study of its kind,” according to Bose. Since they were able to achieve a higher accuracy in offline predictions, they want to repeat the experiment with all 26 letters of the alphabet, 10 digits of the number system and a variety of symbols. Bose believes that if he can make it work, it could bring visually challenged people on par with healthy individuals.

For those who develop a disability later in life, learning Braille can be tedious. By understanding how we perceive touch, he plans to make a tablet-like device that such people could use for reading and for connecting with a computer. “The findings of the study could also be used to create communicative BCI for people who suffer from motor dysfunctions where there is restricted limb movement,” Bose says.

The paper was published in the journal Cognitive Neurodynamics on September 6, 2017.

Sarah Iqbal is a senior research fellow at the department of biochemistry, Aligarh Muslim University, India.
