Could a computer think like the human brain? A study at Khalifa University, in the United Arab Emirates, aims to show that Artificial Intelligence systems can come close to our ability to reason.
The researchers analyzed several models of hyperdimensional computing, known as HDC, which represent information in very large vectors of approximately 10,000 bits each and are inspired by patterns of neural activity in the human brain.
The development of an HDC architecture involves several stages, which can be divided into encoding, training, and comparison. Such a system can tolerate faults and noise in the patterns it has learned, much like a person’s central nervous system.
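Under stated assumptions, those three stages can be sketched in a few lines of Python. This is an illustrative toy, not the researchers’ code: the feature names, the bipolar (±1) vectors, and the majority-vote bundling are common HDC conventions assumed here, not details taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
D = 10_000  # hypervector size, matching the ~10,000 bits mentioned in the article

# Encoding: every raw feature gets its own random bipolar hypervector.
features = {name: rng.choice((-1, 1), size=D)
            for name in ("round", "red", "stem", "yellow", "curved")}

def encode(names):
    # Bundling: element-wise majority vote over the feature vectors.
    total = np.sum([features[n] for n in names], axis=0)
    return np.where(total >= 0, 1, -1)

# Training: a single pass that bundles examples into a class prototype.
prototypes = {
    "apple": encode(["round", "red", "stem"]),
    "banana": encode(["yellow", "curved", "stem"]),
}

def classify(hv):
    # Comparison: pick the prototype with the highest normalized dot product.
    return max(prototypes, key=lambda c: hv @ prototypes[c] / D)

print(classify(encode(["round", "red", "stem"])))  # apple
print(classify(encode(["yellow", "curved"])))      # banana, despite a missing feature
```

Note that “training” here is a single bundling pass rather than iterative gradient descent, which is exactly the property the article later contrasts with convolutional networks.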
How the HDC “thinks”
Humans can recognize that a table missing one leg is still a table. A conventional Artificial Intelligence system, on the other hand, tends to look at this three-legged table and classify it as something new, no longer essentially a table.
HDC vectors can absorb this kind of distortion: even if part of an object is missing, the system can still recognize it as a whole, using data about its original appearance stored in memory.
“In an HDC vector, we can represent the data holistically, which means that the value of an object is distributed among many data points. Therefore, we can reconstruct the meaning of the vector, as long as we have 60% of its content,” says Professor Eman Hassan.
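Professor Hassan’s 60% figure can be checked with a small simulation. The bipolar vector format and the exact corruption scheme below are assumptions made for illustration; the point is that a hypervector with 40% of its components flipped still matches its stored original far above chance.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000  # hypervector size used in the study

stored = rng.choice((-1, 1), size=D)  # the remembered "original appearance"

# Corrupt 40% of the components, so only 60% of the content survives.
corrupted = stored.copy()
flip = rng.choice(D, size=int(0.4 * D), replace=False)
corrupted[flip] *= -1

# Normalized dot product: 1.0 = identical, near 0.0 = unrelated.
similarity = corrupted @ stored / D
unrelated = rng.choice((-1, 1), size=D) @ stored / D

print(similarity)  # 0.2, i.e. 60% agreement minus 40% disagreement
print(unrelated)   # close to 0: a random vector barely correlates at all
```

With 10,000 dimensions, the noise floor for unrelated vectors is about ±0.01, so a similarity of 0.2 is an unambiguous match, which is why the distributed representation survives losing so much of its content.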
HDC vs. convolutional networks
Scientists have long looked to the human brain for inspiration in building more efficient computer systems. One such attempt produced convolutional neural networks: deep learning algorithms that take in images and assign different levels of importance to various aspects of them in order to differentiate one from another.
The problem with convolutional networks is that they require large amounts of data and extensive prior training to perform well. HDC vectors, on the other hand, tolerate errors and use memory-centric processing to perform complex calculations with less computational power.
“Hyperdimensional computing is a promising model for edge devices, as it does not include a demanding training stage like the one found in the convolutional neural networks widely used in conventional applications,” says Professor Hassan.
Future in vectors
Hyperdimensional computing can represent an advance in the development of new technologies based on Artificial Intelligence, capable of using memory elements to reduce data processing.
With growing demand for smart devices, including autonomous vehicles, there is also a need for systems that compute in real time, without depending on data processing done in the cloud or in data centers miles away.
Preliminary data show that HDC vectors can outperform deep neural networks in speech recognition applications, which work on one-dimensional data. In two-dimensional applications, however, encoding the data can consume about 80% of the training time, which still makes the technology unfeasible there.
“HDC has shown promising results for one-dimensional applications, using less energy and with lower latency than state-of-the-art deep neural networks. But in 2D applications, convolutional neural networks still achieve greater classification accuracy, though at the cost of more computation,” adds Professor Hassan.