In the United States, researchers have discovered a new approach to making the neural networks used by artificial intelligence more efficient. By coupling working memory to the network, synapses can be updated in real time, significantly reducing the number of data processing operations.
Artificial intelligence (AI) is very power-hungry, especially in its training phase. The problem is so big that Elon Musk predicts the entire world will run out of electricity for AI within a year. But researchers from Cold Spring Harbor Laboratory in the United States may have found a solution by mimicking the human brain.
“ChatGPT and all of today's AI technologies are impressive, but when it comes to interacting with the physical world, they are still very limited. Even in areas like solving math problems and writing essays, they have to train on billions and billions of examples before they can do them well,” explained Kyle Daruwalla, lead author of the article published in the journal Frontiers in Computational Neuroscience.
Working memory that allows synapses to update
For him, the problem lies in data movement. Artificial neural networks are made up of billions of connections, and data must travel through the entire circuit before the model can update. With the new approach, each artificial neuron instead receives feedback from an auxiliary memory network and adjusts its connections in real time.
This creates a direct relationship between working memory and learning, and it provides new evidence for an unproven theory in neuroscience that links synaptic updates and working memory to learning and academic performance. The new approach also significantly reduces the computing power, and therefore the energy, essential to the functioning of artificial intelligence.
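Conceptually, the idea can be pictured as a local learning rule: each layer keeps a small working-memory trace of its recent activity and updates its own synapses immediately, rather than waiting for an error signal to travel back through the whole network. The sketch below is only a minimal illustration of that idea in NumPy; the layer sizes, the leaky memory trace, the fixed feedback matrix B, and the update rule itself are all assumptions made for clarity, not the published algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes (arbitrary, for illustration).
n_in, n_hidden, n_out = 8, 16, 4
W1 = rng.normal(0, 0.1, (n_hidden, n_in))
W2 = rng.normal(0, 0.1, (n_out, n_hidden))
# Fixed random feedback matrix: carries the output error to the hidden
# layer without a backward pass through W2 (an assumption in this sketch).
B = rng.normal(0, 0.1, (n_hidden, n_out))

mem = np.zeros(n_hidden)  # hypothetical working-memory trace for the hidden layer
decay, lr = 0.9, 0.05

def step(x, target):
    """One forward pass plus immediate, local synapse updates."""
    global mem, W1, W2
    h = np.tanh(W1 @ x)
    y = np.tanh(W2 @ h)

    # The working memory integrates recent hidden activity (leaky accumulator).
    mem = decay * mem + (1 - decay) * h

    # Each layer updates its weights right away from locally available
    # signals, instead of waiting for data to move back through the net.
    err = target - y
    W2 += lr * np.outer(err * (1 - y**2), mem)      # output error x memory trace
    W1 += lr * np.outer((B @ err) * (1 - h**2), x)  # fed-back error x input
    return y

# Tiny usage example: learn to map a fixed input to a fixed target.
x = rng.normal(size=n_in)
target = np.array([0.5, -0.5, 0.5, -0.5])
for _ in range(200):
    y = step(x, target)
print(np.round(y, 2))  # should approach the target
```

The point of the sketch is the data-movement contrast: every weight change uses only signals already present at that layer (its input, its memory trace, and a locally delivered error), so no update has to wait for activity to propagate through the entire circuit.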