The human brain may have 10 times more memory capacity than commonly estimated, new research suggests.
“This is a real bombshell in the field of neuroscience,” said co-senior author of the study Terry Sejnowski, professor at Salk Institute for Biological Studies in California, US.
“Our new measurements of the brain’s memory capacity increase conservative estimates by a factor of 10 to at least a petabyte, in the same ballpark as the World Wide Web,” Sejnowski noted.
The new research, published in the journal eLife, also answers a long-standing question about how the brain manages to be so energy efficient, and could help engineers build computers that are extremely powerful while still conserving energy.
“We discovered the key to unlocking the design principle for how hippocampal neurons function with low energy but high computation power,” Sejnowski said.
Our memories and thoughts are the result of patterns of electrical and chemical activity in the brain.
A key part of this activity happens when the branches of neurons, much like electrical wires, interact at certain junctions known as synapses.
For their research, the team built a 3D reconstruction of rat hippocampus tissue (the memory centre of the brain).
“Our data suggests there are 10 times more discrete sizes of synapses than previously thought,” Tom Bartol, a scientist at Salk Institute, said.
The team determined there could be about 26 categories of sizes of synapses, rather than just a few.
In computer terms, 26 distinguishable synapse sizes correspond to about 4.7 “bits” of information (log₂ 26 ≈ 4.7). Previously, it was thought that synapses in the hippocampus were capable of storing just one to two bits each for short- and long-term memory, the researchers explained.
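The bit figure follows from basic information theory: a component that can sit in any one of n distinguishable states can encode log₂ n bits. A minimal sketch in Python (the variable names are illustrative, not from the study):

```python
import math

# Number of distinguishable synapse-size categories reported by the team.
SYNAPSE_SIZE_CATEGORIES = 26

# If each of the 26 size categories is equally likely, one synapse
# encodes log2(26) bits of information.
bits_per_synapse = math.log2(SYNAPSE_SIZE_CATEGORIES)
print(f"{bits_per_synapse:.1f} bits per synapse")  # ≈ 4.7 bits

# The older assumption of only a few distinguishable sizes (e.g. 4)
# yields the one-to-two-bit estimates mentioned above.
print(f"{math.log2(4):.1f} bits for 4 size categories")  # 2.0 bits
```

The same calculation explains why the revised count multiplies total capacity estimates: roughly 4.7 bits per synapse instead of 1–2 bits, across the same number of synapses.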
“The implications of what we found are far-reaching,” Sejnowski said.
“Hidden under the apparent chaos and messiness of the brain is an underlying precision to the size and shapes of synapses,” Sejnowski noted.
“This trick of the brain absolutely points to a way to design better computers,” Sejnowski said.