New AI model mimics human brain to boost efficiency
Researchers at the University of Surrey have developed a novel approach to enhance artificial intelligence (AI) performance by mimicking the human brain's neural networks.
Published in Neurocomputing, the study outlines how this brain-inspired model can improve the efficiency of AI systems, including generative models such as ChatGPT, the BBC reports.
The approach, known as Topographical Sparse Mapping, connects neurons only to nearby or related ones, similar to how the human brain organizes information. This method reduces unnecessary connections, streamlining the AI's structure and boosting its performance while lowering energy consumption.
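The article does not publish the model itself, but the general idea of distance-limited connectivity can be illustrated with a minimal sketch. The example below assumes a simple 1-D layout of neurons and an arbitrary neighbourhood radius; it is not the authors' implementation.

```python
import numpy as np

# Hedged sketch only: NOT the published Topographical Sparse Mapping code.
# It illustrates the general idea of topographic sparsity: each output neuron
# connects only to input neurons that sit "nearby" on a 1-D grid, rather than
# to all of them. Layer sizes and the radius below are arbitrary assumptions.

def topographic_mask(n_in: int, n_out: int, radius: int = 3) -> np.ndarray:
    """Binary mask keeping a connection only when the two neurons are close
    together on their normalised 1-D coordinates."""
    in_pos = np.linspace(0.0, 1.0, n_in)        # position of each input neuron
    out_pos = np.linspace(0.0, 1.0, n_out)      # position of each output neuron
    dist = np.abs(out_pos[:, None] - in_pos[None, :])
    cutoff = radius / max(n_in, n_out)          # radius converted to the same scale
    return (dist <= cutoff).astype(np.float32)  # 1 = keep, 0 = no connection

n_in, n_out = 64, 32
mask = topographic_mask(n_in, n_out)
weights = np.random.randn(n_out, n_in).astype(np.float32) * mask  # sparse layer

x = np.random.randn(n_in).astype(np.float32)
y = np.maximum(weights @ x, 0.0)                # ReLU forward pass through the sparse layer

print(f"Connections kept: {int(mask.sum())} of {mask.size} ({100 * mask.mean():.1f}%)")
```

Because most entries of the mask are zero, far fewer weights need to be stored and updated, which is where the claimed savings in computation and energy come from.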
Dr. Roman Bauer, a senior lecturer at the University of Surrey, explained: "Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance."
Traditional AI models often require massive amounts of energy to train. Dr. Bauer pointed out, "Training many of today's popular large AI models can consume over a million kilowatt-hours of electricity. That simply isn't sustainable at the rate AI continues to grow."
The research team has also developed an enhanced version of the model called Enhanced Topographical Sparse Mapping, which introduces a biologically inspired "pruning" process during training. This process, similar to how the brain refines its neural connections over time, further improves the model’s efficiency.
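Again purely as an illustration rather than the authors' method, the sketch below shows a generic magnitude-based pruning loop during training, in which the weakest surviving connections are gradually removed. All rates, sizes, and the stand-in "training" update are assumptions.

```python
import numpy as np

# Hedged sketch of a generic "pruning during training" loop, NOT the authors'
# Enhanced Topographical Sparse Mapping procedure. After each training step a
# small fraction of the weakest remaining connections is removed, loosely
# mirroring how the brain prunes little-used synapses over time.

rng = np.random.default_rng(0)
weights = rng.normal(size=(32, 64)).astype(np.float32)
mask = np.ones_like(weights)                    # 1 = connection alive, 0 = pruned

def prune_step(weights: np.ndarray, mask: np.ndarray, fraction: float = 0.05) -> np.ndarray:
    """Remove the given fraction of still-alive connections with the smallest magnitude."""
    alive = np.abs(weights[mask == 1])
    if alive.size == 0:
        return mask
    threshold = np.quantile(alive, fraction)
    return mask * (np.abs(weights) > threshold)

for step in range(10):
    # A real model would update `weights` here with a gradient step;
    # random noise stands in for training in this toy example.
    weights += 0.01 * rng.normal(size=weights.shape).astype(np.float32)
    weights *= mask                              # pruned connections stay dead
    mask = prune_step(weights, mask)

print(f"Remaining connections: {int(mask.sum())} of {mask.size}")
```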
Looking ahead, the researchers are exploring additional applications of this approach, including its potential in neuromorphic computing—systems that replicate the brain's structure and function for more realistic and efficient computing.
By Sabina Mammadli