Brain-inspired AI cuts energy use by 90% while boosting performance

University of Surrey researchers have developed a brain-inspired approach to artificial intelligence that dramatically improves performance while cutting energy consumption. The breakthrough, called Topographical Sparse Mapping, mimics how the human brain organizes neural connections, offering a more sustainable path forward as AI models continue growing in size and energy demands.

How it works: The new method connects each artificial neuron only to nearby or related neurons, mirroring the brain’s efficient information organization rather than creating vast networks of unnecessary connections.

  • Researchers published their findings in Neurocomputing, demonstrating that the approach significantly improves performance in generative AI systems, including large language models of the kind behind ChatGPT.
  • An enhanced version introduces a biologically inspired “pruning” process during training, similar to how the brain refines its neural connections as it learns.
  • The technique eliminates redundant connections while maintaining accuracy, creating more efficient AI systems.
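The paper's exact formulation isn't given in this article, but the core idea described above can be sketched in a few lines. The following is a minimal illustration, assuming a 1-D topographic layout: each output neuron connects only to input neurons near its mapped position, and a magnitude-based pruning step then drops the weakest surviving connections during training. The function names, window radius, and pruning rule here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def topographic_mask(n_in, n_out, radius=2):
    """Connect each output neuron only to topographically nearby inputs.

    Each output neuron is mapped to a position in the input's coordinate
    space; connections are allowed only within `radius` of that position.
    """
    positions = np.linspace(0, n_in - 1, n_out)   # output neurons' positions
    idx = np.arange(n_in)                          # input neuron coordinates
    # mask[i, j] = 1 if input j lies within `radius` of output i's position.
    return (np.abs(idx[None, :] - positions[:, None]) <= radius).astype(float)

def prune_smallest(weights, mask, fraction=0.25):
    """Pruning step (illustrative): drop the smallest-magnitude fraction
    of the currently active connections, as the brain refines wiring."""
    active = np.abs(weights * mask)
    vals = active[mask > 0]
    if vals.size == 0:
        return mask
    threshold = np.quantile(vals, fraction)
    return mask * (active >= threshold).astype(float)

rng = np.random.default_rng(0)
n_in, n_out = 16, 8
mask = topographic_mask(n_in, n_out, radius=2)
weights = rng.normal(size=(n_out, n_in)) * mask

print("dense connections:", n_in * n_out)          # 128
print("topographic connections:", int(mask.sum()))
mask = prune_smallest(weights, mask, fraction=0.25)
print("after pruning:", int(mask.sum()))
```

Even before any pruning, the topographic mask replaces a fully dense weight matrix with a thin banded one, which is where the bulk of the claimed compute and energy savings would come from; pruning then thins it further without (per the paper's claim) sacrificing accuracy.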

Why this matters: Current AI training methods consume enormous amounts of energy, making sustainability a critical concern as the technology scales.

  • “Training many of today’s popular large AI models can consume over a million kilowatt-hours of electricity,” said Dr. Roman Bauer, senior lecturer at the University of Surrey. “That simply isn’t sustainable at the rate AI continues to grow.”

The big picture: This brain-mimicking approach could revolutionize how AI systems are built, moving away from brute-force computational methods toward more elegant, biologically inspired solutions.

  • The research team is exploring applications in neuromorphic computers, a computing approach directly inspired by the human brain’s structure and function.
  • Dr. Bauer emphasized the broader implications: “Our work shows that intelligent systems can be built far more efficiently, cutting energy demands without sacrificing performance.”

What’s next: Researchers are investigating how Topographical Sparse Mapping could be applied to other AI applications and more realistic neuromorphic computing systems.

