Chinese scientists have introduced a groundbreaking brain-inspired network model designed to tackle the challenges of high computing resource consumption in traditional artificial intelligence (AI). The innovative approach was unveiled by the Institute of Automation under the Chinese Academy of Sciences and published in the journal Nature Computational Science on Friday.
Unlike the prevalent method of scaling up neural networks to achieve general intelligence—an approach known as “external complexity”—this new model focuses on “internal complexity.” Researcher Li Guoqi explained that building ever larger and more complex neural networks consumes vast amounts of energy and computational power, and the resulting systems lack interpretability. In contrast, the human brain operates efficiently on just 20 watts of power, despite containing approximately 100 billion neurons and 1,000 trillion synaptic connections.
Drawing inspiration from the brain’s internal dynamics, scientists from the Institute of Automation, in collaboration with Tsinghua University and Peking University, applied the internal complexity approach to AI development. Their experiments demonstrated the model’s effectiveness in managing complex tasks, offering a novel method for integrating neuroscience principles into AI and optimizing performance.
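The article does not give the team's equations, but the "internal complexity" idea centers on making each artificial neuron's own dynamics richer, in the spirit of biological spiking neurons. As a purely illustrative sketch (a textbook leaky integrate-and-fire neuron, not the researchers' published model), the following shows what it means for a single neuron to carry internal state that evolves over time rather than computing a one-shot weighted sum; all parameter values are assumptions chosen for demonstration:

```python
def simulate_lif(input_current, v_rest=0.0, v_thresh=1.0, tau=20.0, dt=1.0, steps=100):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential v leaks toward v_rest while integrating the
    input current; when v crosses v_thresh the neuron emits a spike and
    v is reset. Parameter values are illustrative, not from the paper.
    """
    v = v_rest
    spike_times = []
    for t in range(steps):
        # Forward-Euler step of: tau * dv/dt = -(v - v_rest) + input_current
        v += dt * (-(v - v_rest) + input_current) / tau
        if v >= v_thresh:
            spike_times.append(t)  # record the spike
            v = v_rest             # reset the membrane potential
    return spike_times
```

A sufficiently strong constant input drives periodic spiking (`simulate_lif(2.0)` returns several spike times), while a weak input that settles below threshold produces none (`simulate_lif(0.5)` returns an empty list). The internal-complexity approach, as described, builds networks from neurons with dynamics far richer than this minimal example.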
This advancement signifies a potential paradigm shift in AI research, emphasizing energy efficiency and interpretability. By mimicking the brain’s intricate internal mechanisms, the model could lead to AI systems that are not only more powerful but also more sustainable and understandable.
(With input from Xinhua)
Reference(s):
“Chinese scientists develop brain-inspired AI to cut energy demand,” cgtn.com