Analyzing The Strengths Of Brain Learning vs. Artificial Intelligence

Are humans becoming obsolete in the age of AI? The question has sparked debates and conversations among influential thinkers, scientists, and politicians. AI is advancing rapidly, soon to surpass human intellect in many areas. What does this mean for the future?

Is brain learning weaker than artificial intelligence, or are they specialized tools with different strengths and weaknesses? Here we dig deeper into these questions to assess if AI truly holds an advantage over human intellect.

Efficient DL wiring structures (architectures) often consist of tens of consecutive feedforward layers, whereas brain dynamics are limited to a few feedforward layers.

Deep Learning (DL) architectures rely on multiple consecutive layers of filters, which are essential for distinguishing among the input classes to be recognized. These filter layers detect the properties characteristic of each class.

An image of a car, for example, is filtered through successive stages: the first identifies its wheels, the next its doors, then its lights, and so on, until the object can ultimately be recognized as a car.
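As a rough, hypothetical illustration of such consecutive filter layers (the layer counts and channel sizes here are assumptions, not the architecture from the study), a small convolutional stack might be sketched in PyTorch as follows:

```python
# Minimal sketch of consecutive feedforward filter layers, assuming PyTorch.
# Layer counts and channel sizes are illustrative, not those of any specific model.
import torch
import torch.nn as nn

cnn = nn.Sequential(
    # Early filters respond to simple local patterns (edges, curves).
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Intermediate filters combine them into parts (wheel-like, door-like shapes).
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    # Later filters combine parts into whole-object evidence.
    nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(64 * 4 * 4, 10),  # class scores, e.g. "car" vs. other classes
)

x = torch.randn(1, 3, 32, 32)   # one 32x32 RGB image
print(cnn(x).shape)             # torch.Size([1, 10])
```

Each stage reduces spatial resolution while broadening the patterns it responds to, which is the hierarchical filtering the wheels-doors-lights example describes.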

In the brain, by contrast, the first filtering stage lies close to the retina, and the complex, many-layer training procedure that DL requires to make such filtering work is far beyond biological capabilities.

The brain’s limited capabilities in precise mathematical operations raise the question: can it still be seen as a viable alternative to modern AI systems running on fast and parallel computers?

Yet in daily life we observe the brain performing many such tasks with considerable success. Based on this observation, can one construct an effective artificial intelligence, inspired by how the brain functions, that achieves similar outcomes efficiently? The answer, a large majority agrees, is "yes."

Prof. Ido Kanter from Bar-Ilan University’s Department of Physics and Gonda (Goldschmied) Multidisciplinary Brain Research Center published an article today in Scientific Reports detailing how his team’s research solved a much-contested puzzle.

Prof. Ido Kanter says:

“We’ve shown that efficient learning on an artificial tree architecture, where each weight has a single route to an output unit, can achieve better classification success rates than previously achieved by DL architectures consisting of more layers and filters. This finding paves the way for efficient, biologically-inspired new AI hardware and algorithms.”
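To make the "single route to an output unit" idea concrete, here is a minimal NumPy sketch of a toy tree architecture; the branch sizes, activation function, and layout are assumptions for illustration and are not the model evaluated in the paper:

```python
# Toy tree-structured network: inputs are split into disjoint branches,
# so every weight lies on exactly one route to the output unit.
# Assumed sizes and the use of NumPy are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_branches, branch_size = 4, 8          # 32 inputs split into 4 disjoint branches
w_branch = rng.normal(size=(n_branches, branch_size))  # one weight vector per branch
w_root = rng.normal(size=n_branches)                   # one weight per branch output

def tree_forward(x):
    """x has shape (n_branches * branch_size,). Each input reaches the
    output through its own branch only -- a single route per weight."""
    branches = x.reshape(n_branches, branch_size)
    hidden = np.tanh(np.sum(w_branch * branches, axis=1))  # one unit per branch
    return np.tanh(w_root @ hidden)                         # single output unit

x = rng.normal(size=n_branches * branch_size)
print(tree_forward(x))
```

Unlike a fully connected network, where a weight influences the output through many shared paths, every weight here sits on exactly one branch, which is what makes tree architectures attractive for efficient, biologically inspired learning rules.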

Yuval Meir, a Ph.D. student, contributed substantially to this work; his research and collaboration were essential to bringing the project to fruition.

Research by Dr. Roni Vardi, building on experiments conducted by Kanter and his team, supports the notion of sub-dendritic adaptation by demonstrating anisotropies that neurons can exhibit, such as differing spike waveforms, refractory periods, and maximal transmission rates.

The rapid development of highly pruned tree training techniques necessitates a new type of hardware, distinct from the modern GPUs that are the current go-to device for Deep Learning (DL) applications. This new hardware should be better suited to simulating brain dynamics with maximum efficiency.
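A hedged sketch of why dense GPU-style arithmetic is a poor match for heavily pruned connectivity: the same layer is computed once as a dense matrix-vector product and once using only the surviving connections (the roughly 1% sparsity level is an assumed, illustrative figure):

```python
# Sketch: a dense layer versus the same layer after heavy pruning.
# The ~99% sparsity level is an assumed, illustrative figure.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 1024, 256

W = rng.normal(size=(n_out, n_in))
mask = rng.random((n_out, n_in)) < 0.01      # keep ~1% of connections
W_pruned = W * mask

x = rng.normal(size=n_in)

dense_out = W_pruned @ x                     # dense hardware still touches every zero

# Equivalent result using only the surviving connections (what tree-like,
# sparse-friendly hardware could exploit directly).
rows, cols = np.nonzero(mask)
sparse_out = np.zeros(n_out)
np.add.at(sparse_out, rows, W_pruned[rows, cols] * x[cols])

print(np.allclose(dense_out, sparse_out))    # True
print(int(mask.sum()), "of", W.size, "weights remain")
```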

But even as AI progresses rapidly, it's important to remember that the brain is still the most powerful learning machine. With a far longer history of refinement behind it, the human brain isn't going anywhere anytime soon.

Source: neurosciencenews.com
