Authors
Gennady Livitz, Massimiliano Versace, Anatoli Gorchetchnikov, Z Vasilkoski, H Ames, Ben Chandler, J Leveille, E Mingolla
Publication date
2011/2
Journal
The Neuromorphic Engineer
Volume
10
Issue
1201101.003500
Pages
3
Description
Despite recent advances in computational power and memory capacity, realizing brain functions that allow for perception, cognition, and learning on biological temporal and spatial scales remains out of reach for even the fastest computers. By contrast, these functions are easily achieved by mammalian brains. For example, a rodent placed in a water pool can find its way to a submerged platform using visual cues to self-localize its position and reach a learned safe location. Even a best-case extrapolation for implementing such behavior at a functional level using an artificial brain based on conventional technology would consume several orders of magnitude more power and space than its biological counterpart. Clearly, the computational principles employed by a mammalian brain are radically different from those used by today’s computers. Classical implementations of large-scale neural systems in computers use resources such as central processing unit (CPU) and graphics processing unit (GPU) cores, mass memory storage, and parallelization algorithms. Designs for such systems must cope with power dissipation from data transmission between processing and memory units. By some estimates, this loss is millions of times the power required to actually compute, in the sense of creating meaningful new register contents. Such a high transmission loss is unavoidable as long as memory and computation are physically distant. The creation of an electronic brain stuffed into the volume of a mammalian brain is thus impossible via conventional technology. The Defense Advanced Research Projects Agency (DARPA)-sponsored Systems of …
Total citations
[Citations-per-year chart, 2017–2024]
Scholar articles
G Livitz, M Versace, A Gorchetchnikov, Z Vasilkoski… - The Neuromorphic Engineer, 2011