With the Loihi 2 neuromorphic chip, machines can perform application processing, problem-solving, adaptation, and learning much faster than before.

Four years after Intel introduced Loihi, its first neuromorphic chip, the company has released its second-generation processor, which Intel says will provide faster processing, greater resource density, and improved power efficiency.

CPUs are often called the brains of the computer, but they aren't, really, since they process only a handful of tasks at once in a serial manner, nothing like what the brain does automatically to keep you alive. Neuromorphic computing attempts to replicate the functions of the brain by performing numerous tasks simultaneously, with an emphasis on perception and decision making.

Neuromorphic chips mimic neurological functions through computational "neurons" that communicate with one another. The first generation of Loihi chips had around 128,000 of those digital neurons; Loihi 2 has more than a million. Intel says early tests showed Loihi 2 required more than 60 times fewer ops per inference than Loihi 1 when running deep neural networks, with no loss in accuracy. That can mean real-time application processing, problem-solving, adaptation, and learning. Loihi has even learned how to smell.

Loihi 2 also features faster I/O interfaces to support Ethernet connections, vision-based sensors, and larger meshed networks. This will help the chip integrate better with the robotics and sensors that have commonly been paired with Loihi 1 in the past.

Loihi 2 isn't sold like regular Intel chips. It is offered only as part of complete systems to select members of the Intel Neuromorphic Research Community (INRC). Those systems are Oheo Gulch, which uses a single Loihi 2 chip and is intended for early evaluation, and Kapoho Point, which packs eight Loihi 2 chips and will be available soon.

Intel Releases Lava Framework

To support development of neuromorphic applications, Intel has also introduced an open, modular, and extensible software framework known as "Lava," which the company says provides the neuromorphic computing community with a common development framework. A component of Lava is Magma, an interface for mapping and executing neural-network models and other processes on neuromorphic hardware. Lava also includes offline training, integration with third-party frameworks, Python interfaces, and more. The Lava framework is available now on GitHub.
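To give a sense of what Lava's Python interface looks like, here is a minimal sketch of a two-layer spiking network built from Lava processes. The module paths, process names (LIF, Dense), port names, and parameters shown follow the patterns in the project's public tutorials and are assumptions that may differ between Lava releases; the example runs on the CPU simulation backend rather than on Loihi 2 hardware.

```python
# Minimal sketch of a two-layer spiking network in Lava (assumed API,
# based on the project's public tutorials; details may vary by release).
import numpy as np

from lava.proc.lif.process import LIF          # leaky integrate-and-fire neurons
from lava.proc.dense.process import Dense      # dense synaptic weight matrix
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi1SimCfg

# Two populations of digital neurons connected by synaptic weights,
# mirroring the "computational neurons that communicate with one another"
# described above.
input_layer = LIF(shape=(8,), bias_mant=4, vth=10)   # biased so it spikes on its own
synapses = Dense(weights=np.eye(8) * 5)              # simple one-to-one connections
output_layer = LIF(shape=(8,), vth=10)

# Route spikes from the input layer through the synapses to the output layer.
input_layer.s_out.connect(synapses.s_in)
synapses.a_out.connect(output_layer.a_in)

# Run for 100 timesteps in simulation; a hardware run configuration would
# target a Loihi system such as Oheo Gulch or Kapoho Point instead.
output_layer.run(condition=RunSteps(num_steps=100), run_cfg=Loihi1SimCfg())
output_layer.stop()
```

The same process graph is intended to be retargetable: swapping the simulation run configuration for a hardware one is how Lava maps a model onto neuromorphic chips via Magma.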