MIT's new chip makes neural networks practical for battery-powered devices

Researchers at MIT have developed a special-purpose chip that increases the speed of neural-network computations by three to seven times.

While we can access the power of neural networks through our phones, that is only because the data they generate is uploaded to a distant server for processing and the results are sent back, sparing the phone the computational load it would otherwise bear.

Neural network processing and AI workloads are both hot topics these days, driving multiple companies to announce their own custom silicon designs or to pitch their existing hardware as a top-end solution for these workloads.

"But the computation these algorithms do can be simplified to one specific operation, called the dot product", lead chip developer Biswas said in a statement.
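As an illustrative sketch (not the chip's actual implementation), the dot product Biswas describes multiplies each of a node's inputs by a corresponding weight and sums the products into a single value:

```python
# Illustrative sketch: the dot product at the heart of neural-network layers.
# Each input is multiplied by its weight; the products are summed into one value.

def dot_product(inputs, weights):
    assert len(inputs) == len(weights)
    return sum(x * w for x, w in zip(inputs, weights))

# Example: a node with three inputs.
result = dot_product([0.5, -1.0, 2.0], [0.2, 0.4, 0.1])
print(result)
```

Every node in a network repeats this same operation, which is why simplifying it in hardware pays off across the whole computation.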


A typical neural network is organized into layers.

In the chip, a node's input values are converted into electrical voltages and then multiplied by the appropriate weights.

The team's prototype is capable of calculating dot products for up to 16 nodes at a time, instead of having to shuttle data between a processor and a bank of memory for every computation. By storing all of its weights as either 1 or -1, the system can be implemented as a simple set of switches, while losing only 2 to 3 percent of accuracy compared with vastly more computationally expensive conventional neural nets.

A neural net is an abstraction: The "nodes" are just weights stored in a computer's memory.
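A minimal sketch of that abstraction (the function and variable names here are illustrative, not anything from the MIT design): a layer is just a matrix of weights in memory, and evaluating it means taking one dot product per node against the shared input vector.

```python
# Sketch of a neural-net layer as "weights stored in memory":
# each row of the weight matrix represents one node, and evaluating
# the layer is one dot product per node against the input vector.

def layer_forward(inputs, weight_rows):
    return [sum(x * w for x, w in zip(inputs, row)) for row in weight_rows]

inputs = [1.0, 2.0]
weights = [[0.5, 0.5],    # node 1's weights
           [1.0, -1.0]]   # node 2's weights
print(layer_forward(inputs, weights))  # [1.5, -1.0]
```

In a conventional processor, each of those multiply-and-sum steps means fetching weights from memory; the MIT chip's point is to do the summation where the data already sits.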


Without wading into the complicated science behind the technology, think about how the brain works: signals travel along multiple neurons to meet at a "synapse", or a gap between bundles of neurons.

Essentially, Biswas' chip replicates the brain more faithfully than previous neural network chips. Only the combined voltages are converted back into digital representation and stored for processing.

One of the keys to the system is that all the weights are either 1 or -1.
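A minimal sketch of why restricting weights to 1 or -1 simplifies the hardware: multiplying by 1 or -1 is just passing a value through or negating it, so the dot product collapses into a running tally of additions and subtractions — which is what lets the chip implement it as a set of switches.

```python
# With weights restricted to +1 or -1, multiplication disappears:
# each weight either passes its input through or flips its sign,
# so the dot product becomes a sequence of adds and subtracts.

def binary_dot_product(inputs, signs):
    # signs: a sequence of +1 / -1 weights
    total = 0.0
    for x, s in zip(inputs, signs):
        total += x if s == 1 else -x  # the "switch": pass or negate
    return total

print(binary_dot_product([0.5, 2.0, 1.5], [1, -1, 1]))  # 0.5 - 2.0 + 1.5 = 0.0
```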

"This is a promising real-world demonstration of SRAM-based in-memory analog computing for deep-learning applications", said Dario Gil, vice president of artificial intelligence at IBM, in an MIT press release. "The results show impressive specifications for the energy-efficient implementation of convolution operations with memory arrays. It certainly will open the possibility to employ more complex convolutional neural networks for image and video classifications in IoT [the internet of things] in the future".

