
Addressing the memory bottleneck with in-memory computing

FEB 17, 2023
Performing operations within the memory unit could reduce computing time and benefit select applications.

Traditional computers are built on the von Neumann architecture, where data are stored in a memory unit separate from the computing core. While this design allows for specialized components for each function, it also results in a bottleneck that worsens as technology advances. To run a program, the computing core must first access the data and instructions from memory. Computing time is therefore limited by the time needed to transmit data to and from memory, a constraint that undercuts many advances in computing speed.

Mannocci et al. described recent advances in the field of in-memory computing, which is one promising solution to the memory bottleneck. They discussed the status of the field and outlined existing challenges and directions for future research.

“As a researcher, I know how important it is to read detailed reviews of the state of the art,” said author Daniele Ielmini. “My goal was to serve the research community with a detailed report and possibly attract more researchers to work on this exciting topic.”

In-memory computing seeks to solve the memory bottleneck by moving some computing work, including simple operations such as vector multiplication and addition as well as advanced tasks like matrix inversion and extraction of eigenvectors, to the memory unit. This could lead to the development of more energy-efficient machine learning systems. However, incorporating this functionality will require a redesign of existing architecture and more concrete ideas about how to implement it.
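As a rough illustration (not taken from the review itself), the sketch below shows in Python how an analog crossbar of memory devices could perform a matrix-vector multiplication in place: matrix entries are stored as device conductances, the input vector is applied as read voltages, and the output currents sum along each output line according to Ohm's and Kirchhoff's laws. All device parameters and values here are hypothetical.

```python
# Minimal sketch (hypothetical, not from the paper): an ideal analog memory
# crossbar performing a matrix-vector product in place.
# Each matrix entry is stored as a device conductance G[i, j]; the input
# vector is applied as voltages V[j]; each output line collects the current
# I[i] = sum_j G[i, j] * V[j] (Ohm's law plus Kirchhoff's current law).
import numpy as np

rng = np.random.default_rng(0)

# Target matrix to "program" into the array (arbitrary example values).
A = rng.uniform(-1.0, 1.0, size=(4, 3))

# Real devices only have positive conductances, so a common trick is to use
# two devices per matrix entry (a differential pair) and take the difference
# of their currents.
g_max = 100e-6  # assumed maximum device conductance, in siemens
G_plus = np.clip(A, 0, None) * g_max
G_minus = np.clip(-A, 0, None) * g_max

# Input vector encoded as read voltages (assumed 0.1 V full scale).
x = rng.uniform(-1.0, 1.0, size=3)
V = x * 0.1

# "Analog" readout: currents summed along each output line of the crossbar.
I = G_plus @ V - G_minus @ V

# Rescale the currents back to numbers and compare with the digital result.
y_analog = I / (g_max * 0.1)
y_digital = A @ x
print(np.allclose(y_analog, y_digital))  # True for ideal, noise-free devices
```

In this idealized picture the multiplication happens where the data already reside, in a single read operation, which is why the approach is attractive for workloads dominated by matrix operations such as machine learning inference.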

“An important research direction is on the application side, to identify the applications that might best benefit from in-memory computing in terms of acceleration, energy consumption, and cost,” said Ielmini. “Machine learning and deep learning are the areas that might benefit most.”

Source: “In-memory computing with emerging memory devices: status and outlook,” by Piergiulio Mannocci, Matteo Farronato, Nicola Lepri, Lorenzo Cattaneo, Artem Glukhov, Zhong Sun, and Daniele Ielmini, APL Machine Learning (2023). The article can be accessed at https://doi.org/10.1063/5.0136403.
