2024 Physics Nobel Prize Resources from AIP
The physics prize was announced Tuesday, Oct. 8, at 5:45 a.m. ET. The prize was awarded to John J. Hopfield and Geoffrey E. Hinton “for foundational discoveries and inventions that enable machine learning with artificial neural networks.”
This resource article will be populated with information about the 2024 prize and the newly named laureates. You can expect to find:
- Overview of the prize
- Quotes from AIP leadership and possibly others
- Biographies and original illustrations
- Articles from AIP Publishing by and about these Nobel laureates (free access)
- Physics Today related content (free access)
- Key resources from the AIP Member Societies
- AIP press releases
The AIP team will update this collection as information, assets, and resources related to the winning science become available.
Overview of the prize
While computers are powerful, they have traditionally struggled with tasks, such as pattern recognition, at which humans and other mammals excel. Our brains are collections of neurons linked into dynamic networks with connections of variable strength; these networks can rapidly identify patterns in data and learn them so they can be recalled later. Hopfield and Hinton created analogous networks, termed “artificial neural networks,” that led to many of the exciting computational developments of the last few years.
In 1982, Hopfield defined one of the first instances of an artificial neural network. Named the Hopfield network, his design leverages techniques from statistical mechanics to create a form of associative memory. When a Hopfield network is exposed to a stimulus, pairs of neurons fire simultaneously, strengthening the connection between them, in a manner analogous to how connections form between biological neurons. Subsequently, this neural network can rely on these strengthened connections to recognize the initial stimulus, even from incomplete or noisy data. It “remembers” the pattern, so to speak.
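For readers who want to see the mechanism concretely, the sketch below is a minimal Python/NumPy illustration of the idea, not the laureate’s original formulation: it stores one binary pattern with a Hebbian rule (units that are active together get a stronger connection) and then recovers that pattern from a corrupted copy. The function names and the toy pattern are invented for this example.

```python
import numpy as np

def train(patterns):
    """Hebbian rule: strengthen the weight between every pair of units
    that are active together across the stored patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)            # no self-connections
    return W / len(patterns)

def recall(W, state, steps=10):
    """Repeatedly update each unit to agree with its weighted inputs;
    the state settles into the stored pattern closest to the input."""
    for _ in range(steps):
        for i in range(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Store a simple +1/-1 pattern, then recover it from a noisy copy.
pattern = np.array([1, -1, 1, -1, 1, -1, 1, -1])
W = train(pattern[None, :])
noisy = pattern.copy()
noisy[:2] *= -1                       # corrupt two of the eight units
print(recall(W, noisy))               # settles back to the original pattern
```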
Hinton expanded upon the Hopfield network and the application of physics-based concepts through the development of the Boltzmann machine, a generative network that learns without supervision and can identify distinctive elements within data. It is trained by feeding it examples that are very likely to arise when the machine is run; once trained, it can classify images and generate new instances of the patterns it has learned. Hinton’s work has been instrumental in sparking the rapid advancements in machine learning we see today, decades after his formative work.
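Hinton’s original Boltzmann machine is fully connected and trained with a slow, sampling-heavy procedure; the sketch below instead uses the restricted variant trained with one-step contrastive divergence, a later simplification also introduced by Hinton, purely to illustrate the two-phase learning idea (a data-driven phase and a free-running phase whose statistics are compared). The toy data, layer sizes, and hyperparameters are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden = 6, 3
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
a = np.zeros(n_visible)               # visible-unit biases
b = np.zeros(n_hidden)                # hidden-unit biases

# Toy binary training examples the machine should learn to model.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 1, 0, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 1, 1, 1, 0]], dtype=float)

lr = 0.1
for epoch in range(2000):
    v0 = data
    ph0 = sigmoid(v0 @ W + b)                        # data-driven ("positive") phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    pv1 = sigmoid(h0 @ W.T + a)                      # one Gibbs step: reconstruct
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b)                        # free-running ("negative") phase
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(data)  # contrastive-divergence update
    a += lr * (v0 - v1).mean(axis=0)
    b += lr * (ph0 - ph1).mean(axis=0)

# Generate a new sample: start from noise and alternate between the layers.
v = (rng.random(n_visible) < 0.5).astype(float)
for _ in range(50):
    h = (rng.random(n_hidden) < sigmoid(v @ W + b)).astype(float)
    v = (rng.random(n_visible) < sigmoid(h @ W.T + a)).astype(float)
print(v)                              # tends to resemble one of the learned patterns
```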
AIP and AIP Publishing leadership
Beyond recognizing the laureates’ inspirations from condensed-matter physics and statistical mechanics, the prize celebrates interdisciplinarity. At its core, this prize is about how elements of physics have driven the development of computational algorithms that mimic biological learning, impacting how we make discoveries today across STEM. It also demonstrates that fundamental shifts in our scientific understanding can sometimes take decades to have wider impact.
AIP Publishing congratulates John Hopfield and Geoffrey Hinton for the Nobel Prize in physics. Their research in understanding and developing artificial neural networks is a testament to the power of interdisciplinary research, combining fundamental concepts in statistical and quantum physics with neuroscience and psychology. This foundational discovery has led to an explosion of applications in machine learning and artificial intelligence in fields ranging from materials science to medical imaging.
Machine learning has had profound and wide-ranging impacts on many fields of chemical physics, from electronic structure theory to materials modelling and biomolecular simulation. In all these areas, it has enabled the simulation and understanding of much more complex systems than were accessible just a few years ago. We are delighted that this year’s Nobel Prize in physics has recognized the importance of the machine learning revolution, and we will continue to welcome papers on machine learning in the decades to come.
This year’s Nobel Prize in physics, awarded to former Caltech professor (now at Princeton) John Hopfield and University of Toronto professor Geoffrey Hinton for their foundational work developing the first artificial neural networks, serves as a testament to the widespread impact of machine learning and artificial intelligence as a whole. Their work paved the way for AI first entering and now transforming our society, as Siri, Alexa, Hey Google, and ChatGPT have become household terms. The impact of their work on physics and chemistry is tremendous, enabling predictions of arrangements and shapes of matter from the atomic and molecular level, to the long-standing problem of protein formation and folding, to stars, exoplanets, and black holes in astrophysics.
This year’s Nobel Prize in physics showcases how machine learning, at its core, owes much to physics. Physics didn’t just offer inspiration; it provided the essential mathematical and conceptual tools that built the foundation for machine learning. Now, in an interesting twist, machine learning is powering developments across disciplines — including physics. This interplay isn’t just intriguing; it’s pivotal. I believe it will fuel fundamental breakthroughs in the future. That’s exactly the synergy we aim to capture in APL Machine Learning — the reciprocal relationship between “Applied Physics for Machine Learning” and “Machine Learning for Applied Physics.”
Today’s Nobel Prize is the apotheosis of multidisciplinarity and truly recognizes the impact AI has had on multiple subfields of physics, from density functional theory (DFT) to microscopy.
Biographies
John Hopfield was born in 1933 in Chicago. He earned his Ph.D. in physics from Cornell University in 1958, after which he worked as part of the Theoretical Physics Group at Bell Laboratories from 1958 to 1960. Following a brief period as a visiting research physicist at the École Normale Supérieure, Hopfield joined the faculty at the University of California, Berkeley, before moving to Princeton University. There, he served as a professor of physics from 1964 to 1980.
From 1973 to 1984, Hopfield resumed his affiliation with Bell Labs to work in the Molecular Biophysics Group, where his research increasingly merged biology with physics. He joined the California Institute of Technology in 1980 as a professor of chemistry and biology, a position he held until 1997. During his time at Caltech, in 1982, he introduced the “Hopfield network,” a groundbreaking model in the field of neural networks that has had a lasting influence on artificial intelligence and cognitive science. His research merged concepts from physics and biology, creating models of associative memory and neural processing that remain foundational in computational neuroscience. In 1997, he returned to Princeton University as a professor of molecular biology and remained in that role until his retirement.
Hopfield’s work has been recognized with numerous honors, including the Oliver E. Buckley Prize in 1969, the Max Delbrück Prize in 1985, and the Boltzmann Medal in 2022 for his contributions to computational neuroscience and the intersection of physics and biology. He served as president of the American Physical Society in 2006.
Geoffrey Hinton was born in London in 1947 and received his bachelor’s degree in experimental psychology from Cambridge University in 1970. He worked briefly as a carpenter before pursuing doctoral work in artificial intelligence at the University of Edinburgh, then the only university in the U.K. to offer a postgraduate program in the subject. Writing his thesis on neural networks, he was awarded his Ph.D. in 1978.
Hinton landed visiting positions at Sussex University, the University of California, San Diego, and Cambridge, and his career soon gravitated to the U.S., where funding for AI work was more plentiful. He took on a faculty position in the computer science department at Carnegie-Mellon University in 1982. Between 1983 and 1985, he published papers with colleagues describing the “Boltzmann machine” AI model.
In 1987, Hinton moved to Canada, becoming a fellow of the Canadian Institute for Advanced Research and a member of the computer science department at the University of Toronto, where he has remained. From 2004 to 2013, he was the director of the “Neural Computation and Adaptive Perception” program funded by the Canadian Institute for Advanced Research. In 2012, Google paid $44 million for an AI company he had started with his students; he subsequently worked part-time at Google, becoming an engineering fellow and vice president.
Hinton has long been concerned about the social ramifications of AI; his move to Canada was driven by his reluctance to depend on U.S. Defense Department funding for the field. In 2023, amid the new flurry of activity around AI, he left Google, saying he wanted the independence to speak freely about the dangers AI poses to society. Hinton was named a Companion of the Order of Canada, the country’s highest honor, in 2018, and in 2019 he won the ACM A. M. Turing Award with his collaborators Yoshua Bengio and Yann LeCun.
https://www.nobelprize.org/uploads/2024/09/advanced-physicsprize2024.pdf
https://pni.princeton.edu/people/john-j-hopfield/cv
https://www.cs.toronto.edu/~hinton/bio.html
https://www.cs.toronto.edu/~hinton/fullcv2024.pdf
https://time.com/collection/time100-ai/6309026/geoffrey-hinton/
https://www.nytimes.com/2023/05/01/technology/ai-google-chatbot-engineer-quits-hinton.html
https://www.wired.com/story/secret-auction-race-ai-supremacy-google-microsoft-baidu/
AIP Publishing
Form Follows Function
John J. Hopfield
Physics Today 55 (11), 10-11 (2002)
https://doi.org/10.1063/1.1534990
G-maximization: an unsupervised learning procedure for discovering regularities
Barak A. Pearlmutter and Geoffrey E. Hinton
AIP Conference Proceedings 151, 333-338 (1986)
https://doi.org/10.1063/1.36234
Physics Today
- Leaders in artificial neural network development share 2024 Nobel Prize in Physics
- Neurons, dynamics, and computation, by Hopfield (February 1994)
- Machine learning meets quantum physics (March 2019)
- Spin glass VII: Spin glass as paradigm, by Philip Anderson (March 1990)
- Richard Feynman and the connection machine (February 1989)
- Statistical mechanics of neural networks (December 1988)
- Exploiting highly concurrent computers for physics (October 1987)
Niels Bohr Library & Archives
Video of John J. Hopfield’s talk, “Collective Properties of Neuronal Networks,” at the 1983 Meeting of the Corporate Associates of the American Institute of Physics, held at the Xerox Palo Alto Research Center. This video forms a part of the Niels Bohr Library & Archives Collections. Please contact nbl@aip.org if you wish to use or quote this video. Catalog record.
Mentions of John Hopfield in the Oral History Collection:
- Oral history interview with Carver Mead, 2020 June 29, July 5, July 19, July 26, Aug. 2, Aug. 9, Aug. 16
- Oral history interview with Steven Girvin, 2020 July 2
- Oral history interview with Thomas Witten, 2020 September 18
- Oral history interview with William Bialek, 2020 Aug. 25, Oct. 8, Oct. 16, Oct. 23, Oct. 28, Nov. 10, Dec. 2
Member Societies
American Physical Society: American Physical Society congratulates winners of the 2024 Nobel Prize in Physics
Optica: Nobel Physics Prize Honors Roots of Modern AI
###
For more information contact:
AIP Media Line
301-209-3090
media@aip.org