1. Introduction
2. Welcome
3. Physics-Inspired Computing
4. P-bits
5. Magnetic Tunnel Junction
6. Probabilistic Computer
7. Variability Concerns
8. Machine Learning
9. Summary
10. Q&A
11. Tanner
12. Thomas
13. Scaling
14. Removing the Digital Intermediary
15. Mixed-Signal Circuit
16. Analog Noise
17. Better MTJs
18. Memory
19. Algorithms
20. Kullback-Leibler Divergence
21. Mapping Algorithms
22. More Questions
23. Energy Savings
Description:
Explore the potential of spintronics in hardware neural networks for fast and energy-efficient machine learning through this 59-minute APS Physics journal club presentation. Delve into the challenges of device-to-device variations in large-scale neuromorphic systems and discover how in situ learning of weights and biases in a Boltzmann machine can address these issues. Learn about a scalable, autonomously operating learning circuit using spintronics-based neurons, designed for standalone AI devices capable of efficient edge learning. Join Jan Kaiser from Purdue University as he discusses his team's recent study published in Physical Review Applied, demonstrating the ability to counter variability and learn probability distributions for meaningful operations such as a full adder. Gain insights into physics-inspired computing, probabilistic bits (p-bits), magnetic tunnel junctions, and the potential for energy savings in this cutting-edge field.

The presentation is followed by a Q&A session moderated by Dr. Matthew Daniels from NIST-Gaithersburg, covering topics such as scaling, mixed-signal circuits, analog noise, improved magnetic tunnel junctions, memory considerations, and algorithm mapping.
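The building block discussed in the talk is the p-bit: a stochastic binary unit whose time-averaged output follows the tanh of its input, realized physically by a low-barrier magnetic tunnel junction. Below is a minimal software sketch of that behavior combined with the standard Boltzmann-machine learning rule, in which weights and biases are nudged by the difference between data statistics and model statistics, a procedure that minimizes the Kullback-Leibler divergence between the two distributions. The update m_i = sgn(tanh(I_i) - r) follows the common p-bit literature; the variable names, toy data, and hyperparameters here are illustrative assumptions, not details from the presentation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sweep(W, b, m, beta=1.0):
    """One asynchronous sweep: each p-bit output fluctuates around tanh(I_i)."""
    for i in rng.permutation(len(m)):
        I = beta * (W[i] @ m + b[i])
        # Illustrative p-bit update m_i = sgn(tanh(I_i) - r), with r ~ U(-1, 1);
        # in hardware the noise comes from the stochastic MTJ itself.
        m[i] = np.sign(np.tanh(I) - rng.uniform(-1.0, 1.0))
    return m

def model_stats(W, b, sweeps=2000):
    """Estimate <m_i> and <m_i m_j> under the current model by time-averaging."""
    state = rng.choice([-1.0, 1.0], size=len(b))
    mm = np.zeros_like(W)
    mean = np.zeros_like(b)
    for _ in range(sweeps):
        state = gibbs_sweep(W, b, state)
        mm += np.outer(state, state)
        mean += state
    return mm / sweeps, mean / sweeps

# Toy bipolar training data (a stand-in for e.g. the truth table of a full adder)
data = np.array([[ 1,  1, -1, -1,  1],
                 [-1, -1,  1,  1, -1]], dtype=float)
target_mm = (data.T @ data) / len(data)   # data correlations <m_i m_j>
target_mean = data.mean(axis=0)           # data means <m_i>

n = data.shape[1]
W = np.zeros((n, n))                      # symmetric weights, zero diagonal
b = np.zeros(n)
lr = 0.05

for epoch in range(100):
    mm, mean = model_stats(W, b)
    # Boltzmann learning: move model statistics toward data statistics
    W += lr * (target_mm - mm)
    b += lr * (target_mean - mean)
    np.fill_diagonal(W, 0.0)
```

In the hardware version described in the talk, the tanh-plus-noise step is performed by the magnetic tunnel junction rather than by software sampling, and the learning rule only compares locally measured correlations, which is what makes the scheme robust to device-to-device variations.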

Machine Learning Based on Stochastic Magnetic Tunnel Junctions

APS Physics