Physical neural networks (PNNs) use the properties of physical systems to perform computation, and they have the potential to reshape how AI is done. Though still a niche research area, PNNs could one day enable AI models roughly 1000x larger than today's, with inference performed locally and privately on edge devices. Current research explores both backpropagation-based and backpropagation-free training methods, which differ in how well they scale and how well the resulting models perform. While no approach yet matches the backpropagation algorithm that dominates deep learning today, the diversity of techniques hints at what may become possible for PNNs. The key is to rethink how AI models work, and how they are trained, through the constraints of the underlying hardware physics. Doing so could lead to more efficient AI models and enable unprecedented scale in the future.
https://arxiv.org/abs/2406.03372
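
To make the backpropagation-free idea concrete, here is a minimal, illustrative sketch of a perturbative (SPSA-style) update applied to a black-box forward pass that we can query but not differentiate, which is the basic situation when the forward pass runs on physical hardware. The forward function, parameter layout, loss, and data below are toy stand-ins chosen so the script runs end to end; they are not taken from the paper.

```python
# Illustrative sketch: backpropagation-free, perturbative training of a
# black-box "physical" forward pass. Everything here is a toy assumption.
import numpy as np

rng = np.random.default_rng(0)

def physical_forward(theta, x):
    """Stand-in for a physical system's input-output map: we can query
    outputs for given controllable parameters theta, but we cannot
    backpropagate through it."""
    w1 = theta[:8].reshape(1, 8)   # hypothetical parameter layout
    w2 = theta[8:].reshape(8, 1)
    return np.tanh(x @ w1) @ w2    # toy nonlinear map

def loss(theta, x, y):
    pred = physical_forward(theta, x)
    return float(np.mean((pred - y) ** 2))

# Toy 1-D regression data, purely for illustration.
x = rng.uniform(-1, 1, size=(64, 1))
y = np.sin(np.pi * x)

theta = rng.normal(scale=0.5, size=16)
lr, c = 0.2, 0.05                  # step size and perturbation scale

for step in range(500):
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # random sign vector
    # Two black-box queries yield a stochastic gradient estimate without
    # ever differentiating the physical system itself.
    l_plus = loss(theta + c * delta, x, y)
    l_minus = loss(theta - c * delta, x, y)
    grad_est = (l_plus - l_minus) / (2 * c) * delta
    theta -= lr * grad_est

print("final loss:", loss(theta, x, y))
```

The trade-off this sketch illustrates is the one the summary alludes to: perturbative, backpropagation-free updates need only forward queries of the hardware, but their gradient estimates are noisy and scale poorly with parameter count, whereas backpropagation-based approaches scale better yet require a differentiable (or digitally simulated) model of the physics.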