Beyond A*: Better Planning with Transformers

The author explores how Transformers can be trained to solve complex planning tasks, introducing Searchformer, a model that solves Sokoban puzzles while exploring fewer search steps than the $A^*$ planner that generated its training data. Searchformer is first trained to predict the full $A^*$ search dynamics on maze-navigation tasks, where it outperforms solution-only baselines, and is then refined through expert iteration to produce progressively shorter searches, demonstrating scalability to larger and more complex decision-making problems. This approach challenges the dominance of symbolic planners in certain scenarios.

https://arxiv.org/abs/2402.14083
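To make "predicting $A^*$ search dynamics" concrete, here is a minimal sketch of grid-maze $A^*$ that records its execution trace (node creations and expansions) as a token-like sequence. The token names and maze encoding are illustrative assumptions, not the paper's exact vocabulary; Searchformer-style training would serialize such traces for the Transformer to imitate.

```python
import heapq

def astar_trace(grid, start, goal):
    """A* on a grid maze (0 = free, 1 = wall), returning the optimal plan
    plus a trace of search dynamics. Token names ("create"/"expand") are
    illustrative, not Searchformer's actual vocabulary."""
    def h(n):  # Manhattan distance: admissible heuristic on a 4-connected grid
        return abs(n[0] - goal[0]) + abs(n[1] - goal[1])

    frontier = [(h(start), 0, start)]      # (f = g + h, g, node)
    came_from = {start: None}
    cost = {start: 0}
    trace = []
    while frontier:
        f, g, node = heapq.heappop(frontier)
        if g > cost.get(node, float("inf")):
            continue  # stale heap entry superseded by a cheaper path
        trace.append(("expand", node, g, h(node)))
        if node == goal:
            break
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0):
                ng = g + 1
                if ng < cost.get((nr, nc), float("inf")):
                    cost[(nr, nc)] = ng
                    came_from[(nr, nc)] = node
                    trace.append(("create", (nr, nc), ng, h((nr, nc))))
                    heapq.heappush(frontier, (ng + h((nr, nc)), ng, (nr, nc)))
    # Reconstruct the optimal plan by walking parent pointers back from goal.
    plan, n = [], goal
    while n is not None:
        plan.append(n)
        n = came_from[n]
    return plan[::-1], trace

maze = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
plan, trace = astar_trace(maze, (0, 0), (2, 0))
```

The `trace` list, not just the final `plan`, is the key object: training on it teaches the model *how* the search unfolds, and expert iteration then filters for correct plans reached with shorter traces.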
