Boldly go where Gradient Descent has never gone before with DiscoGrad

DiscoGrad is a tool that applies automatic differentiation (AD) to C++ programs involving branching control flow and randomness, efficiently calculating gradients for optimization, control, and inference. It smooths gradients across branches by perturbing the program's inputs and integrates with neural networks via Torch. DiscoGrad provides several gradient estimation backends, such as the DiscoGrad Gradient Oracle (DGO), which can yield gradients better suited for optimization than plain AD applied to branchy programs. The tool is a research prototype and ships with sample applications from various domains. The repository documents the compilation workflow and the DiscoGrad API, which transform, compile, and run programs against the different backends. DiscoGrad is licensed under the MIT License.
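To give a sense of the class of programs DiscoGrad targets, the following minimal C++ sketch defines a function whose output depends on a parameter both through randomness and through a branch. The names (`f`, `theta`, `rng`) are purely illustrative and not part of the DiscoGrad API; see the repository for the actual program interface and transformation workflow.

```cpp
#include <random>

// A program of the kind DiscoGrad targets: the output depends on theta
// through noise and through a branch, so the expectation over the noise is
// smooth in theta even though each individual run is discontinuous.
double f(double theta, std::mt19937& rng) {
  std::normal_distribution<double> noise(0.0, 1.0);
  double x = theta + noise(rng);   // randomness perturbs the branch condition
  if (x > 0.0)                     // discontinuous in theta under plain AD
    return x * x;
  return -x;
}

int main() {
  // Plain AD of f ignores how changing theta moves samples across the
  // branch. DiscoGrad's source transformation plus a backend such as DGO
  // instead estimates the gradient of the expected output, which is smooth
  // in theta and usable for gradient descent.
  std::mt19937 rng(42);
  double acc = 0.0;
  for (int i = 0; i < 1000; ++i)
    acc += f(0.5, rng);
  return 0;
}
```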

https://github.com/DiscoGrad/DiscoGrad
