EXGBoost: Gradient Boosting in Elixir

In the past few years, Elixir has made significant strides in expanding its machine learning capabilities. The Nx project lets Elixir programmers implement efficient numerical algorithms, making the language suitable for neural networks, traditional machine learning algorithms, and applications built on ordinary differential equations. Projects that have emerged from the Elixir Nx effort include Livebook, Explorer, Bumblebee, and VegaLite.

One gap in the Elixir machine learning ecosystem, however, was the lack of a library for training decision trees. That gap has now been filled with the introduction of EXGBoost, a library that provides Elixir bindings to the popular XGBoost library. Gradient boosting, the technique used by EXGBoost, trains an ensemble of decision trees in which each new tree corrects the errors of the ensemble built so far. It is a powerful method for training models that remain relatively interpretable, and it rivals deep learning on certain modalities, tabular data in particular. The addition of EXGBoost is therefore a significant development that expands the possibilities for machine learning developers in Elixir.
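As a rough sketch of what this looks like in practice, the snippet below trains a boosted-tree model on synthetic Nx tensors using EXGBoost's train/predict functions. It is illustrative only: the package versions and the option names (`num_boost_rounds`, `max_depth`) are assumptions that should be checked against the EXGBoost documentation.

```elixir
# Illustrative sketch; assumes the exgboost and nx packages are available,
# e.g. via Mix.install([{:exgboost, "~> 0.5"}, {:nx, "~> 0.7"}]).
key = Nx.Random.key(42)

# Synthetic regression data: 100 rows with 5 features each.
{x, key} = Nx.Random.normal(key, 0.0, 1.0, shape: {100, 5})
{y, _key} = Nx.Random.normal(key, 0.0, 1.0, shape: {100})

# Train a gradient-boosted ensemble of shallow decision trees.
model = EXGBoost.train(x, y, num_boost_rounds: 10, max_depth: 3)

# Predict on the training features; the result is an Nx tensor.
preds = EXGBoost.predict(model, x)
```

Because both the features and the model live in the Nx world, the predictions come back as Nx tensors and can flow directly into the rest of an Nx-based pipeline.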

