Introducing Refact LLM, a 1.6-billion-parameter code model with real-time code completion and chat capabilities. Despite its size, it achieves top results among code LLMs, surpassing models such as StableCode and CodeGen on the HumanEval benchmark while being roughly 10 times smaller than comparable models like StarCoder. Refact LLM was trained on a combination of permissively licensed code and open text datasets, and it is available for commercial use under the BigScience OpenRAIL-M license. The model integrates easily into developers’ workflows and requires just 3 GB of RAM, so it runs on most modern GPUs.
https://refact.ai/blog/2023/introducing-refact-code-llm/
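As a quick illustration of how such a model might slot into a developer workflow, here is a minimal completion sketch using Hugging Face Transformers. The repository id `smallcloudai/Refact-1_6B-fim`, the use of `trust_remote_code=True`, and the prompt are assumptions for illustration; check the blog post and the model card for the exact checkpoint name and recommended prompt format (including fill-in-the-middle tokens).

```python
# Minimal sketch: plain left-to-right code completion with the Refact model.
# Assumes the checkpoint is published as "smallcloudai/Refact-1_6B-fim" (unverified here).
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "smallcloudai/Refact-1_6B-fim"  # assumed repo id
device = "cuda"  # ~3 GB of GPU memory should be enough for the 1.6B model

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, trust_remote_code=True  # may be needed if the repo ships custom model code
).to(device)

# Ask the model to continue a partial function definition.
prompt = "def fibonacci(n):\n    "
inputs = tokenizer(prompt, return_tensors="pt").to(device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice the model is also exposed through the Refact plugins and self-hosted server described in the blog post, so calling it directly like this is only one of several integration paths.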