DeepSeek Coder: Let the Code Write Itself

DeepSeek Coder is a family of code language models trained from scratch on a mixture of code and natural language data, with each model pre-trained on 2 trillion tokens. The models come in several sizes, ranging from 1.3 billion to 33 billion parameters, and instruction-tuned variants are available for improved performance on downstream tasks. DeepSeek Coder supports more than 80 programming languages and, with a 16K context window, can complete and infill code at the project level. It achieves state-of-the-art performance among open code models and is available for both research and commercial use.
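As a sketch of how project-level infilling might be driven, the snippet below assembles a fill-in-the-middle (FIM) prompt around a gap in a function. The sentinel tokens are an assumption based on the format published for DeepSeek Coder and should be verified against the tokenizer of the specific checkpoint you use; the model-loading lines are shown only as comments, since they require downloading multi-gigabyte weights.

```python
# Sketch: building a fill-in-the-middle (FIM) prompt for a DeepSeek Coder
# base model. The sentinel tokens below are an assumption taken from the
# published prompt format; verify them against your checkpoint's tokenizer.

FIM_BEGIN = "<|fim_begin|>"  # assumed sentinel marking the start of the prefix
FIM_HOLE = "<|fim_hole|>"    # assumed sentinel marking the gap to fill
FIM_END = "<|fim_end|>"      # assumed sentinel marking the end of the suffix

def build_fim_prompt(prefix: str, suffix: str) -> str:
    """Wrap the code before and after the gap in FIM sentinel tokens."""
    return f"{FIM_BEGIN}{prefix}{FIM_HOLE}{suffix}{FIM_END}"

prefix = "def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n"
suffix = "\n    return quicksort(left) + [pivot] + quicksort(right)\n"
prompt = build_fim_prompt(prefix, suffix)

# With a local checkpoint the prompt would then be fed to the model, e.g.
# (requires the `transformers` library and a model download, so commented out):
#
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tok = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-1.3b-base")
# model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-1.3b-base")
# out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=64)
# print(tok.decode(out[0]))
```

The model is expected to generate the code that belongs in the gap between the prefix and suffix, which is what makes project-level infilling (as opposed to left-to-right completion) possible.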

https://deepseekcoder.github.io/