DBRX: A new open LLM

Introducing DBRX, a state-of-the-art open LLM from Databricks that surpasses GPT-3.5 across standard benchmarks. It is especially strong on programming and mathematics, beating specialized models such as CodeLLaMA-70B on coding. DBRX uses a fine-grained mixture-of-experts (MoE) architecture with 16 experts, 4 of which are active per token, which Databricks credits for improved model quality over other MoE models; the design also enables fast inference and training while keeping DBRX at roughly 40% of the size of Grok-1. Thanks to these efficiency gains, the training recipe matches the quality of Databricks' previous-generation MPT models with nearly 4x less compute. DBRX is available to Databricks customers, where it surpasses GPT-3.5 Turbo and is competitive with GPT-4 Turbo in GenAI-powered products. Overall, DBRX sets a new standard for open LLMs in both quality and training efficiency.
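
The "fine-grained MoE" point refers to routing each token through 4 of 16 smaller experts rather than 2 of 8 larger ones, which gives many more possible expert combinations per token. Below is a minimal, illustrative sketch of top-4-of-16 routing in plain NumPy; it is not DBRX's actual implementation, and the dimensions, names, and softmax-weighted mixing are assumptions for demonstration only.

```python
# Minimal sketch of fine-grained MoE routing (top-4 of 16 experts).
# Illustrative only -- not DBRX's implementation; shapes and the
# softmax-weighted combination are assumptions.
import numpy as np

NUM_EXPERTS = 16   # DBRX uses 16 experts per MoE layer
TOP_K = 4          # and activates 4 of them per token
HIDDEN = 8         # toy hidden size for the sketch

rng = np.random.default_rng(0)

# Each "expert" here is just a linear map for simplicity.
expert_weights = rng.normal(size=(NUM_EXPERTS, HIDDEN, HIDDEN))
router_weights = rng.normal(size=(HIDDEN, NUM_EXPERTS))

def moe_layer(token: np.ndarray) -> np.ndarray:
    """Route one token vector through its top-4 experts and mix the outputs."""
    logits = token @ router_weights                  # router scores, shape (NUM_EXPERTS,)
    top = np.argsort(logits)[-TOP_K:]                # indices of the 4 highest-scoring experts
    gates = np.exp(logits[top] - logits[top].max())
    gates /= gates.sum()                             # softmax over the selected experts only
    outputs = np.stack([token @ expert_weights[e] for e in top])
    return (gates[:, None] * outputs).sum(axis=0)    # gate-weighted combination

token = rng.normal(size=HIDDEN)
print(moe_layer(token).shape)  # (8,)
```

With 16 experts and 4 active, there are C(16, 4) = 1820 possible expert combinations per token, far more than the 28 available to a 2-of-8 configuration, which is the intuition behind the quality gains claimed for the fine-grained design.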

https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm
